With the introduction of deep learning, artificial intelligence is steadily making its way into new applications such as autonomous driving and augmented reality, and achieving better quality in existing applications such as customer support, fraud detection, speech recognition and translation, and image and video analysis and enhancement.
Because deep learning requires significant computational capabilities, it is rational for many applications to run on cloud-based infrastructure. The top cloud computing platforms are all betting big on democratizing artificial intelligence, offering both specialized hardware and on-demand models. Some deep learning models must run on the cloud due to model size or processing time, while others do so for business reasons.
On the other hand, there are applications whose low-latency requirements force deep learning to run on edge devices. The edge can also provide additional benefits in privacy, bandwidth efficiency, and scalability, which can be vital for some applications. However, edge devices are less powerful than cloud servers, and many are subject to energy constraints. Hence, new resource- and energy-oriented deep learning models are required.
Short Bio:
Dr. Hamid Reza Vaezi Jose received the B.Sc. and M.Sc. degrees in Computer Engineering from Sharif University of Technology, Iran, in 2006 and 2008, respectively. He received his PhD from the School of Computer Science at Simon Fraser University, Vancouver, in 2013, and joined the Flyover R&D team at Apple Inc. shortly after graduation. In January 2017, he joined the Applied Science group at Microsoft Research as a senior research scientist. His research interests include image processing, computer vision, machine learning, and color and illumination.