Active Silicon’s AI Series – part 4: Cloud-based FPGAs offer accelerated machine learning

November 21, 2017

We have established in our AI series that FPGAs are one of the key technologies in the development of AI. While DNN training may still be best carried out on a GPU, FPGAs offer unprecedented opportunities for engineers to customize and revise their systems. One obstacle to wider appeal has been the limited number of developers with the knowledge and experience to program FPGAs, but new developments placing FPGAs in the cloud, and making them accessible from more common programming languages and frameworks, will inevitably encourage broad adoption.

After much anticipation, Amazon have now launched their Elastic Compute Cloud (EC2) F1 instances. Previewed late last year, the F family takes its name from the FPGAs it carries, and instances scale from a single Xilinx UltraScale+ device (f1.2xlarge) up to eight (f1.16xlarge). These F1 instances offer customizable cloud-based FPGAs chargeable by the hour; with no long-term commitments or up-front payments, the devices can be reprogrammed as many times as required at no additional cost. Amazon’s AWS cloud services support all the major machine learning frameworks, attracting AI development from multiple sectors.
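For readers who want a feel for how these instances are provisioned, the sketch below uses boto3, the AWS SDK for Python, to request a single f1.2xlarge instance. The AMI ID, key pair name and region are placeholders rather than real values; in practice you would start from an FPGA developer AMI and load your own bitstream.

# A minimal sketch of launching an EC2 F1 instance with boto3.
# The AMI ID and key pair below are placeholders, not real values.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-xxxxxxxx",      # placeholder: an FPGA developer AMI
    InstanceType="f1.2xlarge",   # smallest F1 size, with a single FPGA
    KeyName="my-key-pair",       # placeholder key pair for SSH access
    MinCount=1,
    MaxCount=1,
)

print("Launched F1 instance:", response["Instances"][0]["InstanceId"])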

Similarly, Microsoft is in the process of deploying FPGAs across its Azure cloud services, using Intel Stratix 10 FPGAs. As we covered in August, Project Brainwave aims to accelerate the development of DNNs and bring more advanced machine learning to the masses. A recent announcement from Intel that its FPGAs will be powering the Alibaba cloud further endorses the growth of FPGAs “as a service” and suggests the model could be embraced by a wide audience.

Interestingly, Google are still sticking to ASICs rather than FPGAs for their cloud-based AI acceleration. Their Tensor Processing Units (TPUs) are custom chips built to accelerate TensorFlow workloads, offered in Google Cloud alongside conventional CPUs and GPUs, on the reasoning that the sheer volume of computation required to train and run AI applications is best served by dedicated hardware; Google maintains that “Our neural net-based ML service has better training performance and increased accuracy compared to other large scale deep learning systems”.
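To put the framework side of that claim in context, the short TensorFlow 1.x sketch below (contemporary with this article) shows how the same graph can be pinned to a CPU or a GPU simply by changing a device string; running on Cloud TPUs follows the same programming model through TPU-specific APIs. The device strings here are purely illustrative.

# A minimal TensorFlow 1.x example: the same small graph can be placed on a
# CPU or a GPU simply by changing the device string.
import tensorflow as tf

with tf.device("/cpu:0"):            # swap for "/gpu:0" where a GPU is available
    a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    b = tf.constant([[5.0, 6.0], [7.0, 8.0]])
    product = tf.matmul(a, b)        # a small matrix multiply, the core DNN operation

with tf.Session() as sess:
    print(sess.run(product))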

Of course, engineers have more decisions to make than a straight choice over where their technology lives. In addition to their FPGAs, Intel have opened the door to an array of deep learning options with products including their Loihi neuromorphic chip, the Movidius Neural Compute Stick and the Movidius Myriad X vision processing unit (VPU) SoC. The stick brings AI into the realm of a “plug and play” add-on for end users, giving them access to advanced CNNs via the Caffe framework. Loihi, Intel’s first self-learning chip, professes to work like the human brain, getting smarter and faster over time. Myriad X is specifically designed for combining and enhancing imaging, visual processing and deep learning: up to eight HD RGB cameras can be connected to the chip, and on-board accelerators process up to 700 million pixels per second, all while meeting today’s low-power expectations.
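As a rough illustration of how “plug and play” the Neural Compute Stick is, the sketch below uses the first-generation NCSDK Python API (mvnc) to run inference with a pre-compiled Caffe network. The graph file name and the random input frame are placeholders; a real application would compile its Caffe model with the NCSDK tools and feed properly preprocessed images.

# A rough sketch of inference on the Movidius Neural Compute Stick using the
# first-generation NCSDK Python API. File names and input data are placeholders.
from mvnc import mvncapi as mvnc
import numpy as np

devices = mvnc.EnumerateDevices()            # find attached sticks
device = mvnc.Device(devices[0])
device.OpenDevice()

with open("graph", "rb") as f:               # placeholder: a Caffe model compiled by the NCSDK
    graph_blob = f.read()
graph = device.AllocateGraph(graph_blob)     # load the network onto the stick

image = np.random.rand(224, 224, 3).astype(np.float16)   # placeholder input frame
graph.LoadTensor(image, "input frame")       # queue the tensor for inference
output, _ = graph.GetResult()                # blocking call: returns the network output
print("Top class index:", int(np.argmax(output)))

graph.DeallocateGraph()
device.CloseDevice()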

Undeniably, the focus on hardware and software development aimed specifically at Artificial Intelligence is intensifying. This means it’s not just big-budget autonomous vehicle manufacturers and research-heavy medical applications that can benefit from AI. Machine vision with integrated AI will be able to offer levels of inspection and detection that were previously achievable only with human intervention. While the resources needed to program the hardware remain costly and systems still require a high level of training and integration, adoption of accelerated machine learning is likely to progress relatively slowly at first; but as new technologies emerge and become mainstream, wider implementation will become commonplace.

As the world of computer imaging progresses apace, we’ll keep you informed of AI developments and ensure our products are compatible with all the latest advancements, whether they’re in the cloud, on a chip, or coming to an inspection line near you.

View our news updates online or subscribe to our newsletter.