The Evolution of AI Inferencing - Story, January 19, 2021
The AI inference market has changed dramatically in the last three or four years. Previously, edge AI didn't even exist; most inferencing took place in data centers, on supercomputers, or in government applications that were also generally large-scale computing projects.
The Four Stages of Inference Benchmarking - Story, February 17, 2020
This blog discusses how to benchmark inference accelerators to find the one best suited to your neural network.
Evolution of Embedded FPGA From Aerospace, Networking and Communications to Artificial Intelligence, and More - Blog, June 27, 2019
This article reviews the various generations of eFPGA, ending with the features available today.
Taking the Top off TOPS in Inferencing Engines - Story, January 30, 2019
With the explosive growth of AI, attention has focused intensely on new specialized inferencing engines that can deliver the performance AI demands.
Using embedded FPGA as a reconfigurable accelerator - Eletter Product, December 19, 2017
Embedded FPGA is a fairly new technology, but it is quickly finding a home in a range of applications. One use growing in popularity is connecting it to a processor bus as a reconfigurable accelerator.