Prophesee Releases Its Entire Event-Based Vision Software Suite for Free

By Chad Cox

Production Editor

Embedded Computing Design

June 14, 2022

News

Image Provided by Prophesee

The latest version of the five-time award-winning suite includes a full set of machine learning tools, new key open-source modules, ready-to-use applications, and code samples, and allows completely free evaluation, development, and release of products under the included commercial license.

Prophesee SA, the creator of the most advanced neuromorphic vision systems, announced today that the latest release of its award-winning Metavision® Intelligence suite will be made available free of charge, in its entirety and for all modules, providing an accelerated path to explore and implement differentiated machine vision applications that leverage the performance and efficiency of event-based vision. From initial adoption to commercial development and release of market-ready products, the industry's most comprehensive suite of software tools and code samples will be available for free.

Engineers can easily develop computer vision applications on a PC using this advanced toolkit for a variety of markets, including industrial automation, IoT, surveillance, mobile, medical, automotive, and more.

Metavision Intelligence 3.0's free modules are accessible via C++ and Python APIs and include a comprehensive machine learning toolkit. The suite also provides a no-code option via the Studio tool, allowing users to play pre-recorded datasets, provided for free, without the need for an event camera. Users who do own an event camera can begin streaming or recording events from it in seconds.
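To make the data model concrete: an event camera emits a sparse stream of per-pixel change events rather than full frames, and event-based pipelines typically process that stream in fixed-duration time slices. The sketch below is purely conceptual and does not use the Metavision API; the field names and slicing function are illustrative assumptions.

```python
# Conceptual sketch (NOT the Metavision API): an event camera emits a sparse
# stream of (x, y, t, p) tuples -- pixel coordinates, a microsecond
# timestamp, and a polarity (+1 brightness increase, -1 decrease).
# Processing usually iterates over fixed-duration slices of this stream.
import numpy as np

def slice_events(events, delta_t):
    """Yield consecutive `delta_t`-microsecond slices of a structured
    event array sorted by timestamp."""
    if len(events) == 0:
        return
    t0 = events["t"][0]
    t_end = events["t"][-1]
    while t0 <= t_end:
        mask = (events["t"] >= t0) & (events["t"] < t0 + delta_t)
        yield events[mask]
        t0 += delta_t

# Tiny synthetic stream: 4 events spread over 25 ms.
dtype = [("x", np.uint16), ("y", np.uint16), ("t", np.uint64), ("p", np.int8)]
events = np.array([(10, 5, 1000, 1), (11, 5, 9000, -1),
                   (12, 6, 15000, 1), (10, 7, 25000, 1)], dtype=dtype)

slices = list(slice_events(events, delta_t=10000))  # 10 ms slices
print([len(s) for s in slices])  # -> [2, 1, 1]
```

Real SDKs expose equivalent iteration over live cameras and recorded files; the point here is only the shape of the data.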

The suite includes 95 algorithms, 67 code samples, and 11 ready-to-use applications. High-speed counting, vibration monitoring, spatter monitoring, object tracking, optical flow, ultra-slow motion, machine learning, and other algorithms are available plug-and-play. It offers both C++ and Python APIs, along with extensive documentation and a diverse set of samples organized by implementation level to introduce event-based machine vision gradually.

New features enable a faster transition to custom solutions.

The latest release includes improvements that help developers reduce time to production, allowing them to stream their first events within minutes, or even to build their own event camera from scratch using the provided open-source camera plugins as a foundation.

Developers can now port their applications to Windows or Ubuntu. By providing source code access to key sensor plugins, Metavision Intelligence 3.0 unlocks the full potential of advanced sensor features (e.g., anti-flickering, bias adjustment).

The Metavision Studio tool also improves the user experience with better onboarding guidance, a refined UI, and easier region-of-interest (ROI) and bias setup.

New Core Machine Learning modules will bridge the gap between frame-based and event-based vision systems.

An open-source event-to-video converter and a video-to-event simulator are among the core ML modules. The event-to-video converter builds grayscale images based on events using a pretrained neural network. This enables users to make the most of their existing development resources by processing event-based data and building algorithms on top of it.
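Prophesee's converter relies on a pretrained neural network; the naive sketch below is only meant to convey the underlying idea, that an intensity-like image can be reconstructed by accumulating brightness-change events per pixel. All names and parameters here are illustrative assumptions, not the module's API.

```python
# Illustrative only -- the real converter uses a pretrained neural network.
# This naive version accumulates signed event polarities per pixel into a
# grayscale frame, which conveys the basic idea of recovering intensity
# from brightness-change events.
import numpy as np

def events_to_frame(events, width, height, base=128, gain=32):
    """Accumulate (x, y, t, p) events into an 8-bit grayscale frame,
    starting from a mid-gray `base` and stepping by `gain` per event."""
    frame = np.full((height, width), base, dtype=np.int32)
    for x, y, t, p in events:
        frame[y, x] += gain * p
    return np.clip(frame, 0, 255).astype(np.uint8)

# Two positive events at (2, 1) and one negative event at (3, 1).
events = [(2, 1, 1000, 1), (2, 1, 2000, 1), (3, 1, 1500, -1)]
frame = events_to_frame(events, width=5, height=3)
print(frame[1, 2], frame[1, 3])  # -> 192 96
```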

The video-to-event pipeline breaks down the barrier of data scarcity in the event-based domain by enabling the conversion of conventional frame-based datasets to event-based datasets.
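The principle behind such simulators is well established: emit an event wherever a pixel's log intensity changes by more than a contrast threshold between frames. The minimal sketch below illustrates that principle under stated simplifications (no timestamp interpolation between frames, a single crude midpoint timestamp); it is not the suite's pipeline, and every name in it is an assumption.

```python
# A minimal sketch of a frame-to-event simulator: an event is generated
# wherever the log intensity of a pixel changes by more than a contrast
# threshold between two consecutive frames. Production simulators also
# interpolate event timestamps between frames -- omitted here for brevity.
import numpy as np

def frames_to_events(frame0, frame1, t0, t1, threshold=0.2):
    """Emit (x, y, t, p) events for pixels whose log intensity changed by
    more than `threshold` between two consecutive grayscale frames."""
    eps = 1e-3  # avoid log(0) on dark pixels
    dlog = np.log(frame1 + eps) - np.log(frame0 + eps)
    ys, xs = np.nonzero(np.abs(dlog) >= threshold)
    t_mid = (t0 + t1) / 2  # crude: timestamp all events midway
    return [(x, y, t_mid, 1 if dlog[y, x] > 0 else -1)
            for y, x in zip(ys, xs)]

f0 = np.array([[10.0, 10.0], [10.0, 10.0]])
f1 = np.array([[10.0, 20.0], [5.0, 10.0]])  # one pixel up, one pixel down
evts = frames_to_events(f0, f1, t0=0, t1=10000)
print(len(evts))  # -> 2 events: one positive, one negative
```

Applied frame by frame over an ordinary video, this kind of conversion is what lets conventional frame-based datasets be reused for training event-based algorithms.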

Developers can easily download the Metavision Intelligence Suite and begin building products leveraging Prophesee sensing technologies for free.

“We have seen a significant increase in interest and use of Event-Based Vision and we now have an active and fast-growing community of more than 4,500 inventors using Metavision Intelligence since its launch. As we are opening the event-based vision market across many segments, we decided to boost the adoption of MIS throughout the ecosystem targeting 40,000 users in the next two years. By offering these development aids, we can accelerate the evolution of event-based vision to a broader range of applications and use cases and allow for each player in the chain to add its own value,” said Luca Verre, co-founder and CEO of Prophesee.

For more information, visit prophesee.ai.

Chad Cox. Production Editor, Embedded Computing Design, has responsibilities that include handling the news cycle, newsletters, social media, and advertising. Chad graduated from the University of Cincinnati with a B.A. in Cultural and Analytical Literature.

More from Chad