With Map in Hand, Machine Learning is the Next Stop

By Jared Matkin

Global Partner Marketing, Enterprise Linux and Open Source Technology


June 24, 2019



While machine learning is still relatively new to the embedded computing space, each of the major processor vendors has its own methodology for implementing machine learning.

Once you’ve plotted your course for an IoT development, you’re ready to take it to the next level. (To be sure you’re on the right path, check out my recent blog, Chart the Right Course for Your IoT System Development.) That next level could include features such as active speaker recognition or predictive maintenance, to name a few. One technique used to implement features like these is machine learning, which lets you maximize your system’s performance by taking advantage of the available higher-performance hardware and associated software.

While machine learning is still relatively new to the embedded computing space, each of the major processor vendors (Intel, AMD, NVIDIA, and so on) has its own methodology for implementing machine learning, and each is advancing quickly with newer techniques. An implementation that employs the SUSE tools can be enabled by any of those processors. And for each, a container is available (or will be very shortly) based on a flash image that supports TensorFlow, an open-source framework for machine learning.

In just a few short years, TensorFlow has been widely adopted as a free, open-source framework by the developer community and has claimed a top position in the machine-learning landscape thanks to its ease of use and deployment across a host of platforms. The framework lets you develop the neural networks employed in machine-learning applications. The number of programming languages that work with TensorFlow continues to grow and includes all the major ones: C++, Python, Java, and JavaScript. Third-party packages are available for many others.
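As a minimal sketch of what working with the framework looks like (assuming TensorFlow 2.x, with its bundled Keras API, is installed; the layer sizes and dummy input here are illustrative only, not part of any SUSE or vendor offering), the snippet below builds a tiny neural network in Python and runs a forward pass:

```python
import numpy as np
import tensorflow as tf

# A tiny feed-forward network: 4 input features -> 8 hidden units -> 2 classes.
# The sizes are illustrative; a real model would match your sensor data.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy")

# Forward pass on a batch of three dummy readings; each row of output
# probabilities sums to 1 because of the softmax output layer.
x = np.random.rand(3, 4).astype("float32")
probs = model(x).numpy()
print(probs.shape)  # (3, 2)
```

The same model definition can be trained with `model.fit()` and later converted for constrained edge targets, which is part of what makes the framework attractive for the embedded deployments discussed below.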

It’s becoming commonplace to deploy machine learning at the Edge of the IoT rather than in the Cloud, for a variety of reasons. First, the aforementioned microprocessor capabilities are adequate to handle many of the processing needs. Second, handling machine learning at the Edge eliminates the latency associated with pushing data out to the Cloud and back. In a real-time or near-real-time application, that time savings could make or break your design. For example, if you’re designing a medical device, having an action occur in real time could mean life or death for a patient. And for autonomous driving applications, a round trip to the Cloud could result in a crash (a literal one).

Autonomous vehicles are an example of where machine learning is playing a significant role. Each of the various subsystems must be linked together.

Finally, processing the data at the Edge removes the cost associated with transmitting and receiving data to/from the Cloud. Assuming cellular is your transmission medium, costs could rise quickly when you consider the huge amount of data that is pushed around for a machine-learning application. It’s generally only the most compute-intensive applications that require the massive resources only available in the Cloud.

SUSE’s Embedded Linux operating system can be built into an end application to enable a secure, flexible, and scalable system, one that’s based on a proven, time-tested OS. As IoT system designers have discovered, the OS makes it easy to develop, maintain, grow, and manage embedded Linux IoT systems across an array of platforms, applications, and industries. Its open-source nature allows it to run on a wide variety of devices, hardware, and appliances.

While enabling developers to operate in an environment that they’ve grown accustomed to, SUSE Embedded Linux Operating Systems provide immediate access to tools and training, real-time security, and full support throughout the product lifecycle, including code fixes, patches and updates. A flexible subscription model can minimize costs and allow for product expansion.

Now you have your roadmap and your marching orders. Let machine learning guide your application and, if implemented properly, it’ll tell you what the next steps are. All that’s left is ensuring a secure connection throughout the entire IoT system, and that’s what we’ll look at in the next blog in this series.

Jared Matkin

Marketing Manager | Embedded Systems


Jared has been a part of the embedded business unit at SUSE for three years. During his time there he has served a cross-functional role, collaborating and planning with product management, solution architects, and sales management to identify embedded applications for SUSE’s open source products and solutions, while creating content and stories focused on the benefits of utilizing Linux for embedded system development.

Marketing, communications and branding professional focused on the planning and execution of initiatives that grow pipeline, solidify brand value, increase market share, drive business, and contribute to profitability. I’m a seeker of opportunity within organizations to bring people and ideas together in order to identify and develop stories and strategies that directly strengthen and support business objectives and sales targets while empowering internal teams.

More from Jared