Article

Beyond the datasheet: The no-code route to MEMS-based machine learning at the edge

Philip Ling

What's Next Magazine

Conversations about why artificial intelligence (AI) and machine learning (ML) are reshaping embedded electronics continue. It is a complex topic for engineering teams and business leaders.

As a subset of AI, machine learning is, perhaps, a little simpler to address. It can deliver technical and commercial benefits, with a lower cost to implement. New value propositions for implementing ML at the edge in industrial, commercial, medical and automotive applications are emerging regularly.

Fundamentally, ML assists in applications where the parameters are well defined. In some ways, ML is the software equivalent of an industrial robot. It has a task, it knows what that task is, and it ignores anything that isn’t related to that task. AI, on the other hand, can be thought of as the autonomous humanoid robot that is free to figure it out for itself.

Even though it may be simpler, implementing ML still involves gathering data, training a model on a large computer, and then porting that model to the target. Often, the target is a smaller processor or microcontroller with a different instruction set from the processor used for training.

The introduction of MEMS sensors with integrated machine learning capability is disrupting this approach to deploying ML. In place of a microcontroller, the sensors have an integrated and dedicated machine learning core. These cores are not programmable in the traditional sense. Engineers don’t need to port the model from one instruction set to another or integrate it into the application code.

Once the machine learning core has been trained to recognize specific activities, the MEMS sensor communicates these detections to a host via user-assigned values in a register. The values correlate to actions and are configured during training. For example, “0” may be assigned to “walking,” while “1” may be assigned to “sitting” for a wearable sensor.
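As a sketch of what this looks like on the host side, the mapping from register value to activity can be as simple as a lookup. Everything below is illustrative: the register address, the bus-read stub, and the value-to-activity mapping are assumptions of the kind configured during training, not values from an ST datasheet.

```c
#include <stdint.h>

/* Hypothetical address of the machine learning core's result register. */
#define MLC_OUTPUT_REG 0x70u

/* Stub standing in for a real I2C/SPI register read. */
static uint8_t sensor_read_reg(uint8_t reg)
{
    (void)reg;
    return 1u; /* pretend the core just classified "sitting" */
}

/* Translate the user-assigned value into the activity it was mapped to
 * during training. */
static const char *activity_name(uint8_t value)
{
    switch (value) {
    case 0:  return "walking";
    case 1:  return "sitting";
    default: return "unknown";
    }
}

/* Poll the sensor and report the current activity. */
static const char *current_activity(void)
{
    return activity_name(sensor_read_reg(MLC_OUTPUT_REG));
}
```

In a real design, `sensor_read_reg` would be an I2C or SPI transaction, and the host would typically wait for the sensor's interrupt line rather than poll.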

The recognition is based on data coming from the sensing elements and all of the processing takes place in the machine learning core. Although the sensor data is still available to the host processor, it doesn’t need to run an ML model to make sense of the MEMS sensor data.

Even though ML is still an emerging technology, this is a significant deviation from conventional development and deployment. The key to this approach is in the training and configuration of the machine learning core.

What is a machine learning core?

A machine learning core (MLC) is a dedicated hardware element, integrated into ST MEMS sensors, that processes sensor data to detect application-specific movements and actions.

Why use a machine learning core?

A machine learning core removes the need to develop, port, and run a machine learning inferencing model on the host microcontroller. It also reduces active power by decreasing the MCU workload and data transfers.

How do I use a machine learning core?

STMicroelectronics has developed MEMS-Studio, a software tool that provides a no-code, GUI-driven approach to implementing machine learning in its MEMS sensors.

 

Inside a MEMS sensor

To understand how the machine learning core “knows” what the sensors are detecting, let’s look at how a MEMS sensor operates. Inside the sensor is a tiny micromachined element. This physical mass is held in place by flexible supports, which allow it to move, or be displaced, by whatever force it is measuring.

In the case of a 3- or 6-axis MEMS sensor, the force may come from linear acceleration or rotation. The force displaces the mass, and that mechanical movement is detected electronically: it causes small but measurable changes in electrical capacitance. The resulting signal is equally small, but directly linked to the strength and direction of the force acting on the mass.

In a regular MEMS sensor, the signal is amplified, digitized and made available to a host processor. The software running on the host decides what the measurements mean. The OEM’s engineers need to write the code that evaluates the measurements and resolves them into actions, such as changes in direction or speed of travel.

In STMicroelectronics MEMS sensors equipped with a machine learning core, that evaluation is carried out on the device itself. Instead of the host receiving separate measurements, the core resolves the raw data into recognized actions or activities.

MEMS-Studio from STMicroelectronics makes it simpler to capture training data and deploy machine learning.

The ISM330DHCX (and the AEC-Q100 qualified variant, the ASM330LHHX) is an example of the STMicroelectronics MEMS sensors now equipped with its machine learning core. With an accelerometer and gyroscope, the sensor’s machine learning core can be configured to recognize sensor outputs that correspond to application-specific actions. In an activity tracker, this could include walking, running, or lying down. The automotive version could recognize when the vehicle has been involved in a collision, or that the vehicle is being lifted or jacked up. The intensity of these actions can also be used as a trigger, such as how fast a person changes direction when running (impact detection), or how quickly they sit down (fall detection). The latest generation of accelerometer (AXL) sensors, the LIS2DUX12 and LIS2DUXS12, also include the MLC, allowing them to process acceleration data locally.

For industrial applications, such as structural health monitoring in buildings (using the IIS2ICLX), or vibration analysis (using the ISM330DHCX or ISM330BX), the MLC may also be used to process raw sensor data locally.

How to train your machine learning core

STMicroelectronics has created MEMS-Studio, a software tool designed to support development with MEMS sensors. Part of the tool provides GUI-driven training of the machine learning core.


Training involves capturing and labeling real-world data from the sensors. To make the data as representative as possible, ST recommends capturing it with a device that is as close as possible to the final product in form, fit, and function.

Capturing and labeling data in this way provides MEMS-Studio with the raw information it needs to create decision trees. A decision tree is essentially a machine learning inferencing model, trained for a specific application.
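Conceptually, a trained decision tree reduces to nested comparisons on features computed from the sensor data. The features, thresholds, and class labels below are invented for the wearable example; MEMS-Studio derives the real tree from the labeled training data.

```c
/* Hypothetical decision tree for a wearable activity classifier.
 * Inputs: mean acceleration magnitude, its variance over a window,
 * and the peak magnitude in that window (all invented features). */
static const char *decision_tree(double mean_mag, double variance, double peak)
{
    if (variance < 0.02) {                      /* little movement */
        return (mean_mag > 0.9) ? "sitting" : "lying down";
    }
    if (peak > 2.5)                             /* strong impacts */
        return "running";
    return "walking";
}
```

Because each node is a single comparison, a tree like this maps naturally onto the sensor's small dedicated core, which is why no general-purpose processor is needed to run it.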

Deploying the decision tree involves sending the configuration data to the MEMS device. This is also known as flashing, because the information is stored in the on-chip flash (non-volatile) memory.

As outlined above, part of the configuration involves assigning values to actions. The host application can read these values from the MEMS sensor’s registers; the engineering team decides which values correspond to the pretrained actions. MEMS-Studio also allows the model to be optimized using the sensor’s integrated filters.

Several MEMS sensors equipped with the machine learning core are available now. Many have the capacity to store several decision trees. Talk to your Avnet representative to find out more about ST’s range of MEMS sensors and how the right solution could enable your next application to leverage the power of machine learning.

About Author

Philip Ling, Technical Content Manager, Corporate Marketing

Philip Ling is a Technical Content Manager with Avnet. He holds a post-graduate diploma in Advanced ...
