Anyone dug into Silicon Labs previously?
Trying to figure out their "in house" AI/ML accelerator to see if there are any connections.
They're not saying much about it, apparently.
Some article info below, and I know they're also connected to Edge Impulse like BRN.
Also, inference is on the chip... no cloud requirement, apparently.
Article below and their latest data sheet attached.
www.electronicproducts.com
Silicon Labs unveils Matter-ready 2.4-GHz wireless SoCs
Posted on February 8, 2022 by Gina Roos
Silicon Labs’s 2.4-GHz wireless SoCs with integrated AI/ML accelerators improve AI/ML performance with lower energy consumption.
Silicon Labs has unveiled the BG24 and MG24 families of Matter-ready 2.4-GHz wireless SoCs, built for artificial intelligence (AI) inferencing at the edge, with integrated AI/machine learning (ML) accelerators and support for multiple protocols. The company also released a new software toolkit to help developers build AI and ML algorithms using popular tool suites like TensorFlow.
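For context, here's a rough sketch of the TensorFlow-to-TFLite flow that the toolkit claim points at: build a small model, then convert it to a fully int8-quantized flatbuffer that fits a Cortex-M-class part. The model shape, layer sizes, and the random calibration data are placeholder assumptions, not anything from Silicon Labs.

```python
# Illustrative only: a generic TensorFlow -> int8 TFLite flow of the kind
# the article's "popular tool suites like TensorFlow" comment refers to.
# Layer sizes, input shape, and the calibration data are assumptions.
import numpy as np
import tensorflow as tf

# A tiny model in the keyword-spotting style (49x40 spectrogram input assumed).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(49, 40, 1)),
    tf.keras.layers.Conv2D(8, (3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(4, activation="softmax"),  # e.g. 4 wake words
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# model.fit(...) with real audio features would go here.

def representative_data():
    # Random stand-in data so the converter can calibrate int8 ranges.
    for _ in range(100):
        yield [np.random.rand(1, 49, 40, 1).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

with open("wake_word_int8.tflite", "wb") as f:
    f.write(converter.convert())
```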
The hardware and software platform was developed to bring AI/ML applications and high-performance wireless connectivity to battery-powered IoT edge devices. The new hardware also supports Matter, Zigbee, OpenThread, Bluetooth Low Energy, Bluetooth mesh, proprietary, and multi-protocol operation.
Silicon Labs claims the 2.4-GHz wireless SoCs are the first with integrated AI/ML accelerators. As the first ultra-low powered devices with dedicated AI/ML accelerators built-in, the BG24 and MG24 are said to ease performance and energy penalties when deploying AI or ML at the edge.
This is “going to start to be pretty compelling for our customers. You can do inference on end-node devices without having to push data to the cloud,” said Ross Sabolcik, VP/GM of Industrial and Commercial IoT at Silicon Labs.
The specialized hardware is designed to handle complex calculations.
Silicon Labs’s internal testing shows up to a 4× improvement in AI/ML performance along with up to a 6× improvement in energy efficiency. “Because the ML calculations are happening on the local device rather than in the cloud, network latency is eliminated for faster decision-making and actions,” said the company.
“If you’re going to do [AI] inference and you’re at the edge, power consumption becomes really important,” said Sabolcik. “When we added the accelerator a big focus was ‘okay, you can do inference at the edge, but if it kills your battery life then there’s no advantage to it.’ You might as well wake up the radio and push the data to the cloud if you have the bandwidth.”
The challenge was to make it usable within the constraints of a battery-powered edge node, he said. “We can run 4× faster doing the inferences with the engine versus the MCU core with a 6× to 10× improvement in energy efficiency.”
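To put those ratios in perspective, here's some quick back-of-the-envelope arithmetic (my numbers, not theirs): pick an assumed baseline latency and active power for the Cortex-M33 alone, and the claimed 4× and 6× factors fall straight out as time and energy per inference.

```python
# Back-of-the-envelope only: the absolute numbers below are assumptions chosen
# to illustrate the quoted "4x faster, 6x more energy efficient" claim,
# not measured values for the BG24/MG24.
mcu_latency_ms = 40.0                           # assumed inference time on the MCU core alone
mcu_power_mw = 10.0                             # assumed active power during inference
mcu_energy_uj = mcu_latency_ms * mcu_power_mw   # mW x ms = microjoules

accel_latency_ms = mcu_latency_ms / 4           # "4x faster" with the accelerator
accel_energy_uj = mcu_energy_uj / 6             # "6x" better energy per inference

print(f"MCU core:    {mcu_latency_ms:.0f} ms, {mcu_energy_uj:.0f} uJ per inference")
print(f"Accelerator: {accel_latency_ms:.0f} ms, {accel_energy_uj:.0f} uJ per inference")
```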
Not only can AI/ML improve existing applications, it also opens up new ones. “There’s a huge appetite in the industry to find ways to use AI/ML at the edge. Some of those use cases are there and maturing and others are emerging.”
A few examples cited include wake-word detection for audio, simple video analysis, security sensing such as glass-break sensors, occupancy sensing, and predictive maintenance.
More features
The 2.4-GHz wireless SoCs support proprietary and multi-protocol operation (including Zigbee, OpenThread, Bluetooth Low Energy, and Bluetooth mesh), depending on the SoC, and include larger memory options and upgraded peripherals, together with a high-performance radio.
“The fact that we have all those wireless protocols in one house makes Silicon Labs attractive to customers in that we can be a one-stop shop for applications that need to mix and match those protocols. I think that’s a real strength of ours,” said Sabolcik.
“We see ourselves as fairly agnostic,” he added. “We want customers to pick the best solution based on their needs.”
This combination of supporting all of the protocols (Matter, Thread, Zigbee, Bluetooth Low Energy, and Bluetooth mesh), multi-protocol operation, the AI/ML accelerator, larger memories for field upgradeability and futureproofing, and a rich peripheral set, all on battery power in edge-node devices, “is industry leading and pretty unique in what we’re bringing to the table,” said Sabolcik.
Part of that claim rests on the SoCs’ radio, the company’s highest-performance radio in the 2.4-GHz space, offering output power of up to 20 dBm along with “extremely” good receive sensitivity and very low power consumption, targeting line-powered or battery-powered end nodes.
“There are a couple of reasons why output power could be important,” said Sabolcik. “The first is if you’re trying to achieve a longer range. If you can broadcast at a higher output power, that can result in a longer range.”
Another reason is a constrained form factor, he added. “If your form factor is constrained and you don’t have the luxury of having a super-efficient antenna that you can put either on your board or in your case, output power can overcome some of those antenna limitations.”
A third reason is to improve your link budget, Sabolcik said. “Even if you need a shorter range, you could see better throughput and less data error.”
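The link-budget point is just arithmetic, so here's a rough sketch of it. The +20 dBm output power is from the article; the antenna gains and receiver sensitivity below are assumed example values, not datasheet numbers.

```python
# Rough link-budget sketch for the trade-off Sabolcik describes.
# Free-space path loss at 2.4 GHz; antenna gains and receiver sensitivity
# are assumed example values, not BG24/MG24 datasheet figures.
import math

def fspl_db(distance_m: float, freq_hz: float = 2.4e9) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3e8
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

tx_power_dbm = 20.0         # the article's +20 dBm output power
tx_ant_gain_dbi = -2.0      # assumed small, inefficient PCB antenna
rx_ant_gain_dbi = 0.0       # assumed
rx_sensitivity_dbm = -95.0  # assumed receiver sensitivity

for d in (10, 50, 100, 200):
    rx_dbm = tx_power_dbm + tx_ant_gain_dbi + rx_ant_gain_dbi - fspl_db(d)
    margin = rx_dbm - rx_sensitivity_dbm
    print(f"{d:4d} m: received {rx_dbm:6.1f} dBm, link margin {margin:5.1f} dB")
```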
Silicon Labs also didn’t skimp on the compute side. In addition to the high-performance 2.4-GHz radio, the single-die BG24 and MG24 SoCs combine a 78-MHz ARM Cortex-M33 processor with up to 1,536 kB of flash and up to 256 kB of RAM. The AI/ML accelerator processes the ML algorithms while offloading the ARM Cortex-M33.
The devices deliver the largest memory and flash capacity in the company’s portfolio. “This means that the device can evolve for multi-protocol support, Matter, and trained ML algorithms for large datasets,” said Sabolcik.
The company also beefed up the peripherals, in particular with a 20-bit ADC. “In addition, we have a lot of peripherals that we’ve added to this device including a very high-performance ADC to interface and sense with the world,” he said. “We’ve always had ADCs but getting it to a 20-bit ADC is new for us and opens a lot of new applications.”
Building on its security roadmap, the SoCs come with PSA Level 3 Secure Vault protection, providing the security needed for devices in smart homes, medical equipment, and industrial applications.
On the tool side, in addition to natively supporting TensorFlow, Silicon Labs has partnered with leading AI and ML tool providers like SensiML and Edge Impulse to simplify the development of machine-learning models for embedded wireless applications. These tools, together with Silicon Labs’s Simplicity Studio and the BG24 and MG24 SoCs, enable developers to create applications that use data from various connected devices, all communicating with each other using Matter to make intelligent machine learning-driven decisions, said the company.
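For anyone wondering how a trained model actually ends up on a part like this: with TensorFlow Lite for Microcontrollers, the converted .tflite flatbuffer is typically baked into the firmware as a C array. The sketch below shows that step generically in Python; the file names are assumptions, and Silicon Labs' own tooling presumably automates this inside Simplicity Studio.

```python
# Illustrative helper: embed a converted .tflite model into firmware as a
# C array, the common pattern used with TensorFlow Lite for Microcontrollers.
# File names are assumptions, not Silicon Labs tooling output.
from pathlib import Path

def tflite_to_c_array(tflite_path: str, header_path: str, name: str = "g_model") -> None:
    """Write the raw .tflite bytes out as a C source array, 12 bytes per line."""
    data = Path(tflite_path).read_bytes()
    body = ",\n  ".join(
        ", ".join(f"0x{b:02x}" for b in data[i:i + 12])
        for i in range(0, len(data), 12)
    )
    Path(header_path).write_text(
        f"// Auto-generated from {tflite_path}\n"
        f"// Real tooling also aligns this array (e.g. alignas(8)) for the flatbuffer parser.\n"
        f"const unsigned char {name}[] = {{\n  {body}\n}};\n"
        f"const unsigned int {name}_len = {len(data)};\n"
    )

# Hypothetical file names, matching the earlier conversion sketch.
tflite_to_c_array("wake_word_int8.tflite", "wake_word_model_data.h")
```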
Development and testing of the new platform are underway in a closed alpha program. The EFR32BG24 and EFR32MG24 SoCs in 5 × 5 mm QFN40 and 6 × 6 mm QFN48 packages will be available for mass deployment in April 2022. Multiple evaluation boards are available now. Modules based on the BG24 and MG24 SoCs will be available in the second half of 2022. Resources for more information about the new SoCs include a Tech Talk, “Unboxing the new BG24 and MG24 SoCs.”
Silicon Labs has introduced a wireless microcontroller with a hardware AI accelerator that can work with Matter, Zigbee, OpenThread, and Bluetooth Low Energy: www.electronicsweekly.com