BRN Discussion Ongoing

M_C

Founding Member

Silicon Labs' Most Capable Family of SoCs

The single-die BG24 and MG24 SoCs combine a 78 MHz ARM Cortex-M33 processor, high-performance 2.4 GHz radio, industry-leading 20-bit ADC, an optimized combination of Flash (up to 1536 kB) and RAM (up to 256 kB), and an AI/ML hardware accelerator for processing machine learning algorithms while offloading the ARM Cortex-M33, so applications have more cycles to do other work. Supporting a broad range of 2.4 GHz wireless IoT protocols, these SoCs incorporate the highest security with the best RF performance/energy-efficiency ratio in the market.
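To make the memory figures concrete, here is a rough, illustrative sizing check against the stated ceilings (1536 kB flash, 256 kB RAM). This is not Silicon Labs code, and the application budget numbers are assumptions chosen for the example:

```python
# Rough, illustrative sizing check for an int8-quantized ML model against
# the BG24/MG24 memory ceilings quoted above (1536 kB flash, 256 kB RAM).
# The application budget figures are assumptions, not Silicon Labs numbers.

FLASH_KB = 1536   # max flash on the family
RAM_KB = 256      # max RAM on the family

def model_fits(params, peak_activation_bytes,
               app_flash_kb=512, app_ram_kb=64):
    """int8 quantization stores one byte per parameter in flash;
    intermediate activations live in RAM at inference time."""
    model_flash_kb = params / 1024
    flash_ok = model_flash_kb + app_flash_kb <= FLASH_KB
    ram_ok = peak_activation_bytes / 1024 + app_ram_kb <= RAM_KB
    return flash_ok and ram_ok

# A small keyword-spotting CNN (~300k parameters, ~40 kB peak activations)
print(model_fits(300_000, 40 * 1024))    # fits -> True
# A 2M-parameter model overflows flash once the application is counted
print(model_fits(2_000_000, 40 * 1024))  # -> False
```

The point of the hardware accelerator is that a model of this size can run without consuming the Cortex-M33's cycles, which stay free for the application.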

Availability

EFR32BG24 and EFR32MG24 SoCs in 5 mm x 5 mm QFN40 and 6 mm x 6 mm QFN48 packages are shipping today to Alpha customers and will be available for mass deployment in April 2022. Multiple evaluation boards are available to designers developing applications. Modules based on the BG24 and MG24 SoCs will be available in the second half of 2022.

To learn more about the new BG24 family, go to: http://silabs.com/bg24.
To learn more about the new MG24 family, go to: http://silabs.com/mg24.
To learn more about how Silicon Labs supports AI and ML, go to: http://silabs.com/ai-ml.



AI at the Edge​

Why Very Edge?​

  • Ever-increasing demand for small, integrated solutions
  • High-volume, cost-sensitive markets require cost-effective edge solutions
  • Battery-powered devices need lower power consumption
  • Small form factor requirements for size-constrained devices
  • Increased security: data never leaves the sensing device

Benefit Examples of Artificial Intelligence and Machine Learning at the Very Edge


Optimized Bandwidth​

Edge device sensors can generate vast quantities of raw data and therefore consume large amounts of bandwidth. Long-range, low-power communication links also tend to have limited bandwidth by design. AI/ML-enabled end nodes can pre-process data and transmit only what matters – helping to reduce bandwidth.
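As a sketch of that idea (generic, self-contained Python, not Silicon Labs code; the "model" is faked with an energy threshold where a real node would run a trained network), an end node can classify each window of raw sensor data locally and put only events of interest on the uplink:

```python
# Illustrative sketch: an AI/ML-enabled end node classifies each window of
# raw accelerometer data locally and transmits only events of interest,
# instead of streaming every raw sample.
import struct

WINDOW = 200                 # samples per 2 s accelerometer window
RAW_BYTES = WINDOW * 3 * 2   # 3 axes, int16 per axis

def classify(window):
    """Stand-in for an on-device ML model: returns an event label or None."""
    mean_energy = sum(x * x + y * y + z * z for x, y, z in window) / len(window)
    return "shock" if mean_energy > 1.0 else None

def uplink_payload(window):
    label = classify(window)
    if label is None:
        return b""                               # nothing worth sending
    return struct.pack("B", 1) + label.encode()  # event code + label

quiet = [(0.0, 0.0, 0.1)] * WINDOW
bump = [(0.5, 0.9, 1.2)] * WINDOW
print(RAW_BYTES, len(uplink_payload(quiet)), len(uplink_payload(bump)))
# 1200 raw bytes per window vs 0 or 6 bytes on the uplink
```

The raw stream would cost 1200 bytes per window; the classified stream costs nothing in the quiet case and a few bytes per event, which is what makes constrained long-range links workable.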


Faster Design Time​

Specialized AI modeling software creates models that run on small application MCUs, avoiding the complicated hand-written code typically required to detect subtle differences in raw data.
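As a toy illustration of that workflow (generic Python with fabricated data, not any vendor's tooling): rather than hand-tuning thresholds to separate two vibration signatures, a tiny model learns the boundary from labelled examples – the step that AI modeling tools automate and then compile for MCU targets.

```python
import random

random.seed(0)
# Fabricated features: (mean amplitude, dominant-frequency bin)
normal = [(random.gauss(0.2, 0.05), random.gauss(3.0, 0.5)) for _ in range(50)]
fault = [(random.gauss(0.8, 0.05), random.gauss(7.0, 0.5)) for _ in range(50)]
data = [(f, 0) for f in normal] + [(f, 1) for f in fault]

# A one-neuron "model" trained with simple perceptron updates
w1 = w2 = b = 0.0
for _ in range(20):                      # a few passes over the data
    for (f1, f2), y in data:
        pred = 1 if w1 * f1 + w2 * f2 + b > 0 else 0
        err = y - pred                   # -1, 0 or +1
        w1 += 0.1 * err * f1
        w2 += 0.1 * err * f2
        b += 0.1 * err

accuracy = sum(
    (1 if w1 * f1 + w2 * f2 + b > 0 else 0) == y for (f1, f2), y in data
) / len(data)
print(accuracy)
```

On real silicon the trained network would be quantized and exported to the device; the point is only that the detection logic is learned from data rather than hand-written.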


Smaller Design & Low-Power​

AI/ML-based processing adds functional benefits and capabilities without adding to the memory footprint or MCU requirements, since code size tends to be smaller than with traditional algorithms. Local processing also reduces current consumption, as radio communications are reduced.


Low Latency​

Captured data is processed on the spot without being sent to an aggregator on the network – this enables real-time operation.


Privacy, IP Protection & Security​

Since the vast majority of data is never shared outside the device, bad actors have less data with which to mount attacks. And because raw data never leaves the device, privacy and intellectual property protection is highly effective.


Offline Mode Operation​

Since there is no need for external computing, local processing enables full offline-mode operation.


Cost Reduction​

As data is processed locally, the cost of data traffic, processing, and storage can be significantly reduced.
 

Proga

Regular
Hi Dhm

I have not heard this question asked before and as a result have not heard any answers however I would think we need answers to the following as well:

The first question is: what is the Edge? Brainchip defines the Edge, where EVs are concerned, as the whole vehicle, and anything that is not done on the vehicle is not being done at the Edge.

The second question is: what is the cost of the computing being done on a Tesla? This cost has components such as the actual cost of the architecture being used and the power consumed by it to perform these functions, and on this side, to what extent does the physical weight of the architecture impact the amount of power available for the driving wheels?

Certainly for most functions Tesla runs a connected network and is constantly exchanging data with the cloud, however some compute must be occurring on the vehicle, which, using Brainchip's definition, is at the Edge.

The following though begs the question of how Edge is Tesla's Edge:
(from September, 2021)
"TESLA MAKES CARS. Now, it’s also the latest company to seek an edge in artificial intelligence by making its own silicon chips.

At a promotional event last month, Tesla revealed details of a custom AI chip called D1 for training the machine-learning algorithm behind its Autopilot self-driving system. The event focused on Tesla’s AI work and featured a dancing human posing as a humanoid robot the company intends to build.

Tesla is the latest non traditional chipmaker to design its own silicon. As AI becomes more important and costly to deploy, other companies that are heavily invested in the technology—including Google, Amazon, and Microsoft—also now design their own chips.

At the event, Tesla CEO Elon Musk said squeezing more performance out of the computer system used to train the company’s neural network will be key to progress in autonomous driving. “If it takes a couple of days for a model to train versus a couple of hours, it’s a big deal,” he said.

Tesla already designs chips that interpret sensor input in its cars, after switching from using Nvidia hardware in 2019. But creating a powerful and complex kind of chip needed to train AI algorithms is a lot more expensive and challenging.

“If you believe that the solution to autonomous driving is training a large neural network, then what followed was exactly the kind of vertically integrated strategy you’d need,” says Chris Gerdes, director of the Center for Automotive Research at Stanford, who attended the Tesla event.

Many car companies use neural networks to identify objects on the road, but Tesla is relying more heavily on the technology, with a single giant neural network known as a “transformer” receiving input from eight cameras at once.

“We are effectively building a synthetic animal from the ground up,” Tesla’s AI chief, Andrej Karpathy, said during the August event. “The car can be thought of as an animal. It moves around autonomously, senses the environment and acts autonomously.”

Transformer models have provided big advances in areas such as language understanding in recent years; the gains have come from making the models larger and more data-hungry. Training the largest AI programs requires several million dollars worth of cloud computer power.

David Kanter, a chip analyst with Real World Technologies, says Musk is betting that by speeding the training, “then I can make this whole machine—the self-driving program—accelerate ahead of the Cruises and the Waymos of the world,” referring to two of Tesla’s rivals in autonomous driving.

Gerdes, of Stanford, says Tesla’s strategy is built around its neural network. Unlike many self-driving car companies, Tesla does not use lidar, a more expensive kind of sensor that can see the world in 3D. It relies instead on interpreting scenes by using the neural network algorithm to parse input from its cameras and radar. This is more computationally demanding because the algorithm has to reconstruct a map of its surroundings from the camera feeds rather than relying on sensors that can capture that picture directly.

But Tesla also gathers more training data than other car companies. Each of the more than 1 million Teslas on the road sends back to the company the videofeeds from its eight cameras. Tesla says it employs 1,000 people to label those images—noting cars, trucks, traffic signs, lane markings, and other features—to help train the large transformer. At the August event, Tesla also said it can automatically select which images to prioritize in labeling to make the process more efficient.

Gerdes says one risk of Tesla’s approach is that, at a certain point, adding more data may not make the system better. “Is it just a matter of more data?” he says. “Or do neural networks’ capabilities plateau at a lower level than you hope?”

Answering that question is likely to be expensive either way."

My opinion only DYOR
FF

AKIDA BALLISTA

Hoping you guys can answer me this: last night I was out with friends and I was telling them about Brainchip and its multi-sector advantages, notably in last night's conversation, with EVs. I was asked about Teslas, for example. What percentage - roughly - of Tesla computing is currently Edge based, and what is Cloud based?
Tesla's new AI system is called Dojo. However, Musk said on the last earnings call that he is still trying to use GPUs to run ADAS and is waiting on the GPU team to tell him to switch to Dojo. Tesla is also having problems getting Dojo to work.

 
Thank you, I found that source as well. I like the animal analogy; however, I see little evidence that the 'transformer' is anything more than a 'gatherer' that passes the data it receives on to the cloud. I plan to phone Tesla here in Sydney tomorrow about this. My bet is the response will be 'umm, err....'
Yes, I think that might be the answer as well. Dojo, when they get it up and running, is most certainly designed to be nowhere but in the data centre, taking away even more of the compute from the vehicle at the Edge.

My opinion only DYOR
FF

AKIDA BALLISTA
 

Diogenese

Top 20
Hi FF,

Each of the more than 1 million Teslas on the road sends back to the company the videofeeds from its eight cameras. Tesla says it employs 1,000 people to label those images—noting cars, trucks, traffic signs, lane markings, and other features—to help train the large transformer. At the August event, Tesla also said it can automatically select which images to prioritize in labeling to make the process more efficient.

We know about bias in other AI applications. Will this bias the Tesla data to more affluent areas ... nicely paved roads ... well maintained ... well signposted ...
 

Jefwilto

Regular
We know about bias in other AI applications. Will this bias the Tesla data to more affluent areas ... nicely paved roads ... well maintained ... well signposted ...
Yes Dio, what about Teslas that drive on the LHS 😂 this will require a whole new data set 🙄
 
Around my area it seems to bias them to parking across the lines defining the space you park in. LOL FF
 
We know about bias in other AI applications. Will this bias the Tesla data to more affluent areas ... nicely paved roads ... well maintained ... well signposted ...
Reminds me actually of those movies like Return to Oz, where the all-seeing, all-knowing one was a tiny, rather funny little man with a microphone.

In this case it is 1,000 people labelling new photos as fast as they possibly can so someone with a US$135,000 car can play with their phone and not watch the road when driving on a freeway.

All speaks of Roman decadence just before the fall. LOL

My opinion only DYOR
FF

AKIDA BALLISTA
 


Diogenese

Top 20
Around my area it seems to bias them to parking across the lines defining the space you park in. LOL FF
That's why they invented sliding roofs.
 

Diogenese

Top 20
Reminds me actually of those movies like Return to Oz, where the all-seeing, all-knowing one was a tiny, rather funny little man with a microphone.
Now leave @Violin1 out of this ...
 

hamilton66

Regular
Why did Mercedes announce they were working with us in January this year? I'm sure others on here have thought about this. They had no pressure to announce this and by not announcing this (at least until it was already inside commercial vehicles, if ever) they would have had potential to greatly extend any market lead. They could have simply said they were using advanced AI technologies to reduce power draw without mentioning Brainchip.

Here's what I think:
-The CEO Sean started in November. He has two main focuses: extending the market lead and getting the word out about Brainchip.
-Before January there hadn't been new customer announcements for a while. No customers wanted to say they were working with Brainchip as the tech is a key differentiator. Then suddenly there were two, namely Mercedes and ISL. I don't think this is a coincidence.
-Sean is very limited about what he can talk about in his NDAs. This makes it difficult to promote Brainchip
-In Sean's first few public appearances he would likely have wanted to project a vibe of a highly successful, top tier technology. As mentioned above the NDAs keep getting in the way so he needed a way around this.
-Maintaining strong investor interest is crucial as a higher share price means a higher valuation, greater international credibility, and an increased ability to do things like strategic acquisitions. Again, NDAs have been limiting this and affecting some investor decisions.
-In Sean's most recent presentation he said he would be negotiating with other companies to have the ability to display their logos. This indicates they won't necessarily say what they're working on with the customer, even just the logo would have a big impact. ISL was probably negotiated with NASA as a non-lethal defense application (non-lethal being critical for Brainchip's image). Using Mercedes as an example, they've probably only negotiated the ability to talk about the voice recognition system (a fairly basic / common use case). We probably won't hear about other things they are working on, especially if it's something big like Level 4 or 5 autonomous driving.
-Negotiating with a company means that whatever the outcome, it should work in both companies favour. For Mercedes to agree to announce Brainchip, they likely would have got something in return. This could be something like more engineering support, a discount on licensing fees, lower royalty fees, or the earliest access to AKD2000 etc.
-This desire to negotiate the ability to display EAP logos is probably being strongly driven by Sean. Brainchip are at a stage where they need to carefully balance increased investor interest with the revenue stream and the cash drain from increased hires.
-I think Sean said something in the Q&A about how he will only announce material contracts if the customer allows it. I think he'll try to push for a few more public announcements this year as it aligns strongly with his strategy. Valeo is a strong contender for this with their lidar IMO. Early announcements will help project his image as a talented and successful CEO. Though he may try to space these announcements out to ensure regular news flow to investors and to maintain interest, but without impacting on negotiations too heavily.

Also of interest is how in the recent interview Sean confirmed Megachips / Renesas are being used to hide which companies are using Brainchip IP (my words). This confirms what FF had been suggesting for quite some time.

Also worth mentioning is how the CEO is less forthcoming about NDAs than others in the company. I think this will become more of the norm in future Brainchip presentations, including those by other presenters going forward as Sean will push for this. He needs customers to trust he runs a tight ship and seems to be setting a high bar.

Pure speculation, DYOR
As unfortunate as it is, hacking is now a tool of war. Given the current state of play, cybersecurity is going to be paramount for every govt and every big business going forward. BRN are in the box seat. Hoping our genius mgt team are putting plenty of focus into this area. The rewards of attaining market leader status in this field alone will be mind boggling.
GLTA
 

Diogenese

Top 20
As unfortunate as it is, hacking is now a tool of war. Given the current state of play, cybersecurity is going to be paramount for every govt and every big business going forward. BRN are in the box seat. Hoping our genius mgt team are putting plenty of focus into this area. The rewards of attaining market leader status in this field alone will be mind boggling.
GLTA
We can rest easy.

ScoMo is providing cybersecurity for Ukraine ...

https://www.smh.com.au/national/why...u-can-do-to-help-ukraine-20220202-p59tb5.html
 


We can rest easy.

ScoMo is providing cybersecurity for Ukraine ...

https://www.smh.com.au/national/why...u-can-do-to-help-ukraine-20220202-p59tb5.html
I think you are becoming political.

Name me one example of where the grand vision of an Australian leader including Scomo has been hacked and publicly revealed?

The cybersecurity protecting our Australian leaders has been impenetrable for decades.

My opinion only DYOR
FF

AKIDA BALLISTA
 
The cybersecurity protecting our Australian leaders has been impenetrable for decades.
On a serious note, Brainchip does provide an advanced cybersecurity solution basically off the shelf, which can be read about on their website under the heading 'Applications'.

My opinion only DYOR
FF

AKIDA BALLISTA
 

Diogenese

Top 20
I am sure our ScoMo will do really well with the security issues. After all safety and security is his Trump Card.
I saw this earlier this morning, still laughing lol

I think ultraviolet is just his colour.

No wonder his banjo was so hot ...
 

zeeb0t

Administrator
Staff member
In the world of software, technology and cybersecurity, no news is good news. I guess at first I think - hmm, Australia helping with cybersecurity. That's odd. Is it even our strong suit? But then again, for the most part we never really hear too much in relation to massive leaks or exploits. So I suppose the old adage is true and maybe we are in a position to help in that respect.
 
I think ultraviolet is just his colour.

No wonder his banjo was so hot ...
It just occurred to me why Australia is such a great country: we have a sense of freedom and irreverence so woven into our society's fabric that our political heads are nicknamed:

Albo and Scomo

In Russia, the US, the UK, China they have:

Putin
Biden
Boris
Ping

How much better would they be if their leaders were nicknamed:

Puto
Bido
Borro
Pingo.

Problem solved, now we just need to work out how to get them to adopt this approach to leadership. LOL

FF
 

Diogenese

Top 20
Name me one example of where the grand vision of an Australian leader including Scomo has been hacked and publicly revealed?

The cybersecurity protecting our Australian leaders has been impenetrable for decades.
Like Mardi Gras chaps, my chagrin is bottomless.

Petrov,
reds under the bed,
Combe,
Khemlani,
...

all just in time for elections.

Not to forget the yellow peril ...

[Combe (David), not Coombs]
 