Breaking News- new BRN articles/research

Kb23

Emerged
BREAKING NEWS:

I have been informed directly by Mr. Tony Dawe that he is now an official Brainchip company observer of this platform. He also informed Zeebot of this when he registered this morning.

At this stage he will be an observer only and taking notes.

He will not be replying to questions on this forum; questions should continue to be referred to him by email or telephone, not via this platform's private messaging service.

There is clearly acceptance that there is a need for a safe place for Brainchip shareholders and potential investors to congregate and exchange information.

So it becomes even more important for all shareholders to accept that how we present ourselves here could eventually become an asset for ourselves, the share price and Brainchip.

Brainchip has high ethical standards and goals, promotes itself in the beneficial AI space, and, as very much the new kid on the international block, has to prove itself every day.

An ethical, responsible shareholder base is just another part of this puzzle. That does not mean we cannot engage in robust debate; in fact we should, but we should do it in a fashion which reflects well on ourselves and the company.

Finally, as we all know, the explosion that occurred last week sent shrapnel everywhere, even around the globe, but in particular it blew the doors off Tony Dawe's office. At 10pm last night I received an email from Tony Dawe asking if I could fill him in comprehensively on what had taken place. At 11.15pm I sent him my reply, and he replied almost immediately asking if he could share my communication with Ken Scarince. We have had multiple further communications since then, and I can assure everyone that he is completely up to speed with all the goings-on. I do not intend to release these communications; suffice it to say that they would fully expose my private identity and that of my son. There are enough members, including zeebot, who know those details, so if anyone has concerns that I am not what I appear to be, it can be confirmed. Jesse Chapman comes to mind as someone who knows my identity and background.

No one chose the timing of this explosion, least of all Tony Dawe, who informs me that he was, and is, very much in the middle of a major project needing his full attention.

So, on his behalf, can I ask that we move on from the past, allow Tony Dawe to get on with what we as shareholders want him to be doing on our behalf, and leave any further discussion of these events with him until the inevitable social gatherings that will take place.

My opinion only DYOR
FF

AKIDA BALLISTA
Thanks for your tireless efforts FF. Welcome Tony
 
  • Like
Reactions: 16 users

Stockbob

Regular
Many Thanks @Fact Finder for your time and effort for organising this and @zeeb0t, for getting this site up and running and the whole BRN community for your selfless efforts.
 
  • Like
Reactions: 14 users

Fox151

Regular
  • Like
  • Haha
Reactions: 18 users

Ahboy

Regular
Not BRN but still somewhat noteworthy given how we have a "sweet spot" in this space.

Feb. 07, 2022 4:17 PM ET · Amazon.com, Inc. (AMZN), VLDR, AUR, AEVA, MVIS, OSTR, LIDR · By: Joshua Fineman

Velodyne Lidar surges after Amazon pact with sensor maker​

Velodyne Lidar (NASDAQ:VLDR) surged 87% in after-hours trading after the sensor maker disclosed a pact with Amazon (NASDAQ:AMZN), including a warrant for the Internet retail giant to purchase up to 39.6M of the company's shares.

The vesting of the warrant shares is based on discretionary payments made by Amazon pursuant to existing commercial agreements between Velodyne and Amazon, according to an 8-K filing. The warrant may be exercised any time before Feb. 4, 2030 at an exercise price of $4.18 per share, which was determined based on the 30-day volume-weighted average price for the common stock as of Feb 3.
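
The 30-day volume-weighted average price (VWAP) mechanism used to set the exercise price is straightforward to illustrate. A minimal sketch follows; the prices and volumes are made-up placeholder data, not Velodyne's actual trading history:

```python
# Illustrative VWAP calculation, as used to set the warrant's
# exercise price. All figures below are hypothetical.

def vwap(prices, volumes):
    """Return the volume-weighted average price."""
    total_volume = sum(volumes)
    return sum(p * v for p, v in zip(prices, volumes)) / total_volume

# Three hypothetical trading days:
prices = [4.00, 4.20, 4.30]
volumes = [1_000_000, 2_000_000, 1_000_000]

print(round(vwap(prices, volumes), 4))  # → 4.175
```

Heavier-volume days pull the average toward their price, which is why a VWAP rather than a simple mean is used for benchmarks like this.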

The warrant shares will vest over time based on discretionary payments to Velodyne by Amazon (whether made directly from Amazon, its affiliates or by a third party on behalf of Amazon) of up to $200M, according to the filing.

The positive news from Amazon comes after the stock plunged almost 85% over the past year amid internal struggles at the company with the former CEO stepping down in July. The ousted founder of Velodyne in August called for the resignation of the sensor company's board chairman and a director.

Stocks of other lidar companies also gained on the news in after hours trading with AEye Inc. (NASDAQ:LIDR) +16%, Ouster (NASDAQ:OSTR) +3.4%, Luminar Technologies +3.8%, MicroVision (NASDAQ:MVIS) +4%, Aeva Technologies (NYSE:AEVA) +2.4% and Aurora Innovation (NASDAQ:AUR) +1%.
 
  • Like
  • Thinking
Reactions: 15 users

Fox151

Regular
On the topic of LIDAR - The Honda Legend Hybrid EX sedan’s Level 3 automated-driving system has five Valeo lidar sensors, five radar units and two cameras.

Pretty safe bet that means each car has at least 5x BRN chips or IP included...

Lidar Makers Target Proactive Safety As Automated Driving Remains On The Horizon​

Sam Abuelsamid
Senior Contributor, Transportation
A lifetime in the car business, first engineering, now communicating

Webasto Group’s Roof Sensor Module with Luminar’s Iris as the Primary Lidar Sensor (LUMINAR)
So-called advanced driver assistance systems (ADAS) could prove to be the real beneficiary as widespread adoption of robotaxis and other automated vehicles (AVs) remain stuck in small scale pilot programs around the world. While engineers continue to work on AVs and the supporting technologies to remove the human driver, those bits and pieces are finding their way into the vehicles we will be buying in the coming years. At next week’s IAA Mobility show in Munich, Germany, Luminar will be demonstrating how its high-performance lidar sensors can be utilized for ADAS to make roads safer.

There are already several vehicles going on sale this year that are utilizing lidar sensors for ADAS, with more expected in 2022 and beyond. Honda is selling a limited number of Legend sedans in Japan with a level 3 conditionally automated system that utilizes five Valeo Scala lidar sensors. Mercedes-Benz is also using one of those sensors for its L3 Drive Pilot on the new S-Class and EQS launching later this year in Germany, while Xpeng’s new P5 sedan uses a pair of Livox sensors and Toyota is using a Denso lidar on its hands-free Teammate system.

However, all of those L2 and L3 systems that allow the driver to take their hands off the steering wheel under certain driving conditions are actually convenience features rather than strictly focused on enhancing safety. Luminar’s focus is on the latter, as CEO Austin Russell has been promoting the concept of proactive safety. Unlike airbags and seatbelts, which are reactive, helping to protect vehicle occupants after an impact, proactive safety involves trying to prevent crashes in the first place.
Early active safety systems included anti-lock brakes, traction control and stability control, all of which were designed to help the driver maintain control and make the vehicle respond in the way the driver intended. Those systems had limited ability to perceive the environment using wheel speed sensors, accelerometers, yaw rate and other data about what the driver was requesting and how the vehicle was responding to the road conditions.
Modern ADAS takes this a step further with cameras, radar and now lidar that can “see” away from the interface between the tires and the road. Cameras and increasingly radar have become ubiquitous on mainstream vehicles in the past five years but they have significant limitations. In most cases, vehicles have just one forward facing camera which means they can classify objects, but they have to rely on inherently problematic machine learning approaches to guess how far away objects are. As an active sensor, radar is better, but current low cost radar sensors have very low resolution and limited ability to distinguish different targets. New imaging radar sensors are much better and should start arriving in some vehicles by the end of 2021.

Lidar is also an active sensor, sending out its own light pulses and measuring the reflections with much higher resolution than even the best imaging radar. Luminar will be demonstrating a Lexus RX equipped with its Hydra lidar sensors for pedestrian detection and automatic emergency braking (AEB). In a video published by the company, it can be seen alongside a Tesla Model X and an Audi A5 approaching “walking” pedestrian mannequins. While the Tesla and Audi can be seen braking, the response always comes too late to avoid striking the pedestrians. The Luminar-equipped Lexus stops short before impact every time.
 
  • Like
Reactions: 34 users
I think it is worthwhile repeating part of the document you found. They all have a big problem that a certain AKD1000 can solve. It is latency and connection. Remember what Rob Telson said even the speed of light is not sufficient emergency reactions require real time response.
My opinion only DYOR
FF

AKIDA BALLISTA

"Modern ADAS takes this a step further with cameras, radar and now lidar that can “see” away from the interface between the tires and the road. Cameras and increasingly radar have become ubiquitous on mainstream vehicles in the past five years but they have significant limitations. In most cases, vehicles have just one forward facing camera which means they can classify objects, but they have to rely on inherently problematic machine learning approaches to guess how far away objects are. As an active sensor, radar is better, but current low cost radar sensors have very low resolution and limited ability to distinguish different targets. New imaging radar sensors are much better and should start arriving in some vehicles by the end of 2021.

Lidar is also an active sensor, sending out its own light pulses and measuring the reflections with much higher resolution than even the best imaging radar. Luminar will be demonstrating a Lexus RX equipped with its Hydra lidar sensors for pedestrian detection and automatic emergency braking (AEB). In a video published by the company, it can be seen alongside a Tesla Model X and an Audi A5 approaching “walking” pedestrian mannequins. While the Tesla and Audi can be seen braking, the response always comes too late to avoid striking the pedestrians. The Luminar-equipped Lexus stops short before impact every time."
 
  • Like
  • Fire
Reactions: 39 users

Kaktus112

Member
I found a new German Article
Arzt und Wissenschaft (engl.: Doctor and Science)

Topic: Artificial Intelligence - BrainChip is mentioned, also Neuralink (Elon Musk)


(Translated with Google)
Sales of AI chips are going through the roof
[...]
This makes new chips necessary. The market research institute Gartner already sees sales of AI chips at 35.5 billion US dollars. With new technology such as autonomous driving and areas of application such as the Internet of Things, demand is likely to increase. According to Gartner, sales could reach $73 billion by 2025.

New chip architectures are used here. For example, BrainChip Holdings (ISIN: AU000000BRN8), listed on the Nasdaq, is modeled on the human brain. It promises self-learning chips that, under certain circumstances, can do without additional hardware. They make decisions at the chip level. The architecture should ensure that, in addition to an extremely low energy requirement, only small data volumes are exchanged.

That sounds perfect for the oft-cited vision of AI everywhere, even "on the milk can" (a German idiom for the most remote places). The company has already produced a first small series and is presenting its chip to potential customers.
[...]

(Opinion of the author)
Don't get caught up in the gold rush mood


The general conditions surrounding AI chips are promising for investors: the market is growing, and there is also a good deal of takeover speculation. Such conditions typically create a gold-rush mood among founders. However, history teaches that most prospectors end up with nothing. For this reason, investors should diversify their investments across chip stocks for specialised areas of application.

In doing so, they should not forget that the major manufacturers already in place may be the best judges of which technology holds promise over the long term.
 
  • Like
Reactions: 20 users

Mazewolf

Regular
"New chip architectures are used here. For example, BrainChip Holdings (ISIN: AU000000BRN8), listed on the Nasdaq, is modeled on the human brain" should be >> "soon to be listed on the NASDAQ"
-fixed ;)
 
  • Like
Reactions: 29 users

Brubaker

Regular
Hot off the wires:
Nvidia is no longer buying Arm from SoftBank.
The collapse of that $40 billion deal has just been announced.
Arm hints at AI, but through the cloud.
Akida is superior and does not need the cloud, as it operates at the edge using low power.
These deals, whether they succeed or fail, demonstrate the immense dollar value that will be at play throughout this year, and as I see it Akida is in the box seat. This area of technology is absolutely on fire right now.
What happens throughout 2022 in this arena will blow your minds.

 
  • Like
  • Fire
Reactions: 17 users
Just another recent article to get the story out there, balanced with some competitors.



New Computer Chips Could Process More Like Your Brain Does​

Your gadgets may feel more ‘naturally’ smarter​

By
Sascha Brodsky

Published on January 20, 2022 10:26AM EST
Fact checked by
Jerri Ledford

Key Takeaways​

  • Chips based on the architecture of the human brain could help make gadgets smarter and more power-efficient.
  • BrainChip recently announced its Akida neural networking processor.
  • Mercedes uses the BrainChip processor in its new Mercedes Vision EQXX concept car, promoted as "the most efficient Mercedes-Benz ever built."
Outline of two human heads, one with a regular brain, the other with a computer chip brain.

Science Photo Library - PASIEKA / Getty Images
A new generation of smartphones and other gadgets could be powered by chips designed to act like your brain.

BrainChip recently announced its Akida neural networking processor. The processor uses chips inspired by the spiking nature of the human brain. It's part of a growing effort to commercialize chips based on human neural structures.

The new generation of chips could mean "more deep neural network processing capability in the future on portable devices, e.g., smartphones, digital companions, smartwatches, health monitoring, autonomous vehicles and drones," Vishal Saxena, a professor of electrical and computer engineering at the University of Delaware, told Lifewire in an email interview.

Brains on a Chip​

BrainChip says the new boards could help usher in a new era of remote AI, also known as edge computing, due to their performance, security, and low power requirements.

By mimicking brain processing, BrainChip uses a proprietary processing architecture called Akida, which is both scalable and flexible to address the requirements in edge devices. At the edge, sensor inputs are analyzed at the acquisition point rather than through transmission via the cloud to a data center.

"I am excited that people will finally be able to enjoy a world where AI meets the Internet of Things," said Sean Hehir, BrainChip CEO, in the news release. "We have been working on developing our Akida technology for more than a decade, and with the full commercial availability of our AKD1000, we are ready to fully execute on our vision. Other technologies are simply not capable of the autonomous, incremental learning at ultra-low power consumption that BrainChip's solutions can provide."

The Mercedes Vision EQXX

The Mercedes Vision EQXX.
Mercedes
Mercedes uses the BrainChip processor in its new Mercedes Vision EQXX concept car, promoted as "the most efficient Mercedes-Benz ever built." The vehicle incorporates neuromorphic computing to help reduce power consumption and extend vehicle range. BrainChip's Akida neuromorphic chip allows in-cabin keyword spotting instead of using power-hungry data transmission to process instructions.

One significant advantage to chips designed like a brain, also called neuromorphic design, is potential power savings. Although researchers understand very little about the basis of cognition, a human brain only consumes around 20 watts of energy, Saxena said.

"This is due to the fact that the brain performs 'in memory computing' and communication using spikes in an event-driven fashion, whereby energy is only consumed when a spike is emitted," he added.

Neuromorphic chips are a good fit for processor-intensive tasks like deep learning AI computers because they use much less power. The chips could also be helpful for edge devices like smartphones where battery power is limited, Saxena said.

Future Chip Brains​

BrainChip is one of many start-ups focusing on brain-inspired chips, called neuromorphic design, including SynSense and GrAI Matter Labs. Intel is working on its Loihi neuromorphic chip, but it's not yet available for purchase.



The international research group IMEC in Belgium develops neural networks to build better audio devices, radar, and cameras that react to certain events.

Neural chips offer "the ability of on-line learning, making sensing systems adaptive to real-world variations (think of changing light conditions for cameras or variations person-to-person for wearables)," Ilja Ocket, a program manager at IMEC, told Lifewire in an email interview.

Neuromorphic chips could also allow computers to see like humans. Prophesee is applying neuromorphic techniques to vision processing. The company's approach is called event-based vision: it captures and processes only the information that changes in a scene, much as humans do, instead of the continuous stream of data for the entire scene that conventional cameras produce.
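
A toy sketch of the event-based idea: emit per-pixel events only where brightness changes beyond a threshold, rather than shipping whole frames. Real event cameras such as Prophesee's do this asynchronously in the sensor itself; this frame-differencing version is only an approximation of the concept:

```python
# Toy event-based vision: compare two grayscale frames and emit
# (row, col, polarity) events only for pixels whose brightness
# changed by at least `threshold`. Unchanged pixels cost nothing.

def frame_events(prev, curr, threshold=10):
    """Yield sparse change events between two frames (lists of rows)."""
    events = []
    for r, (row_prev, row_curr) in enumerate(zip(prev, curr)):
        for c, (a, b) in enumerate(zip(row_prev, row_curr)):
            if b - a >= threshold:
                events.append((r, c, +1))   # brightness increased
            elif a - b >= threshold:
                events.append((r, c, -1))   # brightness decreased
    return events

prev = [[100, 100], [100, 100]]
curr = [[100, 120], [ 85, 100]]
print(frame_events(prev, curr))  # → [(0, 1, 1), (1, 0, -1)]
```

With only two of four pixels changing, two events are emitted; a static scene emits none, which is why event-based pipelines scale with scene activity rather than resolution times frame rate.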

Neuromorphic chips could one day enable more intelligent sensors in devices like smart wearables, AR/VR headsets, personal robots, and robot taxis, Ocket said. The new chips could perform local AI tasks to learn from and adapt to local and changing environments.

"All this without the need for cloud communication, hence enabling built-in privacy," he added.
 
  • Like
  • Fire
Reactions: 36 users
@Fact Finder or anyone else.

I did a quick search here and on HC, before posting this, and saw your post in Jan about an Akida USB release.

Don't know much about it but just found this site listing the PCIe board, software downloads, plus a separate PCIe USB v2.

Is that what you were discussing at the time? If so, it could be out now or about to be released. No image available on the site as yet.

Or am I off base and missed reading something?

They have the board and software downloads on other pages for BRN products.




[Attached screenshots of the site's product listing pages, captured 2022-02-09]
 
  • Like
Reactions: 15 users

Slade

Top 20
Yes, BrainChip has talked about releasing an Akida as a USB dongle. It could well be released to market soon. We haven’t heard it mentioned for a while.
 
  • Like
  • Sad
Reactions: 20 users

Slade

Top 20
  • Like
  • Fire
Reactions: 20 users
Apols for the format, as I had to use a cached copy to get the full article since I'm not a subscriber.

Original & credit:




What’s So Exciting About Neuromorphic Computing​

By Aaryaa Padhyegurjar
February 10, 2022

https://www.electronicsforu.com/technology-trends/neuromorphic-computing/amp#

The human brain is the most efficient and powerful computer that exists. Even after decades and decades of technological advancements, no computer has managed to beat the brain with respect to efficiency, power consumption, and many other factors.

Will neuromorphic computers be able to do it?

The AKD1000-powered Mini PCIe board (Source: BrainChip Inc.)

The exact sequence of events that take place when we do a particular activity on our computer, or on any other device, completely depends on its inherent architecture. It depends on how the various components of the computer like the processor and memory are structured in the solid state.

Almost all modern computers we use today are based on the Von Neumann architecture, a design first introduced in the late 1940s. In it, the processor is responsible for executing instructions and programs, while the memory stores those instructions and programs.


When you think of your body as an embedded device, your brain is the processor as well as the memory. The architecture of our brain is such that there is no distinction between the two.

Since we know for a fact that the human brain is superior to every single computer that exists, doesn’t it make sense to modify computer architecture in a way that it functions more like our brain? This was what many scientists realised in the 1980s, starting with Carver Mead, an American scientist and engineer.

Fast forward to today​

Nowadays, almost all companies have dedicated teams working on neuromorphic computing. Groundbreaking research is being done in multiple research organisations and universities. It is safe to say that neuromorphic computing is gaining momentum and will continue to do so as various advancements are made.

What’s interesting to note is that although this is a specialised field with prerequisites from various topics, including solid-state physics, VLSI, neural networks, and computational neurobiology, undergraduate engineering students are extremely curious about this field.

At IIT Kanpur, Dr Shubham Sahay, Assistant Professor at the Department of Electrical Engineering, introduced a course on neuromorphic computing last year. Despite being a post-graduate level course, he saw great participation from undergrads as well.

“Throughout the course, they were very interactive. The huge B.Tech participation in my course bears testimony to the fact that undergrads are really interested in this topic. I believe that this (neuromorphic computing) could be introduced as one of the core courses in the UG curriculum in the future,” he says.

Getting it commercial​

Until recently, neuromorphic computing was a widely used term only in research and not in the commercial arena. However, as of January 18, 2022, BrainChip, a leading provider of ultra-low-power high-performance AI technology, commercialised its AKD1000 AIoT chip. Developers, system integrators, and engineers can now buy AKD1000-powered Mini PCIe boards and leverage them in their applications, especially those requiring on-edge computing, low power consumption, and high-performance AI.

“It’s meant as our entry-level product. We want to proliferate this into as many hands as we can and get people designing in the Akida environment,” says Rob Telson, Vice President of WorldWide Sales at BrainChip. Anil Mankar, Co-founder and Chief Development Officer of BrainChip, explains, “We are enabling system integrators to easily use neuromorphic AI in their applications. In India, if some system integrators want to manufacture the board locally, they can take the bill of materials from us (BrainChip) and manufacture it locally.”

The 5 sensor modalities (Source: BrainChip Inc.)

What’s fascinating about Akida is that it enables sensor nodes to compute without depending on the cloud. Further, BrainChip’s AI technology not only performs audio and video based learning but even focuses on other sensor modalities like taste, vibration, and smell. You can use it to make a sensor that performs wine tasting! Here is a link to their wine tasting demonstration:

Another major event that occurred this year was Mercedes implementing BrainChip's Akida technology in its Vision EQXX electric vehicle. This is definitely a big deal, since the Akida technology is being tried and tested for a smart automotive experience. All features that Akida provides, including facial recognition, keyword spotting, etc., consume extremely little power.

“This is where we get excited. You’ll see a lot of these functionalities in vehicles—recognition of voices, faces, and individuals in the vehicle. This allows the vehicles to have customisation and device personalisation according to the drivers or the passengers as well,” says Telson. These really are exciting times.

Neuromorphic hardware, neural networks, and AI​

The way neurons work is strikingly similar to an electrical process. Neurons communicate with each other via synapses. Whenever they receive input, they produce electrical signals called spikes (also called action potentials), an event known as neuron spiking. When this happens, chemicals called neurotransmitters are released into hundreds of synapses and activate the respective neurons. That is why this process is so fast.
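The spiking behaviour described above can be sketched with a toy leaky integrate-and-fire neuron, a standard textbook model (not BrainChip's actual circuit; the parameters here are illustrative):

```python
def lif_neuron(inputs, leak=0.9, threshold=1.0):
    """Toy leaky integrate-and-fire neuron: the membrane potential
    accumulates input, decays by `leak` each step, and emits a spike
    (1) only when it crosses `threshold`, then resets. The output is
    mostly zeros, so an event-driven chip spends energy only on the
    rare ones."""
    v, spikes = 0.0, []
    for x in inputs:
        v = v * leak + x
        if v >= threshold:
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return spikes

# Three weak inputs integrate toward threshold; the fourth pushes
# the potential over it and a single spike is emitted.
print(lif_neuron([0.3, 0.3, 0.3, 0.8, 0.0, 0.0]))  # [0, 0, 0, 1, 0, 0]
```

Notice that most time steps produce no output at all: this sparsity is exactly the property event-driven hardware exploits for its power savings.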

Akida MetaTF ML Framework (Source: MetaTF)

Artificial neural networks mimic the logic of the human brain, but on a regular computer. The thing is, regular computers work on the Von Neumann architecture, which is extremely different from the architecture of our brain and is very power-hungry. We may not be able to rely on CMOS logic and the Von Neumann architecture for much longer; we will eventually reach a limit to how far we can exploit silicon. We are nearing the end of Moore's Law, and there is a need for a better computing mechanism. Neuromorphic computing is the solution because neuromorphic hardware realises the structure of the brain in the solid state.

As we make progress in neuromorphic hardware, we will be able to deploy neural networks on it. Spiking Neural Network (SNN) is a type of artificial neural network that uses time in its model. It transmits information only when triggered—or, in other words, spiked. SNNs used along with neuromorphic chips will transform the way we compute, which is why they are so important for AI.

How to get started with SNNs​

Since the entire architecture of neuromorphic AI chips is different, it is only natural to expect the corresponding software framework to be different too, and for developers to need training in working with SNNs. However, that is not the case with MetaTF, a free software development environment that BrainChip launched in April 2021.

“We want to make our environment extremely simple to use. Having people learn a new development environment is not an efficient way to move forward,” says Telson. “We had over 4600 unique users start looking at and playing with MetaTF in 2021. There’s a community out there that wants to learn.”

India and the future of neuromorphic computing​

When asked about the scope of neuromorphic computing in India, Dr Sahay mentions, “As of now, the knowledge, dissemination, and expertise in this area is limited to the eminent institutes such as IITs and IISc, but with government initiatives such as India Semiconductor Mission (ISM) and NITI Ayog’s national strategy for artificial intelligence (#AIforall), this field would get a major boost. Also, with respect to opportunities in the industry, several MNCs have memory divisions in India—

Micron, Sandisk (WesternDigital), etc—that develop the memory elements which will be used for neuromorphic computing.” There’s a long way to go, but there is absolutely no lack of potential. More companies would eventually have their neuromorphic teams in India.

BrainChip Inc. is also building its university strategy to make sure students are being educated in this arena. Slowly, the research done in neuromorphic computing is making its way into the commercial world and academia. Someday, we might be able to improve our self-driving cars, create artificial skins and prosthetic limbs that can learn things about their surroundings! Consider your smart devices. All of them are dependent on the internet and the cloud. If equipped with a neuromorphic chip, these devices can compute on their own! This is just the start of the neuromorphic revolution.


The author, Aaryaa Padhyegurjar, is an Industry 4.0 enthusiast with a keen interest in innovation and research.
 
  • Like
  • Fire
  • Love
Reactions: 43 users

Twing

Akida’s gambit
Great find Fullmoonfever.
Thanks for sharing.
 
  • Like
Reactions: 10 users
Afterthought... by saturating the market this way, are we almost starting an Apple-style ecosystem of users to build on ;)

“It’s meant as our entry-level product. We want to proliferate this into as many hands as we can and get people designing in the Akida environment,” says Rob Telson, Vice President of World Wide Sales at BrainChip. Anil Mankar, Co-founder and Chief Development Officer of BrainChip, explains, “We are enabling system integrators to easily use neuromorphic AI in their applications. In India, if some system integrators want to manufacture the board locally, they can take the bill of materials from us (BrainChip) and manufacture it locally.”
 
  • Like
  • Fire
Reactions: 18 users

Bobbydog

Emerged
Ok, I've just received an email back from Samuel of Biotome, as I had been enquiring about how I could invest directly with Biotome and whether it would become public. Short answer: not public yet, but there are discussions. Long answer: they are continuing to raise funds (how much they have raised so far through the capital raising they can't disclose), but they are also making individual offerings and are happy to provide more info to anyone interested; min investment $25,100 AUD, round finishes 11 Feb. $100,000 provides 1% equity in the company. He has also asked me to let him know if anyone else in my network is interested so he can forward more details to them.
Samuel said- Thanks a lot for reaching out and showing interest in Biotome. Thanks also for your kind and positive words of encouragement! It is exciting for us to work with the Brainchip team - Akida certainly has a very impressive capacity for learning, and a very appealing cost and size profile.
If you would like to take part in the current capital raise, or would like me to pass this offer on to any of your contacts, please let me know by email, and I will let you know the next steps.

So if anyone has a spare $25k that they think is worthy of investment, let me know and I can pass on your details as per his instructions to me.

I am assuming then that if I invest the minimum ~$25k I will receive 0.25% of the company. So if my calculations are correct, if the company became worth $2 billion my stake would then be worth $5 million. Umm, sounds like a no brainer right? :)
Let me know your thoughts and interest and I will reply to Samuel, but I think I'm in :)
Would love some info MD please
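A quick sanity check of the equity maths under the quoted terms ($100,000 = 1% equity), assuming ownership scales linearly and ignoring dilution from later rounds (the function name is illustrative):

```python
def stake_value(investment, price_per_percent=100_000,
                future_valuation=2_000_000_000):
    """Back-of-envelope equity maths: $100,000 buys 1%, so ownership
    scales linearly with the amount invested. Returns the ownership
    percentage and what that stake would be worth at the given
    hypothetical future valuation."""
    ownership_pct = investment / price_per_percent   # in percent
    stake = future_valuation * ownership_pct / 100
    return ownership_pct, stake

pct, value = stake_value(25_000)
print(pct, value)  # a $25k investment buys 0.25%, worth $5,000,000 at a $2B valuation
```

Worth noting that any later funding rounds would dilute that percentage, so this is a best-case figure.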
 
  • Like
Reactions: 1 users
Article today from ZDNet with Mike Demler, Senior Analyst at The Linley Group.

Highlighted the BRN comments :)



The AI edge chip market is on fire, kindled by 'staggering' VC funding​

Dozens of startups continue to get tens of millions in venture funding to make chips for AI in mobile and other embedded computing uses. The race shows no sign of slowing down.
Tiernan Ray
Written by Tiernan Ray, Contributing Writer
on February 11, 2022 | Topic: Artificial Intelligence

Chips that perform AI inference on edge devices such as smartphones are a red-hot market, even years into the field's emergence, attracting more and more startups and more and more venture funding, according to a prominent chip analyst firm covering the field.

"There are more new startups continuing to come out, and continuing to try to differentiate," says Mike Demler, Senior Analyst with The Linley Group, which publishes the widely read Microprocessor Report, in an interview with ZDNet via phone.

Linley Group produces two conferences each year in Silicon Valley hosting numerous startups, the Spring and Fall Processor Forum, with an emphasis in recent years on those AI startups.

The most recent event, held in October both virtually and in person in Santa Clara, California, was packed with startups such as Flex Logix, Hailo Technologies, Roviero, BrainChip, Syntiant, Untether AI, Expedera, and Deep AI giving short talks about their chip designs.

Demler and team regularly assemble a research report titled the Guide to Processors for Deep Learning, the latest version of which is expected out this month. "I count more than 60 chip vendors in this latest edition," he told ZDNet.

Edge AI has become a blanket term that refers mostly to everything that is not in a data center, though it may include servers on the fringes of data centers. It ranges from smartphones to embedded devices that suck micro-watts of power using the TinyML framework for mobile AI from Google.

The middle part of that range, where chips consume from a few watts of power up to 75 watts, is an especially crowded part of the market, said Demler, usually in the form of a pluggable PCIe or M.2 card. (75 watts is the PCI-bus limit in devices.)

"PCIe cards are the hot segment of the market, for AI for industrial, for robotics, for traffic monitoring," he explained. "You've seen companies such as Blaize, FlexLogic -- lots of these companies are going after that segment."

But really low-power is also quite active. "I'd say the tinyML segment is just as hot. There we have chips running from a few milliwatts to even microwatts."

Most of the devices are dedicated to the "inference" stage of AI, where artificial intelligence makes predictions based on new data.
Inference happens after a neural network program has been trained, meaning that its tunable parameters have been developed fully enough to reliably form predictions and the program can be put into service.

The initial challenge for the startups, said Demler, is to actually get from a nice PowerPoint slide show to working silicon. Many start out with a simulation of their chip running on a field-programmable gate array, and then either move to selling a finished system-on-chip (SoC), or else licensing their design as synthesizable IP that can be incorporated into a customer's chip.

"We still see a lot of startups hedging their bets, or pursuing as many revenue models as they can," said Demler, "by first demo'ing on an FPGA and offering their core IP for licensing." Some startups also offer the FPGA-based version as a product."

With dozens of vendors in the market, even those that get to working silicon are challenged to show something that's meaningfully different.
"It's hard to come up with something that's truly different," said Demler. "I see these presentations, 'world's first,' or, 'world's best,' and I say, yeah, no, we've seen dozens."

Some companies began with such a different approach that they set themselves apart early, but have taken some time to bear fruit.

BrainChip Holdings, of Sydney, Australia, with offices in Laguna Hills, California, got a very early start in 2011 with a chip to handle spiking neural networks, the neuromorphic approach to AI that purports to more closely model how the human brain functions.

The company has over the years showed off how its technology can perform tasks such as using machine vision to identify poker chips on the casino floor.

"BrainChip has been doggedly pursuing this spiking architecture," said Demler. "It has a unique capability, it can truly learn on device," thus performing both training and inference.

BrainChip has in one sense come the farthest of any startup: it's publicly traded. Its stock is listed on the Australian Stock Exchange under the ticker "BRN," and last fall the company issued American Depository Shares to trade on the U.S. over-the-counter market, under the ticker "BCHPY." Those shares have since more than tripled in value.

BrainChip is just starting to produce revenue. The company in October came out with mini PCIe boards of its "Akida" processor, for x86 and Raspberry Pi, and last month announced new PCIe boards for $499. The company in the December quarter had revenue of U.S.$1.1 million, up from $100,000 in the prior quarter. Total revenue for the year was $2.5 million, with an operating loss of $14 million.


Some other exotic approaches have proved hard to deliver in practice. Chip startup Mythic, founded in 2012 and based in Austin, Texas, has been pursuing the novel route of making some of its circuitry use analog chip technology, where instead of processing ones and zeros, it computes via manipulation of a real-valued wave form of an electrical signal.

"Mythic has generated a few chips but no design wins," Demler observed."Everyone agrees, theoretically, analog should have a power efficiency advantage, but getting there in something commercially variable is going to be much more difficult."

Another startup presenting at the Processor Conference, Syntiant, started out with an analog approach but decided analog didn't provide sufficient power advantages and took longer to bring to market, noted Demler.

Syntiant of Irvine, California, founded in 2017, has focused on very simple object recognition that can operate with low power on nothing more than a feature phone or a hearable.

"On a feature phone, you don't want an apps processor, so the Syntiant solution is perfect," observed Demler.

Regardless of the success of any one startup, the utility of special circuitry means that AI acceleration will endure as a category of chip technology, said Demler.

"AI is becoming so ubiquitous in so many fields, including automotive, embedded processing, the IoT, mobile, PCs, cloud, etc., that including a special-purpose accelerator will become commonplace, just like GPUs are for graphics."

Nevertheless, some tasks will be more efficient to run on a general-purpose CPU, DSP, or GPU, said Demler. That is why Intel and Nvidia and others are amplifying their architectures with special instructions, such as for vector handling.

Different approaches will continue to be explored as long as a venture capital market awash in cash lets a thousand flowers bloom.
"There's still so much VC money out there, I'm astounded by the amount these companies continue to get," said Demler.

Demler notes giant funding rounds for Sima.ai of San Jose, California, founded in 2018, which is developing what it calls an "MLSoC" focused on reducing power consumption. The company received $80 million in its Series B funding round.

Another one is Hailo Technologies of Tel Aviv, founded in 2017, which has raised $320.5 million, according to FactSet, including $100 million in its most recent round, and is supposedly valued at a billion dollars.

"The figures coming out of China, if true, are even more staggering," said Demler. Funding looks set to continue for the time being, he said. "Until the VC community decides there's something else to invest in, you're going to see these companies popping up everywhere."

At some point, a shake-out will happen, but when that day may come is not clear.

"Some of them have to go away eventually," mused Demler. "Whether it's 3 years or 5 years from now, we'll see much fewer companies in this space."

The next conference event Demler and colleagues will host is late April, the Spring Processor Forum, at the Hyatt Regency Hotel in Santa Clara, but with live-streaming for those who can't make it in person.
 
  • Like
  • Fire
Reactions: 54 users
We know about the release but opinion piece couple days ago from






BrainChip to start 2022 by shipping edge AI chips in volume​


Akida PCIe Board - Brainchip
BrainChip Holdings Ltd. is gaining recognition for its chips in the growing market for edge AI applications, having made a debut in a new all-electric Mercedes-Benz concept car in January. Recently, BrainChip started taking orders for two development kits integrated with the Akida AKD1000 edge AI processor for internal testing and validation.
Last month, the manufacturer announced the beginning of the full commercialization of its Akida AKD1000 AI chip through the availability of mini PCIe boards. BrainChip said its solution is a viable product to unlock increased functionality for Smart City applications as well as digital consumer health and smart home devices. As the market for customized hardware has grown, the manufacturer pledged to offer the full PCIe board design layout and bill of materials to system integrators and developers, enabling them to design custom boards and deploy the Akida AKD1000 chip as an embedded accelerator or co-processor.

The Akida AKD1000 edge AI processor provides OEMs and car manufacturers with a cost-efficient solution for real-time in-vehicle preventative care. The edge AI processor is capable of early detection mechanisms through real-time analysis of incoming sensor data. Because of real-time edge AI capabilities, the processor chip reduces the existing problems with privacy, internet dropout, and latency and bandwidth constraints.

Akida PCIe Board
Image Source: BrainChip e-Store​

“We have been working on developing our Akida technology for more than a decade and with the full commercial availability of our AKD1000, we are ready to fully execute on our vision,” said Sean Hehir, BrainChip CEO. “Other technologies are simply not capable of the autonomous, incremental learning at an ultra-low power consumption that BrainChip’s solutions can provide. Getting these chips into as many hands as possible is how the next generation of AI becomes reality.”
BrainChip said the PCIe boards are immediately available for pre-orders on the BrainChip website, starting at $499.

Making headway in 2022​

BrainChip revamped its leadership in 2021, appointing Sean Hehir as the new CEO and Jerome Nadel as CMO to lead product marketing as the company moves toward full commercialization of its Akida neuromorphic computing platforms. BrainChip said 2021 was its "most successful year," with several technological advancements and marketing upgrades. Among the milestones achieved, the Akida AKD1000 is the most successful production chip BrainChip has designed. Additionally, the company introduced MetaTF, a machine learning framework that works within TensorFlow.
In a January regulatory filing (BrainChip is publicly traded on the ASX), the company revealed it reached $1m in revenue for 4Q 2021. The company has also been granted its ninth patent.
 
  • Like
Reactions: 26 users