BRN Discussion Ongoing

Sirod69

bavarian girl ;-)
it could be green on Monday

0,787 EUR -0,38 % -0,003 EUR

Letzter Kurs 22:58:41 · L&S Exchange
 
  • Like
  • Fire
  • Love
Reactions: 10 users
D

Deleted member 118

Guest
it could be green on Monday

0,787 EUR -0,38 % -0,003 EUR

Letzter Kurs 22:58:41 · L&S Exchange
View attachment 12263


Nasdaq is down quite a bit, so it might struggle to go green; likely profit-taking after such a good week there.
 
Last edited by a moderator:
  • Like
Reactions: 5 users

MDhere

Regular
Very interesting Tata :)



And this can go with it too

 
Last edited:
  • Like
  • Fire
  • Wow
Reactions: 14 users

TopCat

Regular
Very interesting Tata :)



And this can go with it too

I’ve come across a few articles now referring to digital twin technology. I just don’t understand what it is though 🤔
 
  • Like
  • Fire
Reactions: 5 users

equanimous

Norse clairvoyant shapeshifter goddess
 
  • Like
  • Love
  • Fire
Reactions: 48 users

equanimous

Norse clairvoyant shapeshifter goddess
BRAINCHIP SEES ITS NEUROMORPHIC PROCESSOR NEXT TO EVERY SENSOR IN A CAR
 
  • Like
  • Fire
  • Love
Reactions: 43 users

Pandaxxx

Regular
"
.....The idea is to minimize the size and complexity of data packets sent to AI accelerators in automotive head units, while lowering latency and minimizing energy requirements. With a potential for 70 Akida chips or Akida–enabled sensors in each vehicle, Nadel said each one will be a “low–cost part that will play a humble role,” noting that the company needs to be mindful of the bill of materials for all these sensors.
....
"
copy from user investmentstratege 1980
www.wallstreet-online.de
thanks for posting!
date 07.22.2022
Perhaps one of the best articles I’ve read about Brainchip.

Is EE widely read?
 
  • Like
  • Fire
  • Love
Reactions: 29 users

MDhere

Regular
I’ve come across a few articles now referring to digital twin technology. I just don’t understand what it is though 🤔
Not sure exactly, but my thoughts are it's much like a digital memory imprint, like one-shot/two-shot learning and so on: a twin image to then find and record data of the same twin, be it a person/fruit/mechanical method and so on. Maybe just a fancy name for neural network function. My two bob. And I only know two Bobs :ROFLMAO:
 
  • Like
  • Haha
Reactions: 5 users

equanimous

Norse clairvoyant shapeshifter goddess
Perhaps one of the best articles I’ve read about Brainchip.

Is EE widely read?

EE Times Group's EE Times and EETimes.com Named One of the Most Powerful Business-to-Business Advertising Venues by BtoB Magazine​

More than 530,000 print and digital editions are delivered around the world, while the website averages over 4.5 million page views, 1.6 million visits and 960,000 unique visitors per month.
 
  • Like
  • Fire
  • Love
Reactions: 54 users

TopCat

Regular

Would it be possible, if enough cars started using this technology, that they could communicate with each other once they got within say 200 m of each other, without going via the cloud? Swapping details of speed, direction, etc.
 
  • Like
  • Fire
  • Thinking
Reactions: 13 users
It's been fun. Adios people 👋
 
  • Sad
  • Thinking
  • Wow
Reactions: 16 users

equanimous

Norse clairvoyant shapeshifter goddess
Would it be possible, if enough cars started using this technology, that they could communicate with each other once they got within say 200 m of each other, without going via the cloud? Swapping details of speed, direction, etc.

Spiking Neural Networks -- Part III: Neuromorphic Communications​

Show affiliations

Abstract​

Synergies between wireless communications and artificial intelligence are increasingly motivating research at the intersection of the two fields. On the one hand, the presence of more and more wirelessly connected devices, each with its own data, is driving efforts to export advances in machine learning (ML) from high performance computing facilities, where information is stored and processed in a single location, to distributed, privacy-minded, processing at the end user. On the other hand, ML can address algorithm and model deficits in the optimization of communication protocols. However, implementing ML models for learning and inference on battery-powered devices that are connected via bandwidth-constrained channels remains challenging. This paper explores two ways in which Spiking Neural Networks (SNNs) can help address these open problems. First, we discuss federated learning for the distributed training of SNNs, and then describe the integration of neuromorphic sensing, SNNs, and impulse radio technologies for low-power remote inference.

Next-Generation Wireless Networks (NGWN) for Autonomous Intelligent Communications​

7. Conclusion​

In this research paper, we have provided a detailed and systematic survey on 6G wireless communication. This survey is carried out in such a way that initially, the literature on 6G and different practical research initiatives taken by different organizations are presented, followed by the 6G vision, specifications, challenges, different candidate technologies, and future research directions. The vision of 6G is based on requirements that 5G is unable to fulfil and those services that 5G is unable to provide. To this end, the vision of 6G is categorized into four fundamental services: “Intelligent Connectivity,” “Deep Connectivity,” “Holographic Connectivity,” and “Ubiquitous Connectivity.” Based on the discussed 6G visions, the achievable 6G goals are specified, which are (a) latency minimization, (b) global connectivity, (c) massive connectivity, (d) enormously high data rates, (e) energy efficiency of network devices, (f) connection reliability, and (g) machine learning-based connected intelligence. To achieve these specified 6G goals, many potential technologies are proposed, which include AI, FSO backhaul networks, blockchain, UAVs, 3D networking, DNS, sensing-based communication, big data analytics, and some new spectrum-based technologies, e.g., the terahertz spectrum and Optical Wireless Communications. To realize the vision of 6G and implement the potential candidate technologies to achieve the specified 6G goals, a lot of challenges will be faced which require intense research. These challenges include Peak Rate-Terabit, higher energy efficiency, connection everywhere and anytime, self-aggregating networks, high propagation and atmospheric absorption of THz, complexity in resource management for 3D networking, heterogeneous hardware constraints, autonomous wireless systems, modelling of sub-mmWave (THz) frequencies, and spectrum and interference management.
 
  • Like
  • Fire
  • Love
Reactions: 10 users
I’ve come across a few articles now referring to digital twin technology. I just don’t understand what it is though 🤔

Hey @TopCat I posted this back in May

* the text in the link does not match the content, but click the link - it works

I know Siemens are hot on digital twins so would be a great entry to their market for us

 
Last edited:
  • Like
  • Fire
Reactions: 7 users

Foxdog

Regular
  • Like
Reactions: 2 users
F

Filobeddo

Guest
That thread by @Neuromorphia - Akida Licensees, Partners, Speculative Theories and Supporting Links

Wow!! 🤩 Just brilliant!! Love your work Neuro (& on a Saturday morning when most of us are barely functioning 😄)

Check it out
 
  • Like
  • Fire
Reactions: 25 users
F

Filobeddo

Guest
Last edited by a moderator:
  • Like
  • Haha
  • Fire
Reactions: 5 users

Makeme 2020

Regular
"
.....The idea is to minimize the size and complexity of data packets sent to AI accelerators in automotive head units, while lowering latency and minimizing energy requirements. With a potential for 70 Akida chips or Akida–enabled sensors in each vehicle, Nadel said each one will be a “low–cost part that will play a humble role,” noting that the company needs to be mindful of the bill of materials for all these sensors.
....
"
copy from user investmentstratege 1980
www.wallstreet-online.de
thanks for posting!
date 07.22.2022


DESIGNLINES
AI & BIG DATA DESIGNLINE

Cars That Think Like You​

By Sally Ward-Foxton 07.22.2022



Car makers are checking out neuromorphic technology to implement AI–powered features such as keyword spotting, driver attention, and passenger behavior monitoring.
Imitating biological brain processes is alluring because it promises to enable advanced features without adding significant power draw at a time when vehicles are trending towards battery–powered operation. Neuromorphic computing and sensing also promise benefits like extremely low latency, enabling real–time decision making in some cases. This combination of latency and power efficiency is extremely attractive.
Here’s the lowdown on how the technology works and a hint on how this might appear in the cars of the future.


With the rise of artificial intelligence, technologies claiming to be “brain-inspired” are abundant. We examine what neuromorphic means today in our Neuromorphic Computing Special Project.

SPIKING NETWORKS

The truth is there are still some things about how the human brain works that we just don’t understand. However, cutting–edge research suggests that neurons communicate with each other by sending electrical signals known as spikes, and that the sequences and timing of spikes are the crucial factors, rather than their magnitude. The mathematical model of how the neuron responds to these spikes is still being worked out. But many scientists agree that if multiple spikes arrive at the neuron from its neighbors at the same time (or in very quick succession), that would mean the information represented by those spikes is correlated, therefore causing the neuron to fire off a spike to its neighbor.

This is in contrast to artificial neural networks based on deep learning (mainstream AI today) where information propagates through the network at a regular pace; that is, the information coming into each neuron is represented as numerical values and is not based on timing.
Making artificial systems based on spiking isn’t easy. Aside from the fact we don’t know exactly how the neuron works, there is also no agreement on the best way to train spiking networks. Backpropagation — the algorithm that makes training deep learning algorithms possible today — requires computation of derivatives, which is not possible for spikes. Some people approximate derivatives of spikes in order to use backpropagation (like SynSense) and some use another technique called spike timing dependent plasticity (STDP), which is closer to how biological brains function. STDP, however, is less mature as a technology (BrainChip uses this method for one–shot learning at the edge). There is also the possibility of taking deep learning CNNs (convolutional neural networks), trained by backpropagation in the normal way, and converting them to run in the spiking domain (another technique used by BrainChip).
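The integrate-and-fire behaviour the article describes can be sketched in a few lines of Python. This is a toy leaky integrate-and-fire neuron, purely illustrative: the weight, threshold, and leak constants are made up, and real neuromorphic hardware implements this very differently.

```python
# Toy leaky integrate-and-fire (LIF) neuron illustrating the spiking
# behaviour described above. All constants (weight, threshold, leak)
# are made-up values for illustration, not any vendor's parameters.

def lif_neuron(input_spikes, weight=1.0, threshold=2.5, leak=0.9):
    """Accumulate weighted input spikes; emit a 1-bit spike on threshold."""
    potential = 0.0
    output = []
    for s in input_spikes:
        potential = potential * leak + weight * s  # leak, then integrate
        if potential >= threshold:
            output.append(1)   # inputs arrived close together: fire
            potential = 0.0    # reset after spiking
        else:
            output.append(0)
    return output

# Three spikes in quick succession cross the threshold and cause a fire;
# the same number of spikes spread out leak away and never do.
print(lif_neuron([1, 1, 1, 0, 0, 1, 0, 0]))  # → [0, 0, 1, 0, 0, 0, 0, 0]
print(lif_neuron([1, 0, 0, 1, 0, 0, 1, 0]))  # → [0, 0, 0, 0, 0, 0, 0, 0]
```

Note how timing, not magnitude, decides the output: the same total input produces a spike only when it arrives faster than the leak can dissipate it, which is the correlation-detection idea from the paragraph above.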

SYNSENSE SPECK

SynSense is working with BMW to advance the integration of neuromorphic chips into smart cockpits and explore related fields together. BMW will be evaluating SynSense’s Speck SoC, which combines SynSense’s neuromorphic vision processor with a 128 x 128–pixel event–based camera from Inivation. It can be used to capture real–time visual information, recognize and detect objects, and perform other vision–based detection and interaction functions.
“When BMW replaces RGB cameras with Speck modules for vision sensing, they can replace not just the sensor but also a significant chunk of GPU or CPU computation required to process standard RGB vision streams,” Dylan Muir, VP global research operations at SynSense, told EE Times.
Using an event–based camera provides higher dynamic range than standard cameras, beneficial for the extreme range of lighting conditions experienced inside and outside the car.
BMW will explore neuromorphic technology for car applications, including driver attention and passenger behavior monitoring with the Speck module.
“We will explore additional applications both inside and outside the vehicle in coming months,” Muir said.
SynSense’s neuromorphic vision processor has a fully asynchronous digital architecture. Each neuron uses integer logic with 8–bit synaptic weights, a 16–bit neuron state, 16–bit threshold, and single–bit input and output spikes. The neuron uses a simple integrate–and–fire model, combining the input spikes with the neuron’s synaptic weights until the threshold is reached, when the neuron fires a simple one–bit spike. Overall, the design is a balance between complexity and computational efficiency, Muir said.
SynSense model of the neuron SynSense’s electronic neuron is based on the integrate–and–fire model. (Source: SynSense)
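The arithmetic Muir describes (signed 8-bit weights, a 16-bit state, a 16-bit threshold, 1-bit spikes, simple integrate-and-fire) can be sketched as follows. The subtract-on-spike reset and all example numbers are assumptions for illustration; the real Speck core is fully asynchronous hardware, not sequential Python.

```python
# Illustrative integer integrate-and-fire neuron using the bit widths
# described in the article: signed 8-bit weights, 16-bit state and
# threshold, 1-bit input/output spikes. The reset behaviour and example
# numbers are assumptions; the real Speck core is asynchronous hardware.

def int_if_neuron(events, threshold=100):
    """events: list of (spike, weight) pairs; spike is 0/1, weight a
    signed 8-bit integer. Returns the list of 1-bit output spikes."""
    state = 0
    out = []
    for spike, w in events:
        assert spike in (0, 1) and -128 <= w <= 127
        state += spike * w                      # integrate weighted spike
        state = max(-32768, min(32767, state))  # clamp to 16-bit state
        if state >= threshold:
            out.append(1)                       # fire a one-bit spike
            state -= threshold                  # assumed reset behaviour
        else:
            out.append(0)
    return out

print(int_if_neuron([(1, 60), (1, 50), (0, 30), (1, 120)]))  # → [0, 1, 0, 1]
```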
SynSense’s digital chip is designed for processing event–based CNNs, with each layer processed by a different core. Cores operate asynchronously and independently; the entire processing pipeline is event driven.
“Our Speck modules operate in real–time and with low latency,” Muir said. “We can manage effective inference rates of >20Hz at <5mW power consumption. This is much faster than what would be possible with traditional low–power compute on standard RGB vision streams.”
While SynSense and BMW will be exploring neuromorphic car use cases in the “smart cockpit” initially, there is potential for other automotive applications, too.
“To begin with we will explore non–safety–critical use cases,” Muir said. “We are planning future versions of Speck with higher resolution, as well as revisions of our DynapCNN vision processor that will interface with high–resolution sensors. We plan that these future technologies will support advanced automotive applications such as autonomous driving, emergency braking, etc.”
SynSense Speck Module SynSense and Inivation Speck module — an event–based camera module which incorporates sensor and processor. (Source: SynSense)

BRAINCHIP AKIDA

The Mercedes EQXX concept car, which debuted at CES 2022, features BrainChip’s Akida neuromorphic processor performing in–cabin keyword spotting. Promoted as “the most efficient Mercedes–Benz ever built,” the car takes advantage of neuromorphic technology to use less power than deep learning–powered keyword spotting systems. This is crucial for a car that is supposed to deliver a 620–mile range (about 1,000 km) on a single battery charge, 167 miles further than Mercedes’ flagship electric vehicle, the EQS.
Mercedes said at the time that BrainChip’s solution was 5 to 10× more efficient than conventional voice control when spotting the wake word “Hey Mercedes”.
Neuromorphic Car Mercedes EQXX Mercedes’ EQXX concept EV has a power efficiency of more than 6.2 miles per kWh, almost double that of the EQS. (Source: Mercedes)

“Although neuromorphic computing is still in its infancy, systems like these will be available on the market in just a few years,” according to Mercedes. “When applied at scale throughout a vehicle, they have the potential to radically reduce the energy needed to run the latest AI technologies.”
“[Mercedes is] looking at big issues like battery management and transmission, but every milliwatt counts, and the context of [BrainChip’s] inclusion was that even the most basic inference, like spotting a keyword, is important when you consider the power envelope,” Jerome Nadel, chief marketing officer at BrainChip, told EE Times.
Nadel said that a typical car in 2022 may have as many as 70 different sensors. For in–cabin applications, these sensors may be enabling facial detection, gaze estimation, emotion classification, and more.
“From a systems architecture point of view, we can do it in a 1:1 way, there’s a sensor that will do a level of pre–processing, and then the data will be forwarded,” he said. “There would be AI inference close to the sensor and… it would pass the inference meta data forward and not the full array of data from the sensor.”
The idea is to minimize the size and complexity of data packets sent to AI accelerators in automotive head units, while lowering latency and minimizing energy requirements. With a potential for 70 Akida chips or Akida–enabled sensors in each vehicle, Nadel said each one will be a “low–cost part that will play a humble role,” noting that the company needs to be mindful of the bill of materials for all these sensors.
BrainChip Akida neuromorphic processor in car system BrainChip sees its neuromorphic processor next to every sensor in a car. (Source: BrainChip)
Looking further into the future, Nadel said neuromorphic processing will find its way into ADAS and autonomous vehicle systems, too. There is potential to reduce the need for other types of power–hungry AI accelerators.
“If every sensor had a limited, say, one or two node implementation of Akida, it would do the sufficient inference and the data that would be passed around would be cut by an order of magnitude, because it would be the inference metadata… that would have an impact on the horsepower that you need in the server in the trunk,” he said.
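A back-of-the-envelope calculation shows why forwarding only inference metadata can cut the data moved around the vehicle by an order of magnitude or more. Every number below (frame size, frame rate, detection count, field sizes) is an illustrative assumption, not a BrainChip figure.

```python
# Back-of-the-envelope: raw sensor stream vs. inference metadata.
# All numbers are illustrative assumptions, not BrainChip figures.

raw_frame_bytes = 1280 * 720 * 2      # one 720p frame at 16 bits/pixel
fps = 30
raw_rate = raw_frame_bytes * fps      # bytes/s shipped without near-sensor AI

# After near-sensor inference, only detections are forwarded:
# e.g. (class id, confidence, x, y) as four 32-bit fields each.
detections_per_frame = 5
bytes_per_detection = 16
meta_rate = detections_per_frame * bytes_per_detection * fps

print(f"raw:       {raw_rate / 1e6:.1f} MB/s")   # 55.3 MB/s
print(f"metadata:  {meta_rate / 1e3:.1f} kB/s")  # 2.4 kB/s
print(f"reduction: ~{raw_rate // meta_rate}x")
```

With these assumed numbers the reduction is several orders of magnitude per camera, which is the "horsepower in the server in the trunk" saving Nadel is pointing at.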
BrainChip’s Akida chip accelerates spiking neural networks (SNNs) and convolutional neural networks (via conversion to SNNs). It is not tailored for any particular use case or sensor, so it can work with vision sensing for face recognition or person detection, or other audio applications such as speaker ID. BrainChip has also demonstrated Akida with smell and taste sensors, though it’s more difficult to imagine how these sensors might be used in automotive (smelling and tasting for air pollution or fuel quality, perhaps).
Akida is set up to process SNNs or deep learning CNNs that have been converted to the spiking domain. Unlike native spiking networks, converted CNNs retain some information in spike magnitude, so 2– or 4–bit computation may be required. This approach, however, allows exploitation of CNNs’ properties, including their ability to extract features from large datasets. Both types of networks can be updated at the edge using STDP — in the Mercedes example, that might mean retraining the network to spot more or different keywords after deployment.
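The 2- or 4-bit computation mentioned above is essentially low-bit quantization of activations: a continuous activation becomes a small integer code carried as spike magnitude. A minimal sketch of 4-bit activation quantization, not Akida's actual CNN-to-SNN conversion pipeline:

```python
# Minimal sketch of the low-bit activation quantization implied by
# "2- or 4-bit computation". Illustrative only; this is not Akida's
# actual CNN-to-SNN conversion pipeline.

def quantize_activation(x, bits=4, max_val=1.0):
    """Map a clipped activation in [0, max_val] to an integer code."""
    levels = (1 << bits) - 1           # 15 levels for 4 bits
    x = min(max(x, 0.0), max_val)      # clip to the representable range
    return round(x / max_val * levels)

def dequantize(code, bits=4, max_val=1.0):
    """Recover the approximate activation the code represents."""
    return code / ((1 << bits) - 1) * max_val

code = quantize_activation(0.42)
print(code, dequantize(code))  # 6 0.4
```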
Neuromorphic Car Mercedes EQXX interior Mercedes used BrainChip’s Akida processor to listen for the keyword “Hey Mercedes” in the cabin of its EQXX concept EV. (Source: Mercedes)
Mercedes has confirmed that “many innovations”, including “specific components and technologies” from the EQXX concept car, will make it into production vehicles, reports Autocar. There is no word yet on whether new models of Mercedes will feature artificial brains.
RELATED TOPICS: AI, AI AND BIG DATA, AUTOMOTIVE, AUTOMOTIVE ARCHITECTURE, NEUROMORPHIC, NEUROMORPHIC ENGINEERING, NEUROMORPHIC RESEARCH, NEUROMORPHIC TECHNOLOGY, NEUROMORPHIC VISION, SENSOR
 
  • Like
  • Love
  • Fire
Reactions: 29 users

Shadow59

Regular
Thanks for the reply Learning,

I’m not trying to be negative. I’m up 100 percent and could pay my house off tomorrow if I wanted to, only I think Brainchip will increase significantly more than my mortgage will. Hence I’ll leave my money sitting where it is! I won’t be selling anything until at least 2025, when Valeo have had massive sales with their Lidar product and Nanose has been released. Then I might buy myself a new car; at the moment I’m thinking a Kia Stinger, if they’re still in existence (they’re a fun car to drive and remind me of my first car, which was a Datsun 260Z).

View attachment 12255


View attachment 12256

This photo isn’t mine but pretty well the same thing, was a beauty and I regret selling it! I actually had a matching motorbike the same colour, a Virago 535 which was great fun as well! I thought I was the coolest thing since sliced bread, but that was back in the early 90’s when I owned them!

View attachment 12257

Sorry, I digress; came back from a mate's with a couple of beers under the belt and reminisced.

It was more a discussion point, as I have been reading of everyone getting excited with the SP rise, which is great; however, even though I think it could be justified because there is a lot known here, I’m suspicious of how quickly it has risen. Even though there has been some great news shared on this forum, I’m not convinced the general market is aware of the totality of it all. I put a fair bit of time into researching where Brainchip is at, and I’m not sure the average investor puts the same amount of daily time or effort into it; hence they may have knowledge gaps or not be as up to date as those on this forum.

You don’t need to sell the company to me; both my children and half a dozen work colleagues have bought in, as I’ve been promoting the company to anyone interested; I’d love for all my colleagues to share in the wealth that will be created in the future. Just not sure where we’ll be in a few weeks’ time, and trying to prepare people in case the 4c isn’t as expected and there is a manipulated price drop. Hope for the best but prepare for the worst! The forecast was “Lumpy.”

We’ll see what happens next week when the 4c comes out and how the SP goes. It’s easy in hindsight to comment on the share price but I don’t mind having a go at casting an opinion and seeing how accurate I am, as in reality what happens over the next few months won’t matter in relation to the next few years. I think the future looks great regardless of this upcoming 4c. I have confidence in the eco-system being developed, with strong fundamentals that enable significant growth in the future.

I’ll see how much egg is on my face in a week or so: I’m sure someone will remind me if I’m wrong. :) And I won’t be disappointed if that’s the case.

Cheers and have a great weekend!

Hi SG.
I think we have similar tastes in cars. My first sports car was a gold 260Z; this was followed by a 300ZX, then an MX5. I looked at a Stinger, but they are BIG and wouldn't fit in my garage with my gym. Ended up buying another special edition manual MX5, which was a major improvement on the old one and is great fun to drive. Hence I'm not looking forward to when everything is autonomous.
Apparently the next MX5 will be hybrid and manual and faster. That could be my next car.
 

Attachments

  • 20220218_134511.jpg (8.2 MB)
  • Like
  • Love
  • Fire
Reactions: 12 users

Xhosa12345

Regular
  • Like
  • Love
Reactions: 6 users