BRN Discussion Ongoing

Dang Son

Regular
Interesting to read the advertising company's blurb about our new look.
Whilst I freely acknowledge that I am not the target market, and understand from what I have read here that engagement is growing, I do find myself somewhat at odds with their stated strategy.
They say they inherited a brand strong on logic but short on magic.
Whilst I can appreciate the allure of our tech "hey prestoing" away our clients' issues, I think this approach is better suited to a commercial retail customer who just wants the doodad, whatever it is, to work: preferably efficiently, but certainly effectively, economically, easily, and straight out of the box.
However, it seems to me that the clients we are now seeking to win are people who appreciate and want the "logic" of our offering front and centre, presented in a clear and undeniable way that compels them to proceed with further investigation and engagement, leading, where appropriate, to adoption.
Perhaps my age has somewhat jaded my sensibilities, and I am not advocating a return to generic robots, but I would like a bit more pizzazz than the odd splash of orange here and there.
I know it's all very grown up and that we are a serious company, but I would love to see a bit more flair and originality, and some element to tie it all together.
Perhaps the reincorporation of our synapse symbol.
I saw that as a lovely, non-linguistic, universally meaningful sign that helped make our brand memorable and somewhat more approachable.
It exhibited simplicity with the promise of scalability, from which one can easily appreciate that complex problems may be resolved by incorporating a scaled thinking device in the "right" place. Being a representation of a natural phenomenon also lends it a degree of intuitive recognition and credibility.
Anyway, just my Sunday afternoon musings, but I say "Bring back the Synapse" before someone else appropriates it. 🤣
 
  • Like
  • Fire
  • Haha
Reactions: 8 users

buena suerte :-)

BOB Bank of Brainchip

Well ... my thinking is that we still have a VERY close relationship and a HUGE future with MB (no doubting that at all). I know this article is from April of this year, but it's just a reminder of what was announced back in January!! And on that note, I think we were extremely lucky to have MB come out and publicly state that Akida is installed in the concept car EQXX from Mercedes ... you couldn't get a bigger endorsement, or more exposure for AKIDA, from a world automobile leader than that! If MB were not 100% happy they would not have released that statement! With MB we know for a fact that we are a partner, and maybe they are already testing the new Akida 2.0, as maybe many other big players are too! So who knows, news/updates from our newest member of the Brain Fam could be released SOON!! Looking forward to the 4C any time now ... most probably next Thurs/Fri, 27th/28th?

Have a great week Chippers :)

MERCEDES PARTNER BRAINCHIP WITH THE NEXT COLLABORATION

Probably one of the most exciting listed chip newcomers is BrainChip. The Australian technology company is working on solutions in the field of artificial intelligence and machine learning. The Akida chip, the company's current flagship product, is a neuromorphic processor: its design is said to closely mimic the workings of the brain, making it particularly energy-efficient. BrainChip sees applications in autonomous driving, IoT devices, robotics, medical diagnostics and security technology. The Australians caused a sensation at the turn of the year when it became known that Akida is installed in the concept car EQXX from Mercedes and, among other things, makes the "Hey, Mercedes" voice control up to ten times more efficient than conventional voice control. There were also reports of interest on the part of the US Air Force. As a result, the share price exploded to EUR 1.20. In the context of the general weakness on the stock markets, the price returned to EUR 0.60. Operationally, the company continues to step on the gas.

Most recently, BrainChip reported a cooperation with nViso SA. The Swiss company says its technology is the only one capable of analyzing human behavior signals such as facial expressions, emotions, identity, head posture, gaze, gestures, activities and objects with which users interact. In robotics and vehicle applications, human behavior analysis detects the user's emotional state to provide personalized, adaptive, interactive and safe devices and systems. This technology will be integrated into BrainChip's Akida processors. The partners see initial applications in the area of robots and surveillance systems. BrainChip CEO Sean Hehir said, "nViso's AI systems for analyzing human behavior offer fascinating possibilities in homes, cars, buildings, hospitals and more. We are excited to support these capabilities with BrainChip's processing power and energy efficiency." This may not have been the last interesting news from BrainChip this year.
 
  • Like
  • Love
  • Fire
Reactions: 27 users
My personal theory, and that is all it is, is that the first model released with BrainChip involved will be a high-range electric vehicle showcasing Mercedes-Benz's technology leap ahead of the competition.

The target range will be 800 kilometres plus.

It will be the car that thinks like you.

It will be the car that learns and adapts to you.

It will be the car taking full advantage of the JAST learning rules owned by…?…, which do one-shot, few-shot and incremental learning securely on chip, without a connection.

There is only one chip and one company in the WORLD that offers this technology.

My opinion only DYOR
FF

AKIDA BALLISTA
Not sure why so many sad face emojis?

Crack the champagne, team! We have become a market product!!!

Hey Mercedes: very powerful voice assistant
The "Hey Mercedes" voice assistant is highly capable of dialogue and learning by activating online services in the Mercedes me App. Moreover, certain actions can be performed even without the activation keyword "Hey Mercedes". These include taking a telephone call. "Hey Mercedes" also explains vehicle functions, and, for example, can help when asked how to connect a smartphone via Bluetooth or where the first-aid kit can be found.
If compatible home technology and household devices are present, they can also be networked with the vehicle thanks to the smart home function and controlled from the vehicle by voice. "Hey Mercedes" can also detect occupants audibly. Once the individual voice characteristics have been learned, this can be used to access personal data and functions by activating a profile.


This is from the media PDF that has been sent out.
 
  • Like
  • Fire
  • Love
Reactions: 39 users

Dhm

Regular
Sony, together with Prophesee, has produced an event-based vision sensor, and there is a strong likelihood that Akida is embedded within the product. This is not necessarily new to the 1000 Eyes, but it is good to review and remind ourselves how good it is to be shareholders.

It is, however, over a year old.

Sony says in their press release:
https://www.sony-semicon.com/en/new...+-+IMX636-637+launch&utm_id=IMX636-637+launch

[Screenshot of the Sony press release]


Lots of BrainChip-style jargon mentioned in this video....

 
  • Like
  • Love
  • Fire
Reactions: 30 users

Diogenese

Top 20
A little more background.

This looks like it could be him, based on the profile pic and the Rigpa article pic.



Apparently it's an SNN.

He was working with Steve Furber at Manchester at one point. I'm sure I've seen his name around before.

He also had funding from the US DTRA.




This new brain-inspired chip is 23 times faster and needs 28 times less energy

It was 2009 when Mike Huang first knew he wanted to become a chip design engineer, as he developed a 16-bit microprocessor from scratch with his classmates in his university lab.

Soon, Huang grew interested in understanding how the mind works — and “as an engineer, to understand and prove how the mind works is to effectively reverse engineer the brain,” he says. He reached out to Professor Steve Furber at the University of Manchester, who was designing a supercomputer with 1 million ARM cores that draws inspiration from the communication architecture of the brain, and started working closely with him.

A decade later, Huang has just developed a brain-inspired microchip that could process large amounts of data faster and with lower power, improving performance and energy efficiency for AI applications.

“The technology itself closely mimics how biological neural networks work compared to conventional AI solutions,” Huang says.

Huang first created the chip as part of his PhD in Neuromorphic Computing at the University of Edinburgh School of Engineering, funded by UK-based radiation detection company Kromek and the US Defense Threat Reduction Agency (DTRA). He then launched a startup, Rigpa, and joined Cohort IV at Conception X to learn how to commercialise his technology.

The problem Huang originally set out to solve was to reduce power consumption and inference times compared to traditional chip architectures. He achieved this by designing a spiking neural network chip to accelerate the next generation of AI — efficient, sustainable and human brain-like.

“GPUs are an old technology — they were originally designed for video games,” Huang says. “The median GPU consumes a huge amount of power. Just think that the latest neural network model GPT-3 generated 552 metric tons of carbon dioxide during training — that’s the CO2 emissions the average American produces over more than 34 years. It’s not a sustainable solution.”

Rigpa’s technology achieves 28 times less power consumption and 23 times faster inference speed than conventional architectures, with key applications in situations that require reliable, real-time computation — think computer vision, drones, smart home appliances, self-driving cars, wearables, high-frequency trading and more.


“National security is a good example of how this technology could be used,” Huang says. “Imagine a police officer working in counterterrorism who’s equipped with a handheld radiation detector connected to their mobile phone, which processes the data from the detector. Rigpa’s new chip can be integrated directly into the detector so that everything happens in there, and the phone is no longer needed.”

At Conception X, Huang learned how to turn blue sky ideas into something tangible. “Conception X has helped to sharpen my mindset. I’m still doing research, but it’s completely different from one year ago,” he says. “Before, I wasn’t sure when or how this technology would be useful. Now, I know in which direction to go and I’m constantly thinking about how a new piece of research I’m working on will feed into my technology.”

Rigpa plans to launch its product on the market in 2024, when demand for neuromorphic technologies is set to take off, and is currently looking to raise.


Great sleuthing Fmf,

This is a paper Huang published disclosing his SNN.


https://arxiv.org/ftp/arxiv/papers/2010/2010.13125.pdf

[Screenshot from the paper]

It appears that one reason why it is slower than Akida is that it is clocked rather than asynchronous:

Each neuron in the same layer triggered by the same presynaptic neuron, is processed one-by-one in a Time Division Multiplexing (TDM) manner. The sequence of the operations and data flow are managed by the state machine in the Control Logic block. As shown in Fig. 5, to handle the worst case of back-to-back spikes from the hidden layer, 9 cycles of delay are added after each spike fire event of any neurons in hidden layer to allow sufficient time for the 9 cycles needed by TDM process of 8 neurons in the output layer.
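
To get a feel for what that serialisation costs, here's a toy back-of-the-envelope sketch. Only the 9-cycle delay and the 8 output neurons come from the paper; the incoming spike count is an arbitrary assumption of mine, and this is obviously not anyone's actual hardware.

```python
# Toy sketch (mine, not Rigpa's RTL or Akida's design): rough cycle count for a
# time-division-multiplexed (TDM) output layer versus a fully parallel one.
# The 9-cycle delay per spike and the 8 output neurons are from the quoted
# paper; the number of incoming spikes is assumed purely for illustration.

HIDDEN_SPIKES = 100   # assumed back-to-back spikes arriving from the hidden layer
OUTPUT_NEURONS = 8    # neurons sharing the single TDM datapath (per the paper)
TDM_DELAY = 9         # cycles reserved per spike so all 8 updates can finish

tdm_cycles = HIDDEN_SPIKES * TDM_DELAY   # serialized updates: 900 cycles
parallel_steps = HIDDEN_SPIKES           # one step per spike if each neuron has its own unit

print(f"TDM:      {tdm_cycles} cycles")
print(f"Parallel: {parallel_steps} steps ({tdm_cycles // parallel_steps}x fewer)")
```

Real hardware is far more involved, of course; the point is just that a shared, clocked datapath serialises work that an event-driven design can handle as the spikes arrive.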
 
  • Like
  • Love
  • Fire
Reactions: 36 users
Oh my goodness, my entire share portfolio is currently down $625.00. That’s right, 62,500 cents! Where will it all end? 🤡🤣😂🤡 Boring flat day at this stage. The WANCAs must be tearing their hair out. 😁
 
  • Haha
  • Like
  • Love
Reactions: 18 users

buena suerte :-)

BOB Bank of Brainchip
Nice to see the BUY side looking a bit healthier! :cool:

654 buyers for 6,459,174 units

390 sellers for 4,490,809 units
 
  • Like
  • Fire
Reactions: 13 users

Quercuskid

Regular
Oh my goodness, my entire share portfolio is currently down $625.00. That’s right, 62,500 cents! Where will it all end? 🤡🤣😂🤡 Boring flat day at this stage. The WANCAs must be tearing their hair out. 😁
Mine's up lol, thanks to a few Patrys and their amazing announcement today!
 
  • Like
  • Love
  • Fire
Reactions: 9 users
Great sleuthing Fmf,

This is a paper Huang published disclosing his SNN.


https://arxiv.org/ftp/arxiv/papers/2010/2010.13125.pdf

View attachment 19110
It appears that one reason why it is slower than Akida is that it is clocked rather than asynchronous:

Each neuron in the same layer triggered by the same presynaptic neuron, is processed one-by-one in a Time Division Multiplexing (TDM) manner. The sequence of the operations and data flow are managed by the state machine in the Control Logic block. As shown in Fig. 5, to handle the worst case of back-to-back spikes from the hidden layer, 9 cycles of delay are added after each spike fire event of any neurons in hidden layer to allow sufficient time for the 9 cycles needed by TDM process of 8 neurons in the output layer.
Thanks D

I try never to discount potential competition, in so much as they may not be better than us but could still be a direct player in our space if the product is close enough.

We know a lot of the other pretenders out there are actually different (general AI etc.) or run a watered-down attempt at an SNN, but this one just seemed a little closer for some reason.

Good to see there is still a key difference that keeps his SNN a bit slower, and I trust our increasing patent moat should ensure he can't encroach too much on our capabilities.

Cheers
 
  • Like
  • Fire
Reactions: 22 users
  • Like
  • Haha
Reactions: 5 users

Diogenese

Top 20
Great sleuthing Fmf,

This is a paper Huang published disclosing his SNN.


https://arxiv.org/ftp/arxiv/papers/2010/2010.13125.pdf

View attachment 19110
It appears that one reason why it is slower than Akida is that it is clocked rather than asynchronous:

Each neuron in the same layer triggered by the same presynaptic neuron, is processed one-by-one in a Time Division Multiplexing (TDM) manner. The sequence of the operations and data flow are managed by the state machine in the Control Logic block. As shown in Fig. 5, to handle the worst case of back-to-back spikes from the hidden layer, 9 cycles of delay are added after each spike fire event of any neurons in hidden layer to allow sufficient time for the 9 cycles needed by TDM process of 8 neurons in the output layer.
More lead in the Rigpa saddlebags:

B. ANN-to-SNN Conversion: The process of conversion from the trained ANN to a SNN model running on SpiNNaker was made up of two steps: quantisation and network conversion. In the quantisation step the ANN parameters were fine-tuned using quantisation aware training available in the TensorFlow Model Optimization Toolkit [9] and the weights were quantised to 8-bit signed integers. Figure 9 (right) shows that the losses due to weight quantisation were negligible. This suggests that the model has a high number of redundant connections and that reduction of the model complexity through further reduction of the precision of the weights or through pruning of connections could give further energy savings in the hardware implementation. Lower precisions were not investigated here due to the lack of software support for precisions lower than 8-bit integer.

To misquote Mick Dundee:

"Those aren't spikes ...

THIS is a spike!"
 
  • Like
  • Haha
  • Fire
Reactions: 24 users

JK200SX

Regular
This has to be confirmation that AKIDA has been incorporated in the NEW EQE!

[Screenshot from the Mercedes press kit, page 20]



Click on the link above, then click on "5 documents" and select the document in English, or whatever other language you can read! (screenshot above is from page 20)
 
  • Like
  • Fire
  • Thinking
Reactions: 46 users

Mn2019

Regular
Thanks JK200SX. Now I will have to go and buy an MB electric car....
 
  • Like
  • Haha
Reactions: 9 users

JK200SX

Regular
This has to be confirmation that AKIDA has been incorporated in the NEW EQE!

View attachment 19116


Click on the link above, then click on "5 documents" and select the document in English, or whatever other language you can read! (screenshot above is from page 20)


[Screenshot attached]
 
  • Like
  • Fire
Reactions: 10 users

alwaysgreen

Top 20
  • Like
  • Love
Reactions: 4 users
So are we saying that, to have Akida in this latest MB release, they would have gone via ARM for the IP?

If they were using it direct from BRN, as per the EQXX reveal, then I would expect a signed agreement of some level with BRN, which would necessitate an Ann.

I'm just not sure as yet personally, as we know the time cycle to design, develop and produce a vehicle, and these would have been in the pipeline for quite some time.

Unless they developed the production Akida integration for this release in parallel with the EQXX concept :unsure:
 
  • Like
Reactions: 10 users

Mn2019

Regular
So are we saying that, to have Akida in this latest MB release, they would have gone via ARM for the IP?

If they were using it direct from BRN, as per the EQXX reveal, then I would expect a signed agreement of some level with BRN, which would necessitate an Ann.

I'm just not sure as yet personally, as we know the time cycle to design, develop and produce a vehicle, and these would have been in the pipeline for quite some time.

Unless they developed the production Akida integration for this release in parallel with the EQXX concept :unsure:
Markus Schäfer did say to "stay tuned"..........
 
  • Like
  • Haha
Reactions: 15 users