BRN Discussion Ongoing

Bravo

If ARM was an arm, BRN would be its biceps💪!
Hi Bravo,

I would put a blue circle around "deep learning and computer vision accelerators".


Whooopsies, I had a feeling I forgot something. 🥴 Now edited, thanks Dodgy Knees! 😘
 
  • Like
  • Haha
Reactions: 8 users

TopCat

Regular

Untether AI Touts “At-Memory” Architecture, Promising Efficiency And Performance At The Edge


What does Untether AI promise?

Untether AI recently emerged from stealth and described its architecture as “At-Memory”, which the company claims will deliver leading performance and power efficiency.


So, let’s look at the architecture. I like the chart below, as it clearly positions Untether AI versus cache-oriented Von Neumann designs. While interesting, the assertion that this architecture is unique is a bit of a reach: other startups have somewhat similar memory architectures, but few can claim the efficiency that Untether AI is demonstrating with early silicon.


Now, here’s the thing: Untether AI claims that its runAI200 chip will crank out over 500 trillion operations per second (TOPS) of 8-bit inference with 200 MB of on-die SRAM, all at 8 TOPS/watt. That implies the chip consumes about 60 watts or so, while potentially cranking out 80% of the inference performance of the industry-leading NVIDIA A100, which consumes 300-500 watts. ( IS 60W TOO MUCH FOR AKIDA ??? )
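For anyone who wants to sanity-check the "about 60 watts" figure, here is a minimal back-of-the-envelope sketch in Python, using only the numbers claimed in the article (these are the company's claims, not measured data):

# Sanity check of the quoted Untether AI figures (claimed values from the article above)
claimed_tops = 500             # claimed 8-bit inference throughput, in TOPS
claimed_tops_per_watt = 8      # claimed efficiency, TOPS per watt

implied_power_w = claimed_tops / claimed_tops_per_watt
print(f"Implied runAI200 power draw: {implied_power_w:.1f} W")  # ~62.5 W, i.e. "about 60 watts or so"

# Compare against the article's NVIDIA A100 range (300-500 W) at the claimed 80% relative performance
for a100_watts in (300, 500):
    print(f"Fraction of A100 power at {a100_watts} W: {implied_power_w / a100_watts:.2f}")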


For applications such as ADAS and mobile autonomy, latency can be as critical as throughput, and Untether seems to shine here. The company is shipping chips, cards, and software by the end of this year to early potential clients.

Conclusions

While a few competitors will claim better power efficiency, none that we are aware of can also claim exceptional inference performance and latency. Untether AI may be setting a new bar here, and may have significant opportunities in performance-demanding applications such as self-driving vehicles.

 
  • Like
  • Haha
Reactions: 9 users

Diogenese

Top 20
VW has been collaborating with Audi and Porsche on LiDAR/LSTM/SpikingNN since at least 2017.





US10133944B2 Digital neuromorphic (NM) sensor array, detector, engine and methodologies

US10387741B2 Digital neuromorphic (NM) sensor array, detector, engine and methodologies

[042] ... The digital NM vision system also can include a digital NM engine 145 that performs image and data processing operations on the velocity vector data generated by the digital NM detector that enables image data processing for improved object detection, classification, and tracking, including machine and deep learning. As such, in accordance with at least one embodiment, the digital NM engine 145 may include one or more processors running software to generate digital NM output data for analysis and subsequent control of components within the environment imaged by the detector 110. Operation of the digital NM engine 145 is further discussed herein with connection to FIGS. 17-20.

That said, more efficient ways of processing the data than "processors running software" are now available.

However, Akida 1000 does not do tracking, so they will need to wait for Akida 2000, which will include LSTM and is being developed at top speed by BRN.

LSTM is probably available in MetaTF/ADE software for simulation testing as we speak.
 
  • Like
  • Fire
  • Love
Reactions: 58 users
@chapman89 my calculator broke just on that many sensors, let alone my convo with PVDM chatting about there being up to 300 sensors on an automobile, so if we get a number between 70 and 300, I'm gonna need another, wider calculator 🤣🤣
Whether 70 or 300 sensors on a car, wouldn't/couldn't they be served by just a few AKIDA chips?
So there may only be 10 or even fewer royalties per car..

With 300 sensors, there wouldn't be 300 "brains"..

We have one brain, for the five senses we use..
 
  • Like
Reactions: 9 users
Hi Bravo,

I would put a blue circle around "deep learning and computer vision accelerators".
Isn't "deep learning" purely cloud-based and the complete opposite of what we do, Diogenese?
 
  • Like
  • Thinking
Reactions: 3 users


"That implies the chip consumes about 60 watts or so, while potentially cranking out 80% of the inference performance of the industry-leading NVIDIA A100, which consumes 300-500 watts"

I think... maybe they should have come out of stealth 5 years ago, when 60 watts of power usage would have been revolutionary 🤣
 
Last edited:
  • Haha
  • Like
Reactions: 6 users

TopCat

Regular
"That implies the chip consumes about 60 watts or so, while potentially cranking out 80% of the inference performance of the industry-leading NVIDIA A100, which consumes 300-500 watts"

I think... maybe they should have come out of stealth 5 years ago, when 60 watts of power usage would have been revolutionary 🤣
Yes thanks… I thought that might have been too much
 
  • Like
Reactions: 3 users

Diogenese

Top 20
Whether 70 or 300 sensors on a car, wouldn't/couldn't they be served by just a few AKIDA chips?
So there may only be 10 or even fewer royalties per car..

With 300 sensors, there wouldn't be 300 "brains"..

We have one brain, for the five senses we use..
If the Akidas are integrated with sensors, there would only be one per sensor, but it would only be a partial Akida, e.g., 4 nodes or so instead of 20. Then there's mission-critical function redundancy. I recall in the early days of electronic ignition, Holden had a "limp-home" function (30 kph), which, on one occasion, came into play returning to Sydney from Wollongong Uni.

With any luck there will be at least 20 full Akida equivalents at, say, $10 per pop. After all, we're not here to strangle the golden-egg-laying goose.
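As a purely hypothetical illustration of that royalty maths, here is a minimal sketch in Python; the 20 "full Akida equivalents" per car and the $10 "per pop" are the example figures from the post above, not anything BrainChip has disclosed:

# Illustrative per-car royalty using the hypothetical figures from the post above
full_akida_equivalents_per_car = 20   # assumed example, not a disclosed figure
royalty_per_equivalent_usd = 10       # assumed "per pop" royalty

royalty_per_car_usd = full_akida_equivalents_per_car * royalty_per_equivalent_usd
print(f"Illustrative royalty per car: ${royalty_per_car_usd}")  # $200 on these assumptions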
 
  • Like
  • Haha
  • Fire
Reactions: 27 users
It is possibly obvious to some, but sometimes I like it spelt out to me; this is Akida giving extended battery time, right?
Nothing screams out that it is, other than the power saving, that every tech company is chasing..

But it shows the relevancy and value, of our IP in Edge devices.
 
  • Like
  • Fire
Reactions: 5 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Check this out team!

I thought these slides from NVIDIA's Investor Day 2021 Presentation were pretty interesting. The first slide shows the NVIDIA + ARM relationship and the second shows a business model with Mercedes for shared revenue for AutoPilot and AI cockpit software per car, which I imagine is like a Netflix-style subscription service to enable features like autonomous driving, from which NVIDIA will presumably take some kind of cut???


funky-dancer-t.gif





[Slide 1: NVIDIA + ARM relationship]



[Slide 2: Mercedes shared-revenue model for AutoPilot and AI cockpit software]



 
  • Like
  • Fire
  • Love
Reactions: 40 users

Diogenese

Top 20
  • Like
  • Fire
  • Love
Reactions: 26 users

Boab

I wish I could paint like Vincent
Check this out team!

I thought these slides from NVIDIA's Investor Day 2021 Presentation were pretty interesting. The first slide shows the NVIDIA + ARM relationship and the second shows a business model with Mercedes for shared revenue for AutoPilot and AI cockpit software per car, which I imagine is like a Netflix-style subscription service to enable features like autonomous driving, from which NVIDIA will presumably take some kind of cut???




Haha, Napoleon Dynamite always makes you feel good.
 
  • Like
  • Haha
Reactions: 8 users

Dozzaman1977

Regular
Love this quote from Rob Telson about the new podcast guest...

"Jan is one of the people who we work with to help drive neuromorphic compute and ML at the Edge to enable more intelligence for billions of IoT devices," said Telson.

BILLIONS BILLIONS BILLIONS
kelly aucoin showtime GIF by Billions
 
  • Like
  • Love
  • Fire
Reactions: 39 users
  • Haha
  • Like
  • Love
Reactions: 10 users

hamilton66

Regular
You have hit the nail on the head DB.

It's a game for big institutions. They can open up 500 small separate entities under an umbrella and accumulate shares without breaching disclosure rules, then buy out the company or merge whatever they see fit.

If we look back at the history of BRN, the financial backers were hard-working Australians. There was limited venture capital money put in; (highly rich) venture capitalists don't usually lose, and they missed the free growth from the 3 to 7 cent range as BRN was under the radar.

The games will be played until they shake out the retail investors with limited knowledge or understanding of AKIDA and what it really means to the globe. Thanks to the 1000 eyes and all the highly intelligent contributors that have tried to help educate us less knowledgeable folks, who may have stumbled across a quick gain, moved on and forgotten about this one. Once they feel they have enough shares they can let the tiger out of the cage. I know it sounds like a huge conspiracy, but if this really is the 4th industrial revolution then they will get their share.

Believe me, it's frustrating to see 6 figure drops, 5 figure drops or 4 figure drops, whatever your level of investment is. I feel a lot of the anger towards these games comes from some people having expected returns too soon. I'm one of those people, but I have evaluated my personal finance plan and decided to keep working hard at my other stuff before I venture off to new things. This was planned for 2022, now it may be 2024, but life is never a straight road. I have made peace with my plans so I'm happy; I don't like the drop, but what can I do?

So I do think some of the anger pushed onto one another is the frustration and helplessness we feel at the high volatility. That said, we need to check ourselves and realise we are all on the same team. A team can win better than individuals. Sometimes people leave the team for various reasons, sometimes we have players that use us for a better return, but you know what, in the end all of us will win.

I will highlight the investment side of the argument: has the company shown signs of growth and traction towards commercialisation? Yes. It's slow, I agree, but I also don't know what other companies' speed of implementation will be. I know for a fact that if you have a shit product in inventory you would still sell it all before implementing the next gen, as anything else would be financial waste. Also, you may have previous agreements you need to fulfil prior to moving forward. As long as your direct competition does not have this problem, you will play that game. So these delays may not be anyone's fault at BRN.

The top 20, yeah, it could be published, and it should be, we were told. If I was really concerned I would ask where the registry is, go view it and put my effort into it. Can we say BRN is dodging that? Yes, but what their reason is we don't know and likely never will. Sorry, but we all have access to the top 20 if you put in the effort. For me, if there was a place to view the registry in Brisbane I would, but I believe it's only in Sydney; would that be correct?

Anyway, the company is moving ahead, hiring and growing. I'm happy to wait and see what 2023 brings; 2022 is more than halfway gone.

Have a good remainder of the weekend folks
K, ur right - 2022 is more than halfway gone. As we sit right now, BRN is up over 50% from 1/1/2022. People are looking at perceived losses from highs, rather than looking at historical gains. I want it all, and I want it now. Those with a l/t strategy need to chill. We're going great. Plenty of positive news flow to come over the coming 6 mths IMO. Patience. GLTA
 
  • Like
  • Fire
  • Love
Reactions: 37 users

skutza

Regular
Love this quote from Rob Telson about the new podcast guest...

"Jan is one of the people who we work with to help drive neuromorphic compute and ML at the Edge to enable more intelligence for billions of IoT devices," said Telson.

BILLIONS BILLIONS BILLIONS
kelly aucoin showtime GIF by Billions
Is it just me, or every time I read the word compute I have to say it in an Anastasia-type accent?
 

Attachment: Screenshot_20220731-190602_YouTube.jpg
  • Haha
  • Fire
  • Love
Reactions: 10 users

RobjHunt

Regular
  • Haha
  • Love
  • Like
Reactions: 5 users

Straw

Guest
Well her sister-in-law (or equivalent) certainly won't have to explain to her why Brainchip is a good investment!
 
  • Like
Reactions: 1 users

Deadpool

hyper-efficient Ai
Is it just me, or every time I read the word compute I have to say it in an Anastasia-type accent?
She does it for me
Adam Sandler GIF
 
  • Like
  • Fire
  • Love
Reactions: 6 users

uiux

Regular
Conversation over beers last night with my brother-in-law (though it could have been anyone near and dear).

Him: “eff sake didn’t win lotto again…”
Me: “mate you’d be better off to put this in stock…”
Him (laughs): “yeah right, like you are some stock portfolio mogul…”
Me: “Nope, but I do have a good pile in Brainchip and frankly this very much has potential to be life changing for us”.
Him: “what, you now gambling like bitcoin…”
🤦‍♂️🤦‍♂️


I then proceeded to try and firstly outline Akida, and then mention all the partnerships and announcements we've had. I could see him glaze over, thinking I was on crack when talking about the huge potential for this to take flight. And then he started his usual way of ridiculing people (very much a defense mechanism when he doesn't understand 🤣😂🤣)

I however also realised there was no page or post which just shows potential use cases, partnerships etc., kinda a one-pager listing all we know so far to show to people.

It would be cool to have such a document to show the skeptics, or does this already exist?

Cheers and happy Sunday!

I've spent 7+ years researching the fuck out of the neuromorphic industry and this is the one-pager I have produced showing the potential use cases for neuromorphic technologies:

 
  • Like
  • Love
  • Fire
Reactions: 96 users