BRN Discussion Ongoing

HUSS

Regular

I can smell Akida inside these Qualcomm premium phones coming to market next year via generative AI on the device. Great read and an interesting article!!
 
  • Like
  • Fire
  • Thinking
Reactions: 23 users

AusEire

Founding Member. It's ok to say No to Dot Joining
Now we have confirmation that the first gen is a flop.

I'm going to assume that you are completely illiterate. The company has never once said that Akida1000 was a flop or failed or didn't work.

They stated that the tech currently available is good enough for what tech companies are trying to achieve. That doesn't mean it couldn't be used in future products that need more power efficiency, etc.

Remember MegaChips and Renesas are due to bring their own products to market soon (this year), having used Akida 1000. Mercedes also used Akida 1000 and said it was 5-10 times better than what they had been working with.

It was a success and continues to be. Otherwise Akida 2.0 or Akida 1500 wouldn't have been developed.

The amount of dribble I've seen on here about this topic is annoying.

If you truly believe what you said, just sell up and short this fucker. I bet you won't, though.
 
  • Like
  • Fire
  • Love
Reactions: 59 users

Dozzaman1977

Regular
I'm going to assume that you are completely illiterate. The company has never once said that Akida1000 was a flop or failed or didn't work.

They stated that the tech currently available is good enough for what tech companies are trying to achieve. That doesn't mean it couldn't be used in future products that need more power efficiency, etc.

Remember MegaChips and Renesas are due to bring their own products to market soon (this year), having used Akida 1000. Mercedes also used Akida 1000 and said it was 5-10 times better than what they had been working with.

It was a success and continues to be. Otherwise Akida 2.0 or Akida 1500 wouldn't have been developed.

The amount of dribble I've seen on here about this topic is annoying.

If you truly believe what you said, just sell up and short this fucker. I bet you won't, though.
Well said. And don't forget Socionext!!!!!! Mass production of the radar chip is due early 2024.
[Screenshot attached: Screenshot_20230827_082247_Firefox.jpg]
 
  • Like
  • Fire
  • Love
Reactions: 51 users

IloveLamp

Top 20

I can smell Akida inside these Qualcomm premium phones coming to market next year via generative AI on the device. Great read and an interesting article!!
Agreed!

(Below for those without LinkedIn)


[Screenshot attached: Screenshot_20230827_082810_LinkedIn.jpg]
 
  • Like
  • Fire
  • Love
Reactions: 19 users

Getupthere

Regular
That's a good way to push the SP down to 5 cents. Throwing your toy on the ground and breaking it because mum and dad haven't put the batteries in yet. The 1st strike was stupidity; the 2nd will be suicide.
The first strike was a warning, and a second strike would be because of no results.

I'm sorry, that's the real world... you have to show results. Investors won't wait forever.
 
  • Like
  • Fire
  • Love
Reactions: 9 users

DK6161

Regular
  • Like
Reactions: 7 users

Neuromorphia

fact collector
  • Like
  • Fire
  • Love
Reactions: 19 users

Fenris78

Regular
It looks like we are in the Renesas R-Car Gen 5 system-on-chip, planned for 2027. I can't see that Akida 1000/1500 is in the Gen 4 R-Car S4?
Is there any evidence that Akida is in anything in the near future? Given Renesas did the 1500 tape-out in Dec 2022, I'm hoping there is a Renesas product planned before 2027.

[Image attached: 1693092196908.png]


 
  • Like
  • Fire
Reactions: 9 users

AusEire

Founding Member. It's ok to say No to Dot Joining
  • Like
  • Thinking
Reactions: 4 users
@Diogenese this is one for you mate. Looks and sounds like Akida but is it? 😅

https://www.nordicsemi.com/News/202...160-in-the-Vodafone-Asset-Solar-asset-tracker

Thanks to Daniel on one of the Facebook group chats for finding this
Might help with the Easter egg hunt
 
  • Like
  • Haha
  • Thinking
Reactions: 5 users

Cartagena

Regular

I can smell Akida inside these Qualcomm premium phones coming to market next year via generative AI on the device. Great read and an interesting article!!

And please watch this awesome on-device AI demo by Qualcomm: https://lnkd.in/gy92AZJZ
 
  • Like
  • Love
  • Fire
Reactions: 8 users

Dozzaman1977

Regular
Can you point us to where it says mass production containing our IP please? Eager to read and confirm
Neuromorphia has put the links after your post. This was spoken of in great detail a few months back.
DYOR GLTAH
 
  • Like
Reactions: 8 users

Cartagena

Regular

[Attachment: 1693094731677.png]
  • Like
  • Love
  • Fire
Reactions: 14 users
Good evening. I know the last HY report has stirred up a lot of concern amongst us, so I just want to be prepared to navigate this situation, and I'm contemplating whether I should average down again if the share price drops next week. Given the known fact that Renesas already taped out an MCU with Akida in it last December, my question is: do you think the SP should be higher than the current level once Renesas announces the availability of an Akida-powered MCU? I personally think it should, so hopefully I've got it right this time.
In my view, if the product revenue from this were going to be meaningfully market-sensitive, we should be seeing some sort of industry buying of BrainChip shares, as they would be the ones with visibility over the scale and/or coming adoption of the product.
 
  • Like
  • Fire
  • Sad
Reactions: 6 users

skutza

Regular
Just putting it out there, but I just watched the TENNs video put out by BrainChip, and I think they highlighted the main reason we are where we are: "Generations ahead of the industry". What we have seems magical, but does the industry need it today? It seems it will come, but is it a case that we are too far ahead, and the industry is saying we don't need to reinvent the wheel just yet? I think we need MegaChips or the like to come out with something so needed that others must follow. Yes, the cloud is getting busy, but are we providing a solution that isn't needed yet? Just a few thoughts; maybe I'm not understanding enough, maybe the need is now?
 
  • Like
  • Thinking
Reactions: 15 users

Diogenese

Top 20
And please watch this awesome on-device AI demo by Qualcomm: https://lnkd.in/gy92AZJZ


Hi Cartagena,

I don't believe there is any evidence that Qualcomm uses Akida, although I wish they did.

They have their in-house Hexagon:

https://www.qualcomm.com/news/onq/2...ile-computing-performance-for-windows-laptops


https://www.theregister.com/2022/11/15/qualcomm_snapdragon_8_gen_2/?td=readmore

Qualcomm pushes latest Arm-powered Snapdragon chip amid bitter license fight

The Snapdragon 8 Gen 2 system-on-chip features eight off-the-shelf cores from Arm, which is locked in a bitter legal fight with Qualcomm over licenses and contracts.
...
This includes an AI acceleration engine that is, we're told, up to 4.35 times faster than the previous generation, and with a potential 60 percent increase in performance-per-watt, depending on how it's used. This unit can be used to speed up machine-learning tasks on the device without any outside help, such as object recognition, and real-time spoken language translation and transcription. The dual-processor engine can handle as low as INT4 precision for AI models that don't need a lot of precision but do need it done fast on battery power, which the 4-bit integer format can afford developers, according to Qualcomm.

Qualcomm is pushing the INT4 capabilities as a precision ideal for modern mobile apps. It said a cross-platform Qualcomm AI Studio is due to be made available in preview form in the first half of next year that will optimize developers' models for this precision as well as other formats. This studio looks like a typical IDE in which programmers can organize their training workflows.
...
The SoC supports up to 200MP image capture and 8K HDR video capture in 10-bit HDR, according to the specifications. Qualcomm said it worked with Samsung and Sony to develop large sensors that the 8 Gen 2 can handle. There are direct paths in the chip that link the Hexagon AI engines to, say, the image-processing units so that pictures and video can be manipulated more efficiently.

The processor, according to Qualcomm, can also be made to reduce the amount of data read and written during neural network inference – which saves power – by breaking input data into not just tiles as other chipsets do, but micro tiles that apparently do a better job of cutting down information transfer.

This is from a quote in a now-deleted Stable genius post on TSEx:

Qualcomm uses all of the Snapdragon SoC’s processing elements for AI processing and calls the combination of these processing elements the “AI engine.” Among the enhancements incorporated into the AI engine was a dedicated power plane and a doubling of the tensor processing cores within the Hexagon processor. The result is a 4.35x improvement in performance and an equally impressive 60% improvement in performance per watt efficiency. Qualcomm also added support for transformer neural network models which are critical for applications like natural language processing (NLP) for speech-to-text and text-to-speech translation. The Hexagon can splice the neural NLP model into smaller elements to run on micro tiles allowing for more efficient use of the processing cores. Additionally, Qualcomm added support for Int4 data structures. In many cases, this lower precision data structure can be used by neural network models like computational photography image enhancement without a noticeable loss of accuracy while improving speed and power efficiency. The end result is faster and more efficient processing of neural network models.
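As an aside, here is a minimal, purely illustrative sketch of what the INT4 weight quantization mentioned in the excerpts above looks like in practice. This is not Qualcomm's implementation; the function names and the simple per-tensor symmetric scheme are my own assumptions for illustration.

```python
# Purely illustrative sketch of symmetric per-tensor INT4 weight quantization.
# Not Qualcomm's implementation; function names and scheme are assumptions.
import numpy as np

def quantize_int4(weights: np.ndarray):
    """Map float weights to signed 4-bit integers in [-8, 7] with one scale per tensor."""
    scale = np.max(np.abs(weights)) / 7.0  # 7 is the largest positive INT4 value
    q = np.clip(np.round(weights / scale), -8, 7).astype(np.int8)  # int8 container for 4-bit values
    return q, scale

def dequantize_int4(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights for inference."""
    return q.astype(np.float32) * scale

w = np.random.randn(4, 4).astype(np.float32)
q, s = quantize_int4(w)
w_hat = dequantize_int4(q, s)
print("max abs quantization error:", np.max(np.abs(w - w_hat)))
```

The point is simply that squeezing weights into the signed 4-bit range [-8, 7] moves roughly a quarter of the data per weight compared with FP16, which is broadly where the speed and power-efficiency claims come from, at the cost of some precision.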


This is from a 2021 Qualcomm patent application:

WO2023049655A1 TRANSFORMER-BASED ARCHITECTURE FOR TRANSFORM CODING OF MEDIA 2021-09-27


[Patent figure: 1693095625400.png]


Systems and techniques are described herein for processing media data using a neural network system. For instance, a process can include obtaining a latent representation of a frame of encoded image data and generating, by a plurality of decoder transformer layers of a decoder sub-network using the latent representation of the frame of encoded image data as input, a frame of decoded image data. At least one decoder transformer layer of the plurality of decoder transformer layers includes: one or more transformer blocks for generating one or more patches of features and determine self-attention locally within one or more window partitions and shifted window partitions applied over the one or more patches; and a patch un-merging engine for decreasing a respective size of each patch of the one or more patches.

...
[0075] The SOC 100 may also include additional processing blocks tailored to specific functions, such as a GPU 104, a DSP 106, a connectivity block 110, which may include fifth generation (5G) connectivity, fourth generation long term evolution (4G LTE) connectivity, Wi-Fi connectivity, USB connectivity, Bluetooth connectivity, and the like, and a multimedia processor 112 that may, for example, detect and recognize gestures. In one implementation, the NPU is implemented in the CPU 102, DSP 106, and/or GPU 104. The SOC 100 may also include a sensor processor 114, image signal processors (ISPs) 116, and/or navigation module 120, which may include a global positioning system.



[Image: Snapdragon8_2Hexagon]

Snapdragon 8 Gen 2 deep dive: Everything you need to know (androidauthority.com)


...
Qualcomm doubled the physical link between the image signal processor (ISP), Hexagon DSP, and Adreno GPU, driving higher bandwidth and lowering latency. This allows the Snapdragon 8 Gen 2 to run much more powerful machine-learning tasks on imaging data right off the camera sensor. RAW data, for instance, can be passed directly to the DSP/AI Engine for imaging workloads, or Qualcomm can use the link to upscale low-res gaming scenarios to assist with GPU load balancing.
 
  • Like
  • Fire
  • Love
Reactions: 18 users
Just putting it out there, but I just watched the TENNs video put out by BrainChip, and I think they highlighted the main reason we are where we are: "Generations ahead of the industry". What we have seems magical, but does the industry need it today? It seems it will come, but is it a case that we are too far ahead, and the industry is saying we don't need to reinvent the wheel just yet? I think we need MegaChips or the like to come out with something so needed that others must follow. Yes, the cloud is getting busy, but are we providing a solution that isn't needed yet? Just a few thoughts; maybe I'm not understanding enough, maybe the need is now?
It's the same issue in the energy space.

There's technology available to supersede oil, gas, coal, and wind-generated power, and there has been since the mid-1900s.

Unfortunately it's not about whether the technology is good. It's whether it gets adopted.

BRN clearly have the lead in the AI Edge space, but they are busting every valve to try to get it used and adopted, which is the ultimate challenge in my view.

They probably need to get the right powerful people invested in BRN to then get it the adoption that would make it ubiquitous. IMO DYOR
 
  • Like
  • Fire
Reactions: 6 users
It's the same issue in the energy space.

There's technology available to supersede oil, gas, coal, and wind-generated power, and there has been since the mid-1900s.

Unfortunately it's not about whether the technology is good. It's whether it gets adopted.

BRN clearly have the lead in the AI Edge space, but they are busting every valve to try to get it used and adopted, which is the ultimate challenge in my view.

They probably need to get the right powerful people invested in BRN to then get it the adoption that would make it ubiquitous. IMO DYOR
Do you think part of the lack of adoption is because some companies have already made deep investments in developing other tech and want to see a return on that, to save face, before turning over a new leaf and looking at other options?

Unrelated but interesting.
Mineralogy meets zero-shot computer vision
 
  • Like
  • Fire
Reactions: 5 users

robsmark

Regular
Do you think part of the lack of adoption is because some companies have already made deep investments in developing other tech and want to see a return on that, to save face, before turning over a new leaf and looking at other options?

Unrelated but interesting.
Mineralogy meets zero-shot computer vision
I agree that one cow needs to be milked before they move on to another.
 
  • Like
  • Fire
Reactions: 8 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
There are a few things that really boost my confidence when thinking about Qualcomm and they are, in no particular order:
  • Teksun had an existing ecosystem partnership with Qualcomm and yet they still chose to partner with us, presumably because we can offer technological capabilities that Qualcomm isn't in a position to provide. This is what Brijesh Kamani (Founder & CEO of Teksun) said at the time.
“We realized very early on in our partnership the advantages of BrainChip’s Neuromorphic architecture and its efficient performance with low power consumption,” said Brijesh Kamani – Founder & CEO of Teksun. “But apart from these capabilities which make BrainChip’s Akida critical for our current and future customers, it is the ease of adoption that enables them to move rapidly from concept to solution.”
  • In a similar vein, we know that Mercedes also works with Qualcomm but decided to announce publicly the employment of our technology within the Vision EQXX concept car, with its emphasis on intuition, perception and real-time prediction, and with confirmation that this technology will be employed in production versions. Interestingly, Google will be deeply integrated within the Mercedes user interface (UI/UX), the same place where the "Hey Mercedes" voice assistant resides, with hot-word detection made 5-10 times more efficient than conventional voice control thanks to AKIDA. Here's some of what Mercedes had to say in January 2022.
“Working with California-based artificial intelligence experts BrainChip, Mercedes-Benz engineers developed systems based on BrainChip’s Akida hardware and software,” Mercedes noted in a statement describing the Vision EQXX.

“Although neuromorphic computing is still in its infancy, systems like these will be available on the market in just a few years,” Mercedes said. “When applied at scale throughout a vehicle, they have the potential to radically reduce the energy needed to run the latest AI technologies.”

  • Then there's Prophesee, and I doubt you could get a better endorsement of our technology than those given in the past by Luca Verre, and they have agreed to a multi-year collaboration with Qualcomm. So, hopefully, we'll go wherever Prophesee goes.
We have partnered with BrainChip the world's first commercial producer of neuromorphic AI IP, to deliver next-generation platforms for OEMs looking to integrate PROPHESEE event-based vision sensors systems with high levels of AI performance coupled with ultra-low power technologies. One more step toward the fulfilment of our neuromorphic vision! - Luca Verre LinkedIn

There are probably a few more things I could have added, but that's hopefully enough to demonstrate my point for now.
 
  • Like
  • Love
  • Fire
Reactions: 57 users