BRN Discussion Ongoing

“Description. Onsemi is a supplier of power semiconductors and sensors focused on the automotive and industrial markets. Onsemi is the second-largest power chipmaker in the world and the largest supplier of image sensors to the automotive market.”

Its revenue for 2023 was over US$8 billion.

Infineon achieves record revenue and earnings in the 2023 fiscal year and ...

15 Nov 2023 — The Company has around 58,600 employees worldwide and generated revenue of about €16.3 billion

For those who follow and like to deal in facts: in the last few weeks Unigen, Microchip, Infineon and OnSemi have been revealed.

Last year Rob Telson stated there were hundreds of undisclosed companies testing AKIDA technology.

These four companies must now be removed from that group and added to the list of disclosed companies and institutions with which Brainchip is working.

This takes us to 55 known relationships.

The unknown for ease of calculation could be described as 100s minus 4 companies. 🤣🤡😂

My opinion only DYOR
Fact Finder
 
  • Like
  • Love
  • Fire
Reactions: 97 users

IloveLamp

Top 20
Fact Finder said:
“… in the last few weeks Unigen, Microchip, Infineon and OnSemi have been revealed … This takes us to 55 known relationships.”
 
  • Like
  • Haha
Reactions: 8 users

toasty

Regular
Fact Finder said:
“… in the last few weeks Unigen, Microchip, Infineon and OnSemi have been revealed … This takes us to 55 known relationships.”
And still no-one is willing to say what products AKIDA will be used in and when those products might begin shipping.........VERY frustrating.
 
  • Like
  • Fire
Reactions: 10 users

wilzy123

Founding Member
If you are willing to say sorry for your cult comment, FF may share.
Interesting you think the place is a cult, but not enough to stop suckling on the teat of information this place is built on.

Edit: Hint - look at the podcast thread

The level of DD on display is honestly staggering..............
 
  • Haha
  • Like
  • Love
Reactions: 5 users
Hi Bravo
Just wait until you listen to the OnSemi representative talk about their joint demonstration of a smart Airbag AKIDA Sensor for automotive applications.
My opinion only DYOR
Fact Finder
Is there a link to that, Fact Finder?
 
  • Like
  • Love
Reactions: 5 users

TheDrooben

Pretty Pretty Pretty Pretty Good
Little mention..........

"From industry giants like Siemens to startups like BrainChip, AI is opening new ways to engage and interact with smart technology."

Happy as Larry
 
  • Like
  • Fire
  • Love
Reactions: 47 users
And still no-one is willing to say what products AKIDA will be used in and when those products might begin shipping.........VERY frustrating.
Hi Toasty
It is a feature of the industry they inhabit.

ARM, for example, supplies 90 percent of the IP used in the chips that ship in mobile phones.

The reality, however, is that the different companies making mobile phones all want to differentiate themselves from competitors, which would be impossible if they boasted about the ARM IP they were using.

On top of this product development cycles run to years.

It is for each individual to judge what constitutes success, but personally, even if no more is revealed at CES, I will be very happy after today.

The thing is, at the very least Tata Elxsi will still be presenting with Brainchip, so clearly there is more to come.

The AKIDA VVDN Box has yet to be fully revealed.

My opinion only DYOR
Fact Finder
 
  • Like
  • Fire
  • Love
Reactions: 67 users

Damo4

Regular
Lock up your children, it's the TSEx final boss
 
  • Haha
  • Fire
  • Like
Reactions: 28 users
Interesting you think the place is a cult, but not enough to stop suckling on the teat of information this place is built on.
Hahhaha... Never mind.

Enjoy the rest of your day!
 
  • Fire
Reactions: 1 users
Is there a link to that, Fact Finder?
Hi DB

These are the links to the three podcasts from today:
Regards
Fact Finder
 
  • Like
  • Love
  • Fire
Reactions: 41 users


JB49

Regular
Things are starting to get interesting. Four big companies coming out of the woodwork in the last three weeks. It's got me thinking: imagine how many other companies we are working with behind the scenes that will choose never to make the partnership public, by utilising Renesas or MegaChips. After all, we were told to watch the financials...
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 34 users

keyeat

Regular
No BRN involved (as far as I know), but a very interesting approach.



"Introducing r1. Watch the keynote. Order now: https://t.co/R3sOtVWoJ5 #CES2024"
rabbit inc. (@rabbit_hmi), January 9, 2024
 
  • Like
  • Haha
Reactions: 5 users

Tothemoon24

Top 20

Ambarella Showcases N1 SoC Series at CES, Enhancing Edge AI with Efficient LLMs

SANTA CLARA, Calif., Jan. 8, 2024 — Ambarella, Inc., an edge AI semiconductor company, today announced during CES that it is demonstrating multi-modal large language models (LLMs) running on its new N1 SoC series at a fraction of the power-per-inference of leading GPU solutions.
Ambarella aims to bring generative AI—a transformative technology that first appeared in servers due to the large processing power required—to edge endpoint devices and on-premise hardware, across a wide range of applications such as video security analysis, robotics and a multitude of industrial applications.
Ambarella will initially be offering optimized generative AI processing capabilities on its mid to high-end SoCs, from the existing CV72 for on-device performance under 5W, through to the new N1 series for server-grade performance under 50W. Compared to GPUs and other AI accelerators, Ambarella provides complete SoC solutions that are up to 3x more power-efficient per generated token, while enabling immediate and cost-effective deployment in products.
“Generative AI networks are enabling new functions across our target application markets that were just not possible before,” said Les Kohn, CTO and co-founder of Ambarella. “All edge devices are about to get a lot smarter, with our N1 series of SoCs enabling world-class multi-modal LLM processing in a very attractive power/price envelope.”
“Virtually every edge application will get enhanced by generative AI in the next 18 months,” said Alexander Harrowell, Principal Analyst, Advanced Computing at Omdia. “When moving genAI workloads to the edge, the game becomes all about performance per watt and integration with the rest of the edge ecosystem, not just raw throughput.”
All of Ambarella’s AI SoCs are supported by the company’s new Cooper Developer Platform. Additionally, in order to reduce customers’ time-to-market, Ambarella has pre-ported and optimized popular LLMs, such as Llama-2, as well as the Large Language and Video Assistant (LLava) model running on N1 for multi-modal vision analysis of up to 32 camera sources. These pre-trained and fine-tuned models will be available for partners to download from the Cooper Model Garden.
For many real-world applications, visual input is a key modality, in addition to language, and Ambarella’s SoC architecture is natively well-suited to process video and AI simultaneously at very low power. Providing a full-function SoC enables the highly efficient processing of multi-modal LLMs while still performing all system functions, unlike a standalone AI accelerator.
Generative AI will be a step function for computer vision processing that brings context and scene understanding to a variety of devices, from security installations and autonomous robots to industrial applications. Examples of the on-device LLM and multi-modal processing enabled by this new Ambarella offering include: smart contextual searches of security footage; robots that can be controlled with natural language commands; and different AI helpers that can perform anything from code generation to text and image generation.
Most of these systems rely heavily on both camera and natural language understanding, and will benefit from on-device generative AI processing for speed and privacy, as well as a lower total cost of ownership. The local processing enabled by Ambarella’s solutions also perfectly suits application-specific LLMs, which are typically fine-tuned on the edge for each individual scenario; versus the classical server approach of using bigger and more power-hungry LLMs to cater to every use case.
Based on Ambarella’s powerful CV3-HD architecture, initially developed for autonomous driving applications, the N1 series of SoCs repurposes all this performance for running multi-modal LLMs in an extremely low power footprint. For example, the N1 SoC runs Llama2-13B with up to 25 output tokens per second in single-streaming mode at under 50W of power. Combined with the ease-of-integration of pre-ported models, this new solution can quickly help OEMs deploy generative AI into any power-sensitive application, from an on-premise AI box to a delivery robot.
Both the N1 SoC and a demonstration of its multi-modal LLM capabilities will be on display this week at the Ambarella exhibition during CES.
About Ambarella
Ambarella’s products are used in a wide variety of human vision and edge AI applications, including video security, advanced driver assistance systems (ADAS), electronic mirror, drive recorder, driver/cabin monitoring, autonomous driving and robotics applications. Ambarella’s low-power systems-on-chip (SoCs) offer high-resolution video compression, advanced image and radar processing, and powerful deep neural network processing to enable intelligent perception, fusion and planning. For more information, please visit ambarella.com.
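For anyone who wants to sanity-check the headline numbers above, here is a minimal back-of-envelope sketch (my own arithmetic from the press-release claims, not an Ambarella figure; the quoted rates are maxima, so this is a rough upper bound on energy per generated token):

n1_power_w = 50.0        # "under 50W of power" claimed for the N1 SoC
n1_tokens_per_s = 25.0   # "up to 25 output tokens per second" on Llama2-13B

joules_per_token = n1_power_w / n1_tokens_per_s          # ~2 J per generated token
implied_gpu_joules_per_token = 3 * joules_per_token      # ~6 J, if the "up to 3x" claim holds
print(f"N1: ~{joules_per_token:.1f} J/token; implied GPU reference: ~{implied_gpu_joules_per_token:.0f} J/token")

In other words, the claims work out to roughly 2 joules per generated token on the N1, versus an implied 6 joules per token for the GPU solutions Ambarella is comparing against.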
 
  • Like
  • Fire
  • Thinking
Reactions: 15 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
@Bravo, are you in the car or are you sitting on the windscreen? Hard to tell. If it's on the windscreen don't scratch the paintwork when you get off! :)

SC
Hi Space Cadet,

I'm actually with my mate Willy in the front seat of the car and we're pumping out some tunes together on the new sound system. Willy just asked me if I want to record a song with him about the losers on the Crapper. He reckons it's totally vibing and going to go nuts when it's realised! Here's a bit of a teaser on the hip-hop style beats we've been working on.


Yo, don't step into The Crapper, where it's full of chatter,
TomandJerry and Deadrise, the main clatter,
Critiquing BrainChip, like they're the brain master,
But we know their game is a disaster.

Always down-ramping achievements, spreading discontent,
They jump on TSEx, with aliases, a failed attempt,
Mt09 and FabricatedLunacy, trying to deceive,
But TSEx members are sharp, we won't be naive.

In The Crapper, they talk, trying to bring us down,
But we stand tall, we won't let them wear the crown,
BrainChip rising, achievements profound,
No room for dum-dums, in this TSEx forum town.

TSEx ain't fooled, by your sneaky moves,
Mt09 and FabricatedLunacy, with your fake grooves,
We see through the masquerade, the charade,
In this game of wits, we won't be played.

In The Crapper, they talk, trying to bring us down,
But we stand tall, we won't let them wear the crown,
BrainChip rising, achievements profound,
No room for ding-dongs, in this forum town.

We're building bridges, not walls,
BrainChip's success, as the market calls,
TSEx united, we stand strong,
Against the critics, proving them wrong.

TomandJerry, Deadrise, take a jump,
Your negativity won't define our bump,
BrainChip innovating, reaching the pump,
While you're stuck in The Crapper taking a dump.
 
Last edited:
  • Haha
  • Like
  • Love
Reactions: 68 users

Iseki

Regular
WAT are you even talking about?

Rob Telson likes this, but Pat Benatar likes this even more!

Hey Wilzy, please don't take this the wrong way, but AI, not even neuromorphic AI, won't help your self-hatred. It will only encourage it.

Take care!

Iseki
 
  • Like
Reactions: 3 users