BRN Discussion Ongoing

Esq.111

Fascinatingly Intuitive.
Afternoon MDhere ,

Have not added ERICSSON yet ... though they have been / are playing with our tech, as per the documents revealed.

Need a little more time and proof ... confirmation from their or our chief Indian.

THE BRAINCHIP SCROLL.

Note: there may well be others which I have missed ... so this should by no means be considered exhaustive.

Regards,
Esq.
 

Attachments

  • 20240307_180933.jpg
    2.3 MB · Views: 86
  • Like
  • Love
  • Fire
Reactions: 30 users

Tothemoon24

Top 20

Interesting development.​


If this is accurate, & our lead in the neuromorphic SNN space is correct ….?

Then we would be part of this ….?

If this is accurate & we are not part of it ,… 🤐

KAIST researchers develop world's first 'neuromorphic' AI chip​

Neuromorphic computing technology aims to develop integrated circuits mimicking the human nervous system so that chips can perform more sophisticated tasks. [SHUTTERSTOCK]
A research team at KAIST has developed the world’s first AI semiconductor capable of processing a large language model (LLM) with ultra-low power consumption using neuromorphic computing technology.

The technology aims to develop integrated circuits that mimic the human nervous system so that chips can perform more sophisticated tasks requiring adaptation and reasoning with far less energy consumption.


The Science Ministry said Wednesday that the team, led by Prof. Yoo Hoi-jun at the KAIST PIM Semiconductor Research Center, developed a “Complementary-Transformer” AI chip, which processes GPT-2 with an ultra-low power consumption of 400 milliwatts and a high speed of 0.4 seconds, according to the Ministry of Science and ICT.


A rendered image comparing the performance of different types of processors [YONHAP]

The 4.5-millimeter-square chip, developed using Korean tech giant Samsung Electronics' 28-nanometer process, consumes 625 times less power than global AI chip giant Nvidia's A100 GPU, which requires 250 watts of power to process LLMs, the ministry explained.

The chip is also 41 times smaller in area than the Nvidia model, enabling it to be used on devices like mobile phones and thereby better protecting user privacy.
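As a rough sanity check, the quoted ratios follow directly from the article's own numbers; the only assumption in this minimal sketch is reading "4.5-millimeter-square" as a 4.5 mm x 4.5 mm die:

```python
# Sanity check of the power and area ratios quoted in the article.
a100_power_w = 250.0      # Nvidia A100 power figure cited above
kaist_power_w = 0.400     # 400 milliwatts for the KAIST chip

print(f"Power ratio: {a100_power_w / kaist_power_w:.0f}x")  # -> 625x, as claimed

# Assumption: "4.5-millimeter-square" means a 4.5 mm x 4.5 mm die.
kaist_area_mm2 = 4.5 * 4.5
print(f"Implied reference GPU area: {kaist_area_mm2 * 41:.0f} mm^2")  # ~830 mm^2
```

The implied reference area of roughly 830 mm² is in line with the commonly reported figure of about 826 mm² for the A100's GA100 die, so the 41x area claim is at least internally consistent under that reading.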

The KAIST team has demonstrated various language processing tasks with its LLM accelerator on Samsung's latest smartphone model, the Galaxy S24, Kim Sang-yeob, a researcher on the team, told reporters at a press briefing. The S24 is the world's first smartphone model with on-device AI, featuring real-time translation for phone calls and improved camera performance.

The ministry said the utilization of neuromorphic computing technology, which functions like a human brain, specifically spiking neural networks (SNNs), is essential to the achievement.

Previously, the technology was less accurate than deep neural networks (DNNs) and mainly capable of simple image classification, but the research team succeeded in improving its accuracy to match that of DNNs so it could be applied to LLMs.

The team said its new AI chip optimizes computational energy consumption while maintaining accuracy by using a unique neural network architecture that fuses DNNs and SNNs and effectively compresses the large parameters of LLMs.
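For readers unfamiliar with the spiking side of that fusion, the basic building block of an SNN is a neuron that only does work when an event arrives. A minimal leaky integrate-and-fire sketch (a generic textbook model, not KAIST's or BrainChip's actual circuit) looks something like this:

```python
import numpy as np

def lif_neuron(input_current, threshold=1.0, leak=0.9):
    """Generic leaky integrate-and-fire neuron: integrates input, leaks over time,
    and emits a binary spike whenever the membrane potential crosses the threshold."""
    v = 0.0
    spikes = []
    for i in input_current:
        v = leak * v + i          # leaky integration of incoming activity
        if v >= threshold:
            spikes.append(1)      # fire an event
            v = 0.0               # reset after firing
        else:
            spikes.append(0)
    return spikes

rng = np.random.default_rng(1)
print(lif_neuron(rng.uniform(0.0, 0.5, size=20)))
```

Because downstream computation is only triggered by spikes, energy scales with activity rather than with layer size, which is the usual efficiency argument for neuromorphic accelerators.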

“Neuromorphic computing is a technology global tech giants, like IBM and Intel, failed to realize. We believe we are the first to run an LLM with an ultra-low power neuromorphic accelerator,” Yoo said.

BY PARK EUN-JEE, YONHAP [park.eunjee@joongang.co.kr]
 
Last edited:
  • Like
  • Thinking
  • Wow
Reactions: 19 users

cosors

👀
  • Fire
  • Like
  • Wow
Reactions: 7 users

TECH

Regular
The difference is though, he hasn't left and stopped posting, he's just changed forums.

His integrity was called into question here and instead of being a man and admitting to his mistakes and errors of judgement, he told Frangipani she should "go and get some help"?

And that he felt it no longer necessary to post about BRN, as he had better things to do?

I agree, he is an excellent poster and great researcher who excels at communication and, much more often than not, has a very insightful point of view.
The prodigiousness and quality of his posting was/is incredible.

I'm not enamoured, though, of someone who thinks they are beyond reproach and is not able to handle criticism or admit fault.

Hi Dingo ... there is so much I could say about FF. Yes, he is a solid researcher, possibly obsessive; I have had two run-ins with him over the years, but I have since made the choice to focus on what I know best, that being the staff working for us at Brainchip. Yes, that's correct, us, the major stakeholders, the public.

Why would any genuine shareholder who left hot crapper to join us at TSE, after all the crap over there, go back? I haven't bothered since I left years ago. Unless you like confrontation, why the F would you waste your time?

The real strength of someone's personality is to admit when you're wrong; it truly reflects one's character. Sadly, many people can't ... for example, even my own father or my ex-partner of 19 years couldn't ... it's a lot more common than you think. It's OK to be wrong "sometimes"!

Today I had a visitor at my villa for over 4.5 hours, a New Zealand-based Brainchip shareholder. We had a great conversation about Brainchip, among other topics, and I confirmed that I had no insider information to share with him. Why? Because our company runs a tight, professional operation, end of story.

Announcements will come this year, I'm picking from the second half onwards. We shall see; if things don't ramp up a notch in the second half I will be slightly, yes only slightly, disappointed ... Love our company ♦️AKIDA
 
  • Like
  • Love
  • Fire
Reactions: 33 users

Interesting development

KAIST researchers develop world's first 'neuromorphic' AI chip

[Post and article quoted in full above.]
"Neuromorphic computing is a technology that even companies like IBM and Intel have not been able to implement, and we are proud to be the first in the world to run the LLM with a low-power neuromorphic accelerator," Yoo said.

It was addressed yesterday, but it is very interesting that the claim in this article isn't as certain 🤔..

“Neuromorphic computing is a technology global tech giants, like IBM and Intel, failed to realize. We believe we are the first to run an LLM with an ultra-low power neuromorphic accelerator,” Yoo said.
 
  • Like
  • Thinking
Reactions: 7 users
A Horrible and Dumb thing to say that makes me also feel like I don’t want to spend much time on this forum.
Simple. Ignore it. Continue to enjoy the forum. I did ages ago. You are a valuable contributor, so use the tools available here to continue on without hindrance.
 
  • Like
Reactions: 5 users

Tothemoon24

Top 20
"Neuromorphic computing is a technology that even companies like IBM and Intel have not been able to implement, and we are proud to be the first in the world to run the LLM with a low-power neuromorphic accelerator," Yoo said.

It was addressed yesterday, but it is very interesting that the claim in this article isn't as certain 🤔..

“Neuromorphic computing is a technology global tech giants, like IBM and Intel, failed to realize. We believe we are the first to run an LLM with an ultra-low power neuromorphic accelerator,” Yoo said.
Apologies, I didn't see yesterday's discussion that said ... Perhaps our apparent lead is questionable?
 
  • Like
Reactions: 1 users
A nice.....hmmmm ;)



Eric Feuilleaubois (Ph.D)
Deep Learning / ADAS / Autonomous Parking chez VALEO // Curator of Deep_In_Depth news feed
4d

BrainChip showcases AI-enabled human behavioral analysis with Akida neuromorphic computing | Biometric Update

https://www.biometricupdate.com



 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 22 users

Slade

Top 20
Ok, here I go again @skutza, this is an anonymous forum about a stock that is absolutely incredible. It's not a forum for judging whether one could live with another. Also @DingoBorat, there are two sides to the Frangipani and FF story; both had merits, but both saw different views, and I agree that "go and get help" wasn't exactly positive. It was said in regard to Frangi's bible of a story, which quite frankly wasn't needed on this forum.

Now I admit to watching MAFS, but I certainly don't expect my fellow BRNers to be wondering what underwear I wear and whether I'm a suitable companion depending on my texts on here lol

I for one am happy with some long-term holders posting on HC as it keeps the stupid site in check a bit (although I have chosen not to, I do frequent it and put likes and so forth up). Though at times I want to throttle that Dean guy lol

And lastly @Slade, Mate (sounds so aussie when I write that), Mate, don't you dare depart from this forum, or all my beer shouting is off! lol

On a side note, I can't get long skip connections out of my head, I think they are also known as long residual connections, as Sean has mentioned this a number of times, stating "our customers" asked for this and we have delivered what they are asking for.

To me that long skip connection sounds like it has a medical purpose. The other thing was somatosensory, which I believe will help with assisting artificial limb movement. And the US Department of Homeland Security has been playing with Akida in the past, which came up when I googled the skip/residual connections.
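As an aside, for anyone wondering what a "long skip connection" actually is, here is a minimal, purely illustrative Keras sketch (not Akida's or any customer's architecture): an early feature map is carried forward and added back in several layers later, which helps fine detail and gradients survive deep networks, something segmentation-style medical imaging models rely on heavily.

```python
import tensorflow as tf
from tensorflow.keras import layers

inputs = tf.keras.Input(shape=(64, 64, 3))
early = layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)

# Several intermediate layers sit between the two ends of the skip.
x = layers.Conv2D(32, 3, padding="same", activation="relu")(early)
x = layers.Conv2D(32, 3, padding="same", activation="relu")(x)
x = layers.Conv2D(32, 3, padding="same", activation="relu")(x)

# The "long skip" (long residual connection): the early feature map
# is added back in well downstream, not just one block later.
x = layers.Add()([x, early])

outputs = layers.Conv2D(1, 1, activation="sigmoid")(x)
model = tf.keras.Model(inputs, outputs)
model.summary()
```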

Oh, and FF said something about Ericsson. I don't see it on our iceberg nor from our resident flowcharting, pencil-pushing, A4-paper detective (can't remember, is that you @Esq.111?), or it might be someone else, but I loved it and it probably needs an update?

Cheers fellow BRNers. That's it for me for now, peace to all, and I will no doubt reappear tomorrow as we are all strapped in :)
MD, Did you say beer? I’m not going anywhere.
 
  • Haha
  • Love
  • Like
Reactions: 25 users

MDhere

Regular
Afternoon MDhere ,

Have not added ERICSSON yet ... though they have been / are playing with our tech, as per the documents revealed.

Need a little more time and proof ... confirmation from their or our chief Indian.

THE BRAINCHIP SCROLL.

Note: there may well be others which I have missed ... so this should by no means be considered exhaustive.

Regards,
Esq.
Super, that's @Esq.111. I had an old version, so thanks for the update and feather pen :)
 
  • Like
  • Fire
Reactions: 4 users
The difference is though, he hasn't left and stopped posting, he's just changed forums.

His integrity was called into question here and instead of being a man and admitting to his mistakes and errors of judgement, he told Frangipani she should "go and get some help"?
1709801636432.gif
 
  • Haha
  • Like
Reactions: 3 users

Makeme 2020

Regular
  • Like
Reactions: 1 users

JDelekto

Regular
Hi JD,

That's some impressive technowizardry.

As you know, PvdM's "4 Bits are enough" white paper discusses the advantages of 4-bit quantization.

https://brainchip.com/4-bits-are-enough/
...
4-bit network resolution is not unique. Brainchip pioneered this Machine Learning technology as early as 2015 and, through multiple silicon implementations, tested and delivered a commercial offering to the market. Others have recently published papers on its advantages, such as IBM, Stanford University and MIT.

Akida is based on a neuromorphic, event-based, fully digital design with additional convolutional features. The combination of spiking, event-based neurons, and convolutional functions is unique. It offers many advantages, including on-chip learning, small size, sparsity, and power consumption in the microwatt/milliwatt ranges. The underlying technology is not the usual matrix multiplier, but up to a million digital neurons with either 1, 2, or 4-bit synapses. Akida’s extremely efficient event-based neural processor IP is commercially available as a device (AKD1000) and as an IP offering that can be integrated into partner System on Chips (SoC). The hardware can be configured through the MetaTF software, integrated into TensorFlow layers equating up to 5 million filters, thereby simplifying model development, tuning and optimization through popular development platforms like TensorFlow/Keras and Edge Impulse. There are a fast-growing number of models available through the Akida model zoo and the Brainchip ecosystem.

To dive a little bit deeper into the value of 4-bit, in its 2020 NeurIPS paper IBM described the various pieces that are already present and how they come together. They prove the readiness and the benefit through several experiments simulating 4-bit training for a variety of deep-learning models in computer vision, speech, and natural language processing. The results show a minimal loss of accuracy in the models’ overall performance compared with 16-bit deep learning. The results are also more than seven times faster and seven times more energy efficient. And Boris Murmann, a professor at Stanford who was not involved in the research, calls the results exciting. “This advancement opens the door for training in resource-constrained environments,” he says. It would not necessarily make new applications possible, but it would make existing ones faster and less battery-draining “by a good margin.”

While I have some understanding of the visual aspect, I find the NLP, covering both speech and text, more perplexing mainly because of the need for context or "attention", but, as usual, Prof Wiki has some useful background:

https://en.wikipedia.org/wiki/Natural_language_processing
I've found that, depending on the model, 4-bit quantization is often sufficient to get a decent result. From what I have seen so far, it looks like the impact of lower quantization on accuracy also depends on the amount of data used to train the model as well as the number of parameters.

For smaller tasks like speech and handwriting recognition, ResNet-50 image recognition, etc., I think that 4-bit works fine, but I've seen some of the coding LLMs suffer from lower than 6-bit quantization when using them.

I'll have to look for a chart comparing the different models with quantization and its effects. However, with more data and more parameters, and depending upon the model, they can take up large amounts of memory, which you may want to keep very low to reduce costs for edge devices.
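To make the bit-width trade-off concrete, here is a minimal NumPy sketch of symmetric uniform weight quantization (a generic illustration, not BrainChip's MetaTF flow or any specific LLM quantizer): the coarser the grid, the larger the reconstruction error, which is one reason large coding LLMs tend to degrade faster at 4 bits than small vision or keyword models.

```python
import numpy as np

def quantize_symmetric(w, bits):
    """Symmetric uniform quantization of a weight tensor to signed `bits`-bit integers."""
    qmax = 2 ** (bits - 1) - 1            # e.g. 7 levels either side of zero for 4-bit
    scale = np.max(np.abs(w)) / qmax      # one scale per tensor (per-channel is also common)
    q = np.clip(np.round(w / scale), -qmax, qmax)
    return q, scale

def dequantize(q, scale):
    return q * scale

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.05, size=(256, 256))   # toy weight matrix

for bits in (8, 6, 4, 2):
    q, s = quantize_symmetric(w, bits)
    err = np.abs(w - dequantize(q, s)).mean()
    print(f"{bits}-bit: mean absolute reconstruction error = {err:.6f}")
```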
 
  • Like
  • Fire
  • Love
Reactions: 15 users

Interesting development.

KAIST researchers develop world's first 'neuromorphic' AI chip

[Post and article quoted in full above.]
If we are not a part of this, then we are in deep poo IMO, as all the talk to date about our 3-year lead in neuromorphic and the discussion of BRN and SNN uniqueness will be thrown out the door, and I will be very disappointed. I do think we may never know the details of many partners; however, explosive growth this year may, or should, be coming from this new product as it goes on sale in the second quarter of this year.
 
  • Like
  • Fire
Reactions: 5 users
Apologies, I didn't see yesterday's discussion that said ... Perhaps our apparent lead is questionable?
It's a bit strange...
Because the title is even claiming "World's first 'neuromorphic' A.I. chip"..

And then they go on to "mention" TrueNorth and Loihi?..

Unless they are distinguishing it as the first neuromorphic "A.I." chip and not just the first neuromorphic chip?
Sounds more like a marketing angle than a statement of fact..

BrainChip doesn't even claim that, and of course they shouldn't, but they do claim the world's first 'commercial' neuromorphic architecture..

What they have achieved is possibly ahead of us in that area (we will not know without comment from the Company) but not necessarily ahead in other areas...

@Diogenese says he can see no "secret sauce", which is encouraging.

We have no idea of this chip's abilities in other areas, and also remember that they are not yet commercial.
 
Last edited:
  • Like
  • Fire
  • Thinking
Reactions: 10 users

GStocks123

Regular
  • Like
  • Fire
Reactions: 8 users

Earlyrelease

Regular
Just a little thought to ponder.

The NDAs and secrecy surrounding our product are expected and demanded by what will be potential future customers. Whilst frustrating, I am fine with it.

The reason I am fine comes from thinking a little long term. Imagine you have engaged BRN and have the secret sauce quietly being tried and tested in your products and, with luck, being developed into a new product range or line. Now, as the CEO or a board member of one of these companies, things can't get better. They are patting themselves on the back and smugly bragging about how clever they are in being an industry leader and having a jump on the crowd.
Now imagine the urgent out-of-session board meeting called when your opposition drops a public statement saying they have a product, or are using the secret sauce. Do we declare now? Do we risk losing existing customers who may seek advice from the opposition's sales staff, or do we quickly spruik our own product and reassure our customers we are on it too?

So, for me, I am not expecting a release by any date; rather, I am hoping that when it starts we will have a run of releases, especially if there are companies engaged under NDA in similar fields.

So that’s when the rockets will be taking off for me.
 
  • Like
  • Love
Reactions: 22 users

cosors

👀
It's a bit strange... Because the title is even claiming "World's first 'neuromorphic' A.I. chip"..

[Quoted in full above.]
Maybe a bit of national pride on the authors' part? I can't find any such statement on the KAIST homepage itself.
KAIST was founded by the state and is considered the second most innovative university in the Asia-Pacific region. And I have a colleague who explained to me something I see confirmed here in this comment on an article.

When Koreans look at the map, they see a small country between two superpowers. This leads to a latent and constant fear of being torn between Japan and China, both economically and culturally, which is justified by historical events. For Koreans, with their strong sense of national pride, it is dismaying to realise time and again that Korea is barely recognised outside Asia. Yet they have every reason to be proud of the so-called "miracle on the Han River", because today car and shipbuilding, semiconductor manufacturing, digital electronics, steel and petrochemicals are cited as key industries, and every Korean knows where they currently rank internationally.

[As far as I know, KAIST was the first in the world to develop filament LEDs, which went on to conquer the world much later. I bought one from Asia back then and it took another 2 or 3 years before they were mass-produced and sold here.]
 
Last edited:
  • Like
  • Love
Reactions: 11 users

Jumpchooks

Regular
Super, that's @Esq.111. I had an old version, so thanks for the update and feather pen :)
It's a quill; believe me, I used to be an inkwell monitor when I was at school. 😎
 
  • Like
  • Haha
Reactions: 5 users