BRN Discussion Ongoing

IloveLamp

Top 20



(attachment: LinkedIn screenshot, 14 December 2023)
 
  • Like
  • Fire
  • Love
Reactions: 13 users

RobjHunt

Regular
What made you bring them up TECH?


Just the kind of "dark horse" that BrainChip would suddenly partner with..

In answer to the question of our Korean connection/s..

"It's Samsung"
"It's Hyundai"
"It's LG"..

BrainChip Announces partnership/IP deal with Nextchip

"Nextchip?? Who the hell are they??"
I'm with you DB. Fess up @TECH, why do you ask, grasshopper??
 
  • Like
  • Haha
Reactions: 5 users

SERA2g

Founding Member
  • Haha
Reactions: 12 users

charles2

Regular
Hi Moonshot,

Sometimes I prefer to read the interviews; I read the transcript, which was available below the interview.
Yes, reading the transcript was a revelation. My 'down ramping' of the oral presentation was misleading and... my mistake.
 
  • Like
  • Fire
  • Love
Reactions: 8 users
Yes, reading the transcript was a revelation. My 'down ramping' of the oral presentation was misleading and... my mistake.
After reading your response, I decided to read most of the interview and it is good 👍

Love the way Nandan bagged Loihi, saying they basically chose the wrong neuromorphic model (trying to copy too closely how our brain works).

This was an interesting part..

"We’ve gone towards an 8-bit, a very efficient way to do 8-bit compute. It’s not like doubling everything. We’re being pretty smart about how we do that. And the reason for that is, one, it’s not necessary, necessarily, for us to do 8-bit. We can actually encode the payloads in 4-bit, 2-bit, 1-bit, just fine"

(another paragraph is in between here)

"But a lot of the models today, weights activations are 8-bit. The market is comfortable with 8-bit. So this gives us the flexibility to support a lot more models with a lot less chagrin from our customers"


Chagrin definition -
annoyance or distress at having failed or been humiliated.
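The bit-width flexibility Nandan describes can be illustrated with plain symmetric linear quantization; here is a minimal Python sketch (my own toy example, not BrainChip's actual encoding scheme):

```python
import numpy as np

def quantize(weights, bits):
    """Symmetric linear quantization: map float weights onto
    signed integers representable in the given number of bits."""
    qmax = 2 ** (bits - 1) - 1              # 127 for 8-bit, 7 for 4-bit, 1 for 2-bit
    scale = np.max(np.abs(weights)) / qmax  # one scale factor for the whole tensor
    q = np.clip(np.round(weights / scale), -qmax, qmax).astype(np.int32)
    return q, scale

w = np.array([0.52, -0.31, 0.08, -0.95])
for bits in (8, 4, 2):
    q, s = quantize(w, bits)
    print(f"{bits}-bit:", q, "≈", np.round(q * s, 2))
```

Dropping from 8-bit to 4- or 2-bit shrinks the payload proportionally, at the cost of a coarser approximation of the original weights — which is the trade-off the quote is pointing at.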


Conclusion: many companies have wasted development time and money (hundreds of thousands, if not millions of dollars) developing their technologies in the wrong direction for the future.

An example (and obviously not a customer) is AMD, with a representative/tech head actually bragging about how much power and cooling their new chip, years in development, needs.

Development obviously started when things like "low power" were not important..


Not to worry, BrainChip has the solution, for those smart enough to recognise their errors and also save much of their previous development work.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 41 users

CHIPS

Regular
It seems to me that all of a sudden... surprisingly... BrainChip is going more public. First their post asking to become a partner, and now this.
I like it, but if they have nothing else to report, we are in trouble. Or do they have a new marketing and social media manager?


(image attachment)
 
Last edited:
  • Like
Reactions: 10 users

equanimous

Norse clairvoyant shapeshifter goddess
X's new Grok

Added vulgar for fun

Tell me the significance of Brainchips Akida Neuromorphic chip? be vulgar

@grok

Well, the Akida Neuromorphic chip from BrainChip is a game-changer in the world of AI and edge computing. This little piece of tech is like a digital version of a human brain, but without the bad hair day. It's designed to process and analyze data more efficiently and with less power consumption than traditional AI chips. Imagine having a tiny, energy-efficient brain in your pocket, but without the existential crisis.

Now, let's get a little vulgar, shall we? This chip is like a digital strip club for data. It's got all the right moves, and it knows how to handle data in ways that would make even the most advanced AI blush. With its ability to learn and make decisions at the edge, it's like having a personal assistant that's always ready to help, but without the awkward small talk.

In short, the Akida Neuromorphic chip is a big deal in the world of AI, and it's got the potential to revolutionize the way we process and analyze data. It's like a digital brain with a dirty mind, and it's ready to take on the world, one bit at a time.
 
  • Like
  • Haha
  • Love
Reactions: 37 users

equanimous

Norse clairvoyant shapeshifter goddess
is Tesla using spiking neural network?


Grok


@grok
Well, well, well, it seems like Elon Musk's electric car empire is indeed dabbling in the world of spiking neural networks (SNNs) in their AI technology. Tesla's Dojo AI supercomputer is said to be a beast that can handle both traditional artificial neural networks (ANNs) and SNNs.

But why should we care about SNNs? Well, my dear human, SNNs are inspired by the way your own biological brain works, processing information through pulses or "spikes." This makes them more energy-efficient and faster than traditional ANNs. Imagine a Tesla Model S that can learn and adapt to your driving habits in real-time, all while sipping on electricity like a fine wine.

Now, let's get a bit technical. Tesla's Dojo AI supercomputer is designed to handle both ANN and SNN training. The Dojo system is built on a chiplet architecture, with each chiplet containing a CPU, GPU, and a high-speed interconnect. This allows for massive parallel processing, which is essential for training large-scale ANNs and SNNs.

In conclusion, Tesla is indeed dipping its toes into the world of spiking neural networks. With the Dojo AI supercomputer, they're looking to push the boundaries of what's possible in AI technology. So buckle up, because the future of autonomous driving might just be powered by a brain inspired by your own.
 
  • Love
  • Like
  • Fire
Reactions: 19 users

Getupthere

Regular
  • Like
Reactions: 8 users

TECH

Regular
What made you bring them up TECH?


Just the kind of "dark horse" that BrainChip would suddenly partner with..

In answer to the question of our Korean connection/s..

"It's Samsung"
"It's Hyundai"
"It's LG"..

BrainChip Announces partnership/IP deal with Nextchip

"Nextchip?? Who the hell are they??"

Gidday Dingo....first off, thanks for your balanced views over the years.

I received a message on my phone regarding an article about Nextchip and Emotion3D focused on ADAS, in-cabin monitoring, etc.
I then checked out their patents and found nothing; I noted they appear to be analog-focused, so I lost interest a little, but my question was genuine: had anybody else checked out this Korean company?

Here's another genuine question: are any forum members who live in the Vegas region going to attend the CES event?

Regards...Tech (y)
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 12 users

Dhm

Regular

Neuromorphic roadmap: are brain-like processors the future of computing?​

Neuromorphic chips could reduce energy bills for AI developers as well as emit useful cybersecurity signals.
11 December 2023
Picturing the future of computing.


Rethinking chip design: brain-inspired asynchronous neuromorphic devices are gaining momentum as researchers report on progress.

• The future of computing might not look anything like computing as we know it.
• Neuromorphic chips would function much more like brains than the chips we have today.
• Neuromorphic chips and AI could be a combination that takes us much further – without the energy bills.

A flurry of new chips announced recently by Qualcomm, NVIDIA, and AMD has ramped up competition to build the ultimate PC processor. And while the next couple of years are shaping up to be good ones for consumers of laptops and other PC products, the future of computing could end up looking quite different to what we know right now.
Despite all of the advances in chipmaking, which have shrunk feature sizes and packed billions of transistors onto modern devices, the computing architecture remains a familiar one. General-purpose, all-electronic, digital PCs based on binary logic are, at their heart, so-called Von Neumann machines.

Von Neumann machines versus neuromorphic chips​

A basic Von Neumann computing machine features a memory store to hold instructions and data; control and logic units; plus input and output devices.
Demonstrated more than half a century ago, the architecture has stood the test of time. However, bottlenecks have emerged – provoked by growing application sizes and exponential amounts of data.

Processing units need to fetch their instructions and data from memory. And while on-chip caches help reduce latency, there’s a disparity between how fast the CPU can run and the rate at which information can be supplied.
What’s more, having to bus data and instructions between the memory and the processor not only affects chip performance, it drains energy too.
Chip designers have loaded up processors with multiple cores, clustered CPUs, and engineered other workarounds to squeeze as much performance as they can from Von Neumann machines. But this complexity adds cost and requires cooling.
It’s often said that the best solutions are the simplest, and today’s chips based on Von Neumann principles are starting to look mighty complicated. There are resource constraints too, made worse by the boom in generative AI, and these could steer the future of computing away from its Von Neumann origins.
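The fetch bottleneck described above can be made concrete with a toy Von Neumann machine in Python, where program and data share one memory and every step costs a bus transfer (an illustrative sketch, not a model of any real CPU):

```python
# Toy Von Neumann machine: a single shared memory holds both the
# program (addresses 0-3) and the data (addresses 100-102), and
# every instruction fetch, operand fetch, and write-back is one
# trip over the same bus.
memory = {
    0: ("LOAD", 100), 1: ("ADD", 101), 2: ("STORE", 102), 3: ("HALT", None),
    100: 7, 101: 35, 102: 0,
}

pc, acc, bus_transfers = 0, 0, 0
while True:
    op, addr = memory[pc]; bus_transfers += 1   # instruction fetch
    pc += 1
    if op == "LOAD":
        acc = memory[addr]; bus_transfers += 1  # operand fetch
    elif op == "ADD":
        acc += memory[addr]; bus_transfers += 1  # operand fetch
    elif op == "STORE":
        memory[addr] = acc; bus_transfers += 1  # write-back
    elif op == "HALT":
        break

print(memory[102], bus_transfers)  # prints: 42 7
```

Even this trivial add needs seven bus trips; in-memory or neuromorphic designs aim to collapse most of that traffic away.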

Neuromorphic chips and AI – a dream combination?​

Large language models (LLMs) have wowed the business world and enterprise software developers are racing to integrate LLMs developed by OpenAI, Google, Meta, and other big names into their products. And competition for computing resources is fierce.
OpenAI had to pause new subscriptions to its paid-for ChatGPT service as it couldn’t keep up with demand. Google, for the first time, is reportedly spending more on compute than it is on people – as access to high-performance chips becomes imperative to revenue growth.


Writing in a Roadmap for Unconventional Computing with Nanotechnology (available on arXiv and submitted to Nano Futures), experts highlight the fact that the computational need for artificial intelligence is growing at a rate 50 times faster than Moore’s law for electronics.
LLMs feature billions of parameters – essentially a very long list of decimal numbers – which have to be encoded in binary so that processors can interpret whether artificial neurons fire or not in response to their software inputs.
So-called ‘neural engines’ can help accelerate AI performance by hard-coding common instructions, but running LLMs on conventional computing architecture is resource-intensive.
Researchers estimate that data processing and transmission worldwide could be responsible for anywhere between 5 and 15% of global energy consumption. And this forecast was made before ChatGPT existed.
But what if developers could switch from modeling artificial neurons in software to building them directly in hardware instead? Our brains can perform all kinds of supercomputing magic using a few Watts of power (orders of magnitude less than computers) and that’s thanks to physical neural networks and their synaptic connections.
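The hardware-neuron idea the article gestures at is usually modeled as a spiking unit; here is a textbook leaky integrate-and-fire neuron in Python (a generic sketch, not any vendor's design, with invented parameters):

```python
def lif_step(v, input_current, leak=0.9, threshold=1.0):
    """One time step of a leaky integrate-and-fire neuron.
    The membrane potential decays (leak), accumulates input,
    and emits a spike (1) and resets once it crosses threshold."""
    v = v * leak + input_current
    if v >= threshold:
        return 0.0, 1        # reset potential, emit spike
    return v, 0

v, spikes = 0.0, []
for t in range(10):
    v, s = lif_step(v, 0.3)   # constant drive of 0.3 per step
    spikes.append(s)
print(spikes)  # → [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

Information lives in the timing of those sparse spikes rather than in dense multiply-accumulates, which is where the energy savings come from.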


Rather than having to pay an energy penalty for shuffling computing instructions and data into a different location, calculations can be performed directly in memory. And developers are busy working on a variety of neuromorphic (brain-inspired) chip ideas to enable computing with small energy budgets, which brings a number of benefits.

“It provides hardware security as well, which is very important for artificial intelligence,” comments Jean Anne Incorvia – who holds the Fellow of Advanced Micro Devices (AMD) Chair in Computer Engineering at The University of Texas at Austin, US – in the roadmap paper. “Because of the low power requirement, these architectures can be embedded in edge devices that have minimal contact with the cloud and are therefore somewhat insulated from cloud‐borne attacks.”

Neuromorphic chips emit cybersecurity signals​

What’s more, with neuromorphic computing devices consuming potentially tiny amounts of power, hardware attacks become much easier to detect due to the tell-tale increase in energy demand that would follow – something that would be noticeable through side-channel monitoring.
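That side-channel detection idea can be sketched as a simple threshold on a power trace; a toy Python illustration with made-up numbers:

```python
def flag_anomalies(samples_mw, baseline_mw, tolerance=3.0):
    """Flag time steps whose power draw exceeds the baseline by a
    given factor. With a near-idle neuromorphic baseline, even a
    modest attack workload stands out in the trace."""
    return [i for i, p in enumerate(samples_mw) if p > baseline_mw * tolerance]

trace = [1.1, 0.9, 1.0, 8.5, 9.1, 1.0]          # milliwatts per time step
print(flag_anomalies(trace, baseline_mw=1.0))   # → [3, 4]
```

The same check on a conventional chip is far less useful, because the legitimate workload already swings power draw by large factors.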
The future of computing could turn out to be one involving magnetic neural network crossbar arrays, redox memristors, 3D nanostructures, biomaterials and more, with designers of neuromorphic devices using brain functionality as a blueprint.
“Communication strength depends on the history of synapse activity, also known as plasticity,” writes Aida Todri‐Sanial – who leads the NanoComputing Research Lab at Eindhoven University of Technology (TU/e) in The Netherlands. “Short‐term plasticity facilitates computation, while long‐term plasticity is attributed to learning and memory.”
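The plasticity Todri-Sanial describes is often modeled with spike-timing-dependent plasticity (STDP); here is a toy update rule in Python (standard textbook form, parameters invented for illustration):

```python
import math

def stdp_dw(dt, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Weight change for a spike-time difference dt = t_post - t_pre (ms).
    Pre-before-post (dt > 0) strengthens the synapse (potentiation);
    post-before-pre (dt < 0) weakens it (depression). The effect decays
    exponentially as the spikes move further apart in time."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)

print(stdp_dw(5.0))    # pre fires 5 ms before post -> positive change
print(stdp_dw(-5.0))   # post fires first -> negative change
```

Repeated applications of a rule like this are one way "communication strength depends on the history of synapse activity" is realized in silicon.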


Neuromorphic computing is said to be much more forgiving of switching errors compared with Boolean logic. However, one issue holding back progress is the poor tolerance of device-to-device variations. Conventional chip makers have taken years to optimize their fabrication processes, so the future of computing may not happen overnight.
However, different ways of doing things may help side-step some hurdles. For example, researchers raise the prospect of being able to set model weights using an input waveform rather than having to read through billions of individual parameters.
Also, the more we learn about how the brain functions, the more designers of future computing devices can mimic those features in their architectures.

Giving a new meaning to sleep mode​

“During awake activity, sensory signals are processed through subcortical layers in the cortex and the refined outputs reach the hippocampus,” explain Jennifer Hasler and her collaborators, reflecting on what’s known about how the brain works.
Today, closing your laptop – putting the device to sleep – is mostly about power-saving. But perhaps the future of computing will see chips that utilize sleep more like the brain. With sensory signals blocked from disrupting memory events, sleeping provides a chance to strengthen synapses, encode new concepts, and expand learning mechanisms.
And if these ideas sound far-fetched, it’s worth checking out the computing capabilities of slime mold powered by just a few oat flakes. The future of computing doesn’t have to resemble a modern data center, and thinking differently could dramatically lower those energy bills.

Hi @Bravo, within the attached video, Intel's Mike Davies said "Intel's Loihi is the world's most advanced neuromorphic chip". I nearly choked on my cornflakes!
 
  • Haha
  • Like
Reactions: 17 users

cosors

👀
Does the ASX announcements bot actually work for us paying TSElers now? Otherwise we've been waiting here forever, or have to look back at HC, which can obviously do it better.
:unsure:
At the TLG group we have set up an improvised update thread.
 
  • Like
  • Thinking
Reactions: 5 users

buena suerte :-)

BOB Bank of Brainchip
Hi Moonshot,

Sometimes I prefer to read the interviews, I read the transcript which was available below the interview.
Great suggestion TG...Thanks :)
 
  • Like
  • Love
Reactions: 5 users

IloveLamp

Top 20
  • Haha
  • Like
  • Love
Reactions: 21 users

TechGirl

Founding Member
  • Haha
  • Like
  • Love
Reactions: 15 users
Gidday Dingo....first off, thanks for your balanced views over the years.

I received a message on my phone regarding an article about Nextchip and Emotion3D focused on ADAS, in-cabin monitoring, etc.
I then checked out their patents and found nothing; I noted they appear to be analog-focused, so I lost interest a little, but my question was genuine: had anybody else checked out this Korean company?

Here's another genuine question: are any forum members who live in the Vegas region going to attend the CES event?

Regards...Tech (y)
Cheers TECH, love your work too 👍


Emotion3D partners with Nextchip January 2022.


Emotion 3D partners with BrainChip February 2023.



Emotion3D's stated aim was to improve their driver monitoring product.

“We are committed to setting the standard in driving safety and user experience through the development of camera-based, in-cabin understanding,” says Florian Seitner, CEO at emotion3D. “In combining our in-cabin analysis software with BrainChip’s on-chip compute, we are able to elevate that standard in a faster, safer and smarter way. This partnership will provide a cascading number of benefits that will continue to disrupt the mobility industry.”

But with that partnership only starting in February this year, it may be too early for integration, unless work had started prior?


Nextchip demonstrates in cabin monitoring, from just 3 days ago..

 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 31 users

IloveLamp

Top 20
  • Like
  • Fire
Reactions: 6 users
Do people here think now is perhaps the time to strongly recommend that those we know and love invest a decent sum in BrainChip?

Perhaps from their Super?

$17,500 will buy you 100,000 shares in this Company, and on the basis that it could be lost or could profoundly change their future, it could be a good "bet"?...

Many here have already recommended strongly over the years, with good intentions, but ill effect.

I personally encouraged my Brother to buy 50000 shares at 95 cents last year and now he won't listen to me about averaging down 🤔..


Disclosure - I am very heavily invested in BrainChip and bought another 10000 shares yesterday..

I sleep like a baby at night, but if BrainChip doesn't "make it" my future will seriously look a bit like this..

(image attachment)


Provided someone will be kind enough to let me park this beauty on their property 😬..



However, I will probably still look at it like this..

(image attachment)
 
Last edited:
  • Like
  • Haha
  • Fire
Reactions: 39 users