BRN Discussion Ongoing

Hi DB,

Am I wrong in thinking these "reference chips" were produced during Peter's time as stand-in CEO and were only later "kicked to the curb" by Antonio & Sean when they, uhm, course-corrected?
Maybe there are more of these around than we think (well... guess at... you know).
Hey FJ, I don't think they were ever "kicked to the kerb" when the Company changed to, or affirmed, a strategy of selling IP.

If there were buyers for the volume of chips that we have, I think they would have just sold them.

Pretty sure it was stated at the time as being the start of a mass production run, so there should have been a few tens of thousands produced (although we don't know, and will never know, the yield percentage; with Anil on the job, I suspect it would be pretty high).

With the speed of change in technology, I think it was a wise move to follow the IP route, and it's even better now with a multi-pronged strategy.

Some keep banging on (yes, I'm talking about you, Iseki 😛) that we need to produce an AKD2000 reference chip, but there is no need, as the tech can be played with in simulation.

Before the AKD1000 ES and reference chips, anybody looking at the simulations would have thought, "Yeah, looks incredible, but can you really do that on a chip?"

We don't have that problem anymore, as the core technology has been proven in two processes (AKD1500).
 
  • Like
  • Love
  • Fire
Reactions: 29 users

IloveLamp

Top 20
Reactions: 25 users
Good to see some game developers starting to recognise the future re neuromorphic.

Blog excerpt below. Not us specifically, but the tech, which we know we are at the forefront of.


how ai models learn and train themselves​

March 27, 2024 // Alex Stone

In our last blog on AI, we covered different types of AI models. But how do these digital brains actually learn and train themselves? In this blog, we’re going to take a closer look into the processes behind AI learning and training, as well as cover some future considerations and caveats when it comes to using AI.

Neural Networks and Pre-Trained Models

All modern AI models rely on Neural Networks, mirroring the structure of the human brain. These networks consist of simulated neurons connected in various configurations, each with assigned “weights” dictating their influence. These weights, crucial for learning, are determined through rigorous training against labeled datasets. Models like ChatGPT and DALL-E are pre-trained, with companies investing heavily in refining these weights and datasets, making them the linchpin of AI value.
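To make the "weights set by training" idea concrete, here is a minimal sketch: a single sigmoid neuron learning logical OR from a labeled dataset by gradient descent. The dataset, learning rate, and epoch count are arbitrary choices for illustration, not anything from the blog.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Labeled dataset: inputs and target outputs (logical OR)
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

# Weights start near zero; training nudges them toward useful values
w = [0.1, -0.1]
b = 0.0
lr = 0.5

for _ in range(2000):
    for x, t in data:
        pred = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
        err = pred - t          # cross-entropy gradient w.r.t. the pre-activation
        w[0] -= lr * err * x[0]  # adjust each weight against its input
        w[1] -= lr * err * x[1]
        b -= lr * err

for x, t in data:
    print(x, round(sigmoid(w[0] * x[0] + w[1] * x[1] + b)))
```

After training, the rounded outputs match the labels; the learned weights, not the network wiring, are what carry the value, which is the blog's point.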

This also means that the economic value, the trade secret or the recipe of AI, if you will, lies in the weights and the training datasets. The network of connections is not nearly as valuable from a business standpoint. This is why Meta and other companies have started sharing untrained models publicly. That doesn't mean there aren't plenty of open-source weights as well, but they are usually of inferior quality to what big companies with lots of investment in training time have. This is also why publications and authors are upset that their articles were scraped into a training dataset without compensation.

When a trained model generates a response from a stimulus, it’s called inference, akin to human thought processes. While inference is generally swift, Generative Models face computational hurdles due to their expansive output layers, driving the race for AI-capable chips.

There are two kinds of AI-helpful chips. The kind everyone has been using to accelerate training and inference is the GPU, which happens to be the same exact chip in your graphics card in your PC, PS5, or Xbox. If you think that's strange, you are not alone. While GPUs coincidentally accelerate certain AI tasks, the future is in Neuromorphic Processors, which implement neurons in silicon. This means pre-trained AI models will be able to run inference entirely in hardware, but training will likely still need to rely on classical techniques for longer, so don't dump your NVIDIA stock just yet!
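For a feel of the "neurons in silicon" idea, here is a minimal software sketch of a leaky integrate-and-fire neuron, the basic event-driven unit that neuromorphic processors typically implement in hardware. The threshold, leak factor, and input values are arbitrary illustrative numbers.

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch.
def lif_run(inputs, threshold=1.0, leak=0.9):
    """inputs: input current per timestep; returns the timesteps that spiked."""
    v = 0.0                      # membrane potential
    spikes = []
    for t, current in enumerate(inputs):
        v = v * leak + current   # potential decays, then integrates the input
        if v >= threshold:       # fire only when the threshold is crossed
            spikes.append(t)
            v = 0.0              # reset after a spike
    return spikes

print(lif_run([0.3, 0.3, 0.6, 0.0, 0.0, 1.2]))  # spikes at timesteps 2 and 5
```

Nothing happens between spikes, which is why event-based hardware can sit near idle while a clocked GPU keeps burning power every cycle.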

about filament games​

we create playful experiences that improve people’s lives​


Founded in 2005, Filament Games is a full-service digital studio that specializes in learning game development on a for-hire basis. We’ve completed over 400 projects since our founding and have worked with some of the biggest names in education – folks like Amazon, Scholastic, Smithsonian, Oculus, National Geographic, PBS, Television Ontario (TVO), McGraw-Hill, and even the US Department of Education.​

 
Reactions: 23 users
View attachment 59896




Nvidia’s H100 AI #GPUs are taking the tech world by storm, but their reign comes at the price of a hefty energy bill.

According to a report from #CBInsights and #Stocklytics.com, these power-hungry processors are projected to consume a staggering 13,797 GWh in 2024, exceeding the annual energy consumption of nations like #Georgia and #CostaRica.

Imagine this: a legacy data center consumes 10 kW/rack, while #CyrusOne, the #KKR-owned leading global data center operator and developer specialising in #AI applications, runs 300 kW/rack!

But why do #GPUs consume so much power?

Data center #GPUs consume a substantial amount of power primarily due to their high computational requirements and the complex algorithms they handle. These #GPUs optimize parallel processing tasks like #machinelearning and #dataanalytics, involving simultaneous processing of vast amounts of data.

While #parallel processing speeds up data processing, one drawback is that most parts of the chip are active at the same time. This constant computation, coupled with the execution of complex algorithms, demands significant computational power, thereby increasing energy consumption.

The large-scale deployment of #GPUs in data centers, where racks and clusters utilize hundreds or thousands of #GPUs, further amplifies their collective power consumption. This combination of factors underscores the considerable energy consumption associated with data center GPUs.

Successfully navigating these challenges and fostering innovation will shape the future landscape of #AI computing.

So, what options do we have?

● #Amazon, frenemy to Nvidia, recently unveiled its Arm-based Graviton4 and Trainium2 chips, which hold promise for efficiency gains.

● In the near to medium term, #Neuromorphic computing is being researched aggressively as an alternative to synchronous parallel computing architectures. Neuromorphic computing is an asynchronous computing paradigm that runs on event-based 'spikes' rather than a clock signal, drastically lowering power consumption.

● Big money is going into enabling tech like liquid cooling: #KKR acquired CoolIT Systems for $270 mn, and Bosch acquired Jetcool through its venture arm.

While CoolIT Systems becomes the supplier for CyrusOne, #KKR makes money on both!
 
Reactions: 4 users

CHIPS

Regular
Soon it'll come out that everyone here is from the BN forum and just pretending to be Australian, wait and see!
It is so strange that my posts are sometimes in German even though I post them in English. It is no secret that I am German, but who the heck translates my posts, and why? #Akida, was that you??? :unsure:
 
Reactions: 9 users

CHIPS

Regular
Soon it'll come out that everyone here is from the BN forum and just pretending to be Australian, wait and see!
Aside from the German, what is the BN forum?
 

Krustor

Regular
Aside from the German, what is the BN forum?
Börsenews Forum, also known as the German kindergarten forum, so never mind.
 
Reactions: 3 users

CHIPS

Regular
Börsenews Forum, also known as the German kindergarten forum, so never mind.

Thanks. Nope, that's not my place to play!
 
Reactions: 2 users

cosors

👀
🦘🦘🦘🦘 🦘



Volkswagen's RooBadge AI for safer roads in Australia

Volkswagen has developed a new technology called RooBadge to reduce kangaroo collisions in Australia.

This innovative device uses artificial intelligence to deter kangaroos from approaching vehicles.

Here's how it works:

RooBadge replaces the existing front badge on Volkswagen trucks.

It uses GPS and machine learning to identify the kangaroo species in the area.

Based on this information, it emits targeted sounds, including bird calls, predator sounds, and kangaroo foot thumps, which are natural warning signs for them.
The sound is focused ahead of the car, maximizing its effectiveness.
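The GPS-to-species-to-sound pipeline described above could be sketched roughly as below. Everything here is hypothetical: the region boundaries, species map, and sound names are invented for illustration, not taken from Volkswagen's actual system.

```python
# Hypothetical RooBadge-style lookup: GPS fix -> region -> species -> sounds.
SPECIES_BY_REGION = {
    "southeast": "eastern_grey",
    "west": "western_grey",
    "inland": "red_kangaroo",
}
SOUND_PROFILE = {
    "eastern_grey": ["bird_alarm_call", "foot_thump"],
    "western_grey": ["predator_sound", "foot_thump"],
    "red_kangaroo": ["foot_thump"],
}

def region_for(lat, lon):
    # Crude placeholder partition of Australia; the real system would use
    # proper geofencing plus a learned species-distribution model.
    if lon < 129:
        return "west"
    return "southeast" if lat < -30 else "inland"

def deterrent_sounds(lat, lon):
    species = SPECIES_BY_REGION[region_for(lat, lon)]
    return species, SOUND_PROFILE[species]

print(deterrent_sounds(-33.9, 151.2))  # a Sydney-ish fix
```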

Benefits:

Reduces the risk of kangaroo collisions, protecting both drivers and animals.
Provides valuable data through driver and wildlife agency reports, allowing for continuous improvement.

A universal version is in development, potentially deterring other animals like deer and being adaptable for global use.

RooBadge is a prime example of how AI can be used to create positive change and save lives.

What are your thoughts about this innovation?
I hope it is a GNSS device instead of just GPS, and that it uses the EU's Galileo with its HAS (High Accuracy Service), so it isn't 10 or 20 m off.

Since 1 January 2023, three frequency bands have been available to all users free of charge and unencrypted, which means that a worldwide accuracy of a few centimetres can be achieved.
 
Reactions: 2 users



TECH

Regular
Hey FJ, I don't think they were ever "kicked to the kerb" when the Company changed to, or affirmed, a strategy of selling IP.

If there were buyers for the volume of chips that we have, I think they would have just sold them.

Pretty sure it was stated at the time as being the start of a mass production run, so there should have been a few tens of thousands produced (although we don't know, and will never know, the yield percentage; with Anil on the job, I suspect it would be pretty high).

With the speed of change in technology, I think it was a wise move to follow the IP route, and it's even better now with a multi-pronged strategy.

Some keep banging on (yes, I'm talking about you, Iseki 😛) that we need to produce an AKD2000 reference chip, but there is no need, as the tech can be played with in simulation.

Before the AKD1000 ES and reference chips, anybody looking at the simulations would have thought, "Yeah, looks incredible, but can you really do that on a chip?"

We don't have that problem anymore, as the core technology has been proven in two processes (AKD1500).

Nice one, Dingo... including being able to be produced out of three different foundries, namely TSMC, Global and Intel, in different formats, not forgetting that more foundries are yet to be named, as per Sean Hehir.

Regards...Tech (y)
 
Reactions: 21 users



7für7

Regular
Aside from the German, what is the BN forum?
It was just a joke! 🙋🏻‍♂️
 
Reactions: 3 users

7für7

Regular
Sorry if already posted

 
Reactions: 22 users

7für7

Regular
It is so strange that my posts are sometimes in German even though I post them in English. It is no secret that I am German, but who the heck translates my posts, and why? #Akida, was that you??? :unsure:
😂 Sometimes the brain plays games! One time I started to write something here; I thought I was writing in English and realised it was German.
 
Reactions: 2 users


Rach2512

Regular
View attachment 60174 View attachment 60175 View attachment 60176

Morning IloveLamp



This is from Apr 2020, so could that mean that Socionext already has something to offer with Akida built in?

BrainChip and Socionext expect Akida silicon in Q3

BrainChip and Socionext expect engineering samples of their neural net processor Akida in Q3. They are being made on a TSMC MPW run.



Morning everyone, have a great day ❤
 
Reactions: 39 users

ndefries

Regular
Sorry if already posted


This is all getting out of hand. I thought I was investing in a meme stock; I am very concerned this could become a lot more. I am not going to renew my MF subscription!!!
 
Reactions: 48 users