BRN Discussion Ongoing

Good morning,

Our CFO, Ken Scarince, seems to be very active recently; he must be kicking some big goals. Correct me if I'm slightly off here,
but hasn't he been awarded/converted around 5 million shares recently, moving into the top 20 for the first time that I'm aware of?

What does he know that I don't? :unsure: ...sounds like he's in the prime spot, 1 out, 1 back on the track. :ROFLMAO::ROFLMAO:

The quarter is coming to an end. To be totally honest, I'm not expecting anything; the pattern has shown no signs of changing just yet, sadly.

Tech.
Good morning,
Would you be able to explain to me how these current BRN partner agreements work? Are they in fact 100% required to announce that their products have BRN inside prior to any sale?
Is it a fact that they MUST announce us as part of their products, or does the existing agreement protect them from announcing anything to do with BRN in their products?
I am a little confused as to what the agreements with Renesas, MegaChips, Prophesee, ARM, Intel, etc. currently state.
Is it wrong of me to think they are protected and NOT required to disclose anything regarding BRN in their products moving forward, so it will only be seen in the financials?
Thanks in advance.
 
Last edited:
  • Like
Reactions: 4 users

7für7

Regular
Did I just witness a BOOM?!
 
  • Like
Reactions: 3 users

Guzzi62

Regular
Good morning,

Our CFO, Ken Scarince, seems to be very active recently; he must be kicking some big goals. Correct me if I'm slightly off here,
but hasn't he been awarded/converted around 5 million shares recently, moving into the top 20 for the first time that I'm aware of?

What does he know that I don't? :unsure: ...sounds like he's in the prime spot, 1 out, 1 back on the track. :ROFLMAO::ROFLMAO:

The quarter is coming to an end. To be totally honest, I'm not expecting anything; the pattern has shown no signs of changing just yet, sadly.

Tech.
5 mill shares for a financial officer?? Really!
That's crazy; the coffers are empty and they give him that amount. It doesn't make any sense to me at all.
 
  • Like
Reactions: 7 users

toasty

Regular
5 mill shares for a financial officer?? Really!
That's crazy; the coffers are empty and they give him that amount. It doesn't make any sense to me at all.
Possibly time-related...... a bonus for staying around
 
  • Sad
Reactions: 1 user

Guzzi62

Regular
Possibly time-related...... a bonus for staying around
A finance officer is not that high up a position; he has nothing to do with the development of the company and basically just counts money.

I am looking forward to an IP deal as promised by the two very top people; it's about time!
 
  • Like
  • Fire
Reactions: 5 users

jtardif999

Regular
NVIDIA have sold the sledgehammer approach to AI very well (I call it AI by GRUNT) and have been rewarded fabulously for it, perhaps a hangover from crypto mining. And while excessive energy consumption is still not an issue for data centres and many AI systems, which simply add extra green (and perceived to be free) energy generators (wind, solar, etc.), I doubt NVIDIA will need to consider energy efficiency. They sell the perception of AI by GRUNT and want everyone to think that's the only way to do it.

I call it a perception of AI, because throwing immense resources at a problem in order to do something clever, very fast is not my idea of AI.

The world doesn’t yet comprehend what intelligence is, let alone artificial intelligence, nor AI at the edge. Power consumption at the edge is critical—and this is where Akida will rule.

I view intelligence as reasoning to an acceptably correct result given completely new information, and even with information missing.

The closest thing I've ever seen to true AI is the Tiger demonstration by BrainChip: being trained with a single toy tiger, oriented in one direction, maybe two, and being able to reason that a picture of a real tiger is also a tiger. Now that IS AI and should be shouted to the world as loudly as possible.

One-shot, and even two-shot learning is critical to getting people to accept the true worth of Akida.

AI by Grunt systems need to be trained with thousands of different pictures and can then only choose a solution if presented with an image that is contained within that training set, or one so close that it is pretty much indistinguishable. That is simply a database lookup and has zero intelligence. It can only use logic that is coded into it. It cannot deviate from that code!

Meta AI and ChatGPT have astounded me with the relevance of their answers. But still, not AI IMHO.
One of the best posts in a long time. Intelligence is much misunderstood, and for that reason I don't understand why BRN are not promoting the one-shot side of Akida more. One of the things that stood out to me in the Accenture patents, particularly the first patent's description, was how much it espoused the advantages of learning in real time, with real-world use cases used to explain the differences between that capability and the current state of the art.
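To make the contrast with the thousands-of-images approach concrete, here is a minimal sketch of one way one-shot classification can be done in software: store a single embedding (prototype) per class and label new inputs by the nearest prototype. It is only an illustration of the idea, not how Akida implements on-chip learning, and embed() below is a placeholder feature extractor.

# Minimal one-shot classification sketch: one stored prototype per class,
# nearest-prototype matching by cosine similarity. Illustrative only; not
# Akida's on-chip mechanism. embed() is a stand-in feature extractor.
import numpy as np

def embed(image):
    # Placeholder: in practice a pretrained feature extractor would go here.
    return image.astype(np.float32).ravel() / 255.0

class OneShotClassifier:
    def __init__(self):
        self.prototypes = {}  # label -> embedding vector

    def learn(self, label, image):
        # "One shot": a single example is enough to add a new class.
        self.prototypes[label] = embed(image)

    def predict(self, image):
        query = embed(image)
        def similarity(proto):
            return float(np.dot(query, proto) /
                         (np.linalg.norm(query) * np.linalg.norm(proto) + 1e-9))
        # Return the label whose stored prototype is closest to the query.
        return max(self.prototypes, key=lambda lbl: similarity(self.prototypes[lbl]))

# One toy-tiger photo defines the "tiger" class; new images are matched to
# whichever stored prototype they most resemble.
clf = OneShotClassifier()
clf.learn("tiger", np.random.randint(0, 256, (64, 64)))
clf.learn("zebra", np.random.randint(0, 256, (64, 64)))
print(clf.predict(np.random.randint(0, 256, (64, 64))))

An AI-by-GRUNT pipeline would instead retrain or fine-tune an entire network on thousands of labelled images before it could recognise a new class at all.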
 
  • Like
  • Love
Reactions: 17 users

TECH

Regular
Good morning,
Would you be able to explain to me how these current BRN partner agreements work? Are they in fact 100% required to announce that their products have BRN inside prior to any sale?
Is it a fact that they MUST announce us as part of their products, or does the existing agreement protect them from announcing anything to do with BRN in their products?
I am a little confused as to what the agreements with Renesas, MegaChips, Prophesee, ARM, Intel, etc. currently state.
Is it wrong of me to think they are protected and NOT required to disclose anything regarding BRN in their products moving forward, so it will only be seen in the financials?
Thanks in advance.

Hi....first off I am not privy to any confidential agreements that have been made between Brainchip and partners.

MegaChips and Renesas both signed a licence agreement and paid the agreed sum to Brainchip, which has been sighted in recent
4C documents; as to whether both of these agreements were the same or similar, I wouldn't know.

It is well known that any company we deal with can choose whether or not to mention us within their own product; as far as I'm
aware, we have absolutely no control over that at all.

We are an IP company, as is ARM. Around 80% of all mobile smartphones have ARM technology embedded within them, BUT would
the public be aware of that? Just ask a group of your friends who ARM are; the odds are they have absolutely NO idea.

As frustrating as this current period is for shareholders and the company alike, we just have to wait and wait and.......... until the
current pattern changes. Progress is being made; just because we can't physically see it yet doesn't mean that it's not bubbling away
under the surface.

Ask yourself this question: why are investors buying BRN shares each and every day? Some, like me, can see the huge potential.

Kind regards...Tech. (y)
 
  • Like
  • Love
  • Fire
Reactions: 28 users

Bravo

If ARM was an arm, BRN would be its biceps💪!



Socionext joins GSA to drive connected intelligent systems​

Business news | June 17, 2024
By Jean-Pierre Joosting



Socionext Inc., a leading provider of advanced SoCs for a wide range of applications, has announced that it has become a member of the Global Semiconductor Alliance (GSA), the voice of the global semiconductor industry.

“Socionextʼs membership in the GSA marks a significant milestone in our journey towards creating more connected and intelligent systems,” said Masahiro Koezuka, President and CEO of Socionext Inc.


As a new member of the GSA, Socionext stands alongside over 300 corporate members across six continents, representing 75% of the $575B+ semiconductor industry.

GSA is renowned for its role in connecting the semiconductor industry and promoting a profitable and sustainable global ecosystem. Socionextʼs alignment with the GSAʼs vision will enhance the companyʼs ability to navigate the complexities of the market and leverage emerging opportunities in semiconductor design, manufacturing, and sales.

“Socionext brings a wealth of industry knowledge to the Alliance, and we look forward to their contributions to our industry events, interest groups and resources,” said Jodi Shelton, co-founder and CEO of GSA. “As the GSA gains new members, we are ultimately supporting and enhancing the global semiconductor ecosystem.”

As a member of the GSA, Socionext will benefit from the unique neutral platform provided for collaboration, where global executives may interface and innovate with peers, partners and customers to accelerate industry growth and maximize return on invested and intellectual capital.







 
  • Like
  • Fire
  • Love
Reactions: 27 users

Cirat

Regular
A finance officer is not that high up a position; he has nothing to do with the development of the company and basically just counts money.

I am looking forward to an IP deal as promised by the two very top people; it's about time!
Ken is the Chief Financial Officer, which in any reasonably sized company is a senior executive role reporting directly to the CEO, and an extremely important role for the success of the company!!!
 
  • Like
  • Love
  • Fire
Reactions: 23 users
Hi....first off I am not privy to any confidential agreements that have been made between Brainchip and partners.

MegaChips and Renesas both signed a licence agreement and paid the agreed sum to Brainchip, which has been sighted in recent
4C documents; as to whether both of these agreements were the same or similar, I wouldn't know.

It is well known that any company we deal with can choose whether or not to mention us within their own product; as far as I'm
aware, we have absolutely no control over that at all.

We are an IP company, as is ARM. Around 80% of all mobile smartphones have ARM technology embedded within them, BUT would
the public be aware of that? Just ask a group of your friends who ARM are; the odds are they have absolutely NO idea.

As frustrating as this current period is for shareholders and the company alike, we just have to wait and wait and.......... until the
current pattern changes. Progress is being made; just because we can't physically see it yet doesn't mean that it's not bubbling away
under the surface.

Ask yourself this question: why are investors buying BRN shares each and every day? Some can see the huge potential
 
Last edited:
Cheers Tech; it also puts in its place the contention, which some continue to push as an issue, that we don't have any IP deals.
Lots going on, as you say, that we are not privy to, and it's not all about IP deals.
Go Brainchip
 
  • Like
Reactions: 3 users

Bravo

If ARM was an arm, BRN would be its biceps💪!

Could brain-like computers be a 'competition killer'?​

4 hours ago
By Zoe Corbyn, Technology Reporter
[Image: An employee checks a server room in Samsung Networks' Telco Data Center (Getty Images). Caption: Electricity demand from data centres is rising fast.]
Modern computing's appetite for electricity is increasing at an alarming rate.
By 2026 consumption by data centres, artificial intelligence (AI) and cryptocurrency could be as much as double 2022 levels, according to a recent report from the International Energy Agency (IEA).
It estimates that in 2026 energy consumption by those three sectors could be roughly equivalent to Japan's annual energy needs.
Companies like Nvidia - whose computer chips underpin most AI applications today - are working on developing more energy efficient hardware.
But could an alternative path be to build computers with a fundamentally different type of architecture, one that is more energy efficient?
Some firms certainly think so, and are drawing on the structure and function of an organ which uses a fraction of the power of a conventional computer to perform more operations faster: the brain.
In neuromorphic computing, electronic devices imitate neurons and synapses, and are interconnected in a way that resembles the electrical network of the brain.
It isn't new - researchers have been working on the technique since the 1980s.
But the energy requirements of the AI revolution are increasing the pressure to get the nascent technology into the real world.
Current systems and platforms exist primarily as research tools, but proponents say they could provide huge gains in energy efficiency.
Among those with commercial ambitions are hardware giants like Intel and IBM.
A handful of small companies are also on the scene. “The opportunity is there waiting for the company that can figure this out,” says Dan Hutcheson, an analyst at TechInsights. “[And] the opportunity is such that it could be an Nvidia killer”.

[Image: Racks of SpiNNcloud computer chips (SpiNNcloud Systems). Caption: SpiNNcloud says its neuromorphic computer will be more energy efficient for AI.]

In May SpiNNcloud Systems, a spinout of the Dresden University of Technology, announced it will begin selling neuromorphic supercomputers for the first time, and is taking pre-orders.
“We have reached the commercialisation of neuromorphic supercomputers in front of other companies,” says Hector Gonzalez, its co-chief executive.
It is a significant development, says Tony Kenyon, a professor of nanoelectronic and nanophotonic materials at University College London who works in the field.
“While there still isn’t a killer app… there are lots of areas where neuromorphic computing will provide significant gains in energy efficiency and performance, and I’m sure we’ll start to see wide adoption of the technology as it matures,” he says.
Neuromorphic computing covers a range of approaches - from simply a more brain-inspired approach, to a near-total simulation of the human brain (which we are really nowhere near).
But there are some basic design properties that set it apart from conventional computing.
First, unlike conventional computers, neuromorphic computers don’t have separate memory and processing units. Instead, those tasks are performed together on one chip in a single location.
Removing that need to transfer data between the two reduces the energy used and speeds up processing time, notes Prof Kenyon.
An event-driven approach to computing is also common.
In contrast to conventional computing, where every part of the system is always on and available to communicate with any other part all the time, activation in neuromorphic computing can be sparser.
The imitation neurons and synapses only activate when they have something to communicate, much the same way many of the neurons and synapses in our brains only spring into action when there is a reason.
Doing work only when there is something to process also saves power.
And while modern computers are digital – using 1s or 0s to represent data – neuromorphic computing can be analogue.
Historically important, that method of computing relies on continuous signals and can be useful where data coming from the outside world needs to be analysed.
However, for reasons of ease, most commercially oriented neuromorphic efforts are digital.
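To put the event-driven point in concrete terms (this sketch is mine, not from the article, and not any particular vendor's design): a leaky integrate-and-fire neuron only does work when an input spike arrives, so a sparse event stream means sparse computation.

# Leaky integrate-and-fire neuron, event-driven style: computation happens
# only when an input event arrives; idle time costs nothing but a decay term.
# Illustrative sketch only, not any neuromorphic chip's actual design.
import math

class LIFNeuron:
    def __init__(self, threshold=1.0, tau_ms=20.0):
        self.threshold = threshold
        self.tau_ms = tau_ms          # membrane time constant
        self.potential = 0.0
        self.last_event_ms = 0.0

    def receive(self, t_ms, weight):
        # Decay the membrane potential for the idle interval, then integrate
        # the weighted input spike. Nothing was computed between events.
        self.potential *= math.exp(-(t_ms - self.last_event_ms) / self.tau_ms)
        self.last_event_ms = t_ms
        self.potential += weight
        if self.potential >= self.threshold:
            self.potential = 0.0      # reset after firing
            return True               # emit an output spike event
        return False

# Sparse input stream of (time in ms, synaptic weight) events.
neuron = LIFNeuron()
for t, w in [(1.0, 0.4), (3.0, 0.5), (40.0, 0.3), (41.0, 0.9)]:
    if neuron.receive(t, w):
        print(f"output spike at t={t} ms")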

Commercial applications envisaged fall into two main categories.
One, which is where SpiNNcloud is focused, is in providing a more energy efficient and higher performance platform for AI applications – including image and video analysis, speech recognition and the large-language models that power chatbots such as ChatGPT.
Another is in "edge computing" applications – where data is processed not in the cloud but in real time on connected devices that operate under power constraints. Autonomous vehicles, robots, cell phones and wearable technology could all benefit.
Technical challenges, however, remain. Developing the software needed to run the chips has long been regarded as a main stumbling block to the advance of neuromorphic computing.
While having the hardware is one thing, it must be programmed to work, and that can require developing from scratch a totally different style of programming to that used by conventional computers.
“The potential for these devices is huge… the problem is how do you make them work,” sums up Mr Hutcheson, who predicts it will be at least a decade, if not two, before the benefits of neuromorphic computing are really felt.
There are also issues with cost. Whether they use silicon, as the commercially oriented efforts do, or other materials, creating radically new chips is expensive, notes Prof Kenyon.
[Image: Mike Davies, director of Intel's neuromorphic computing lab (Intel). Caption: Intel is making "rapid progress" with its neuromorphic computer, says Mike Davies.]
Intel’s current prototype neuromorphic chip is called Loihi 2.
In April, the company announced it had brought together 1,152 of them to create Hala Point, a large-scale neuromorphic research system comprising more than 1.15 billion fake neurons and 128 billion fake synapses.
With a neuron capacity roughly equivalent to an owl brain, Intel claims it is the world's largest system to date.
At the moment it is still a research project for Intel.
“[But Hala Point] is showing that there's some real viability here for applications to use AI,” says Mike Davies, director of Intel’s neuromorphic computing lab.
About the size of a microwave oven, Hala Point is “commercially relevant” and “rapid progress” is being made on the software side, he says.

IBM calls its latest brain-inspired prototype chip NorthPole.
Unveiled last year, it is an evolution of its previous TrueNorth prototype chip. Tests show it is more energy efficient, space efficient and faster than any chip currently on the market, says Dharmendra Modha, the company’s chief scientist of brain-inspired computing. He adds that his group is now working to demonstrate that chips can be tiled together into a larger system.
“Path to market will be a story to come,” he says. One of the big innovations with NorthPole, notes Dr Modha, is that it has been co-designed with the software so the full capabilities of the architecture can be exploited from the get-go.
Other smaller neuromorphic companies include BrainChip, SynSense and Innatera.


[Image: IBM's NorthPole chip (IBM). Caption: IBM says its NorthPole chip is more energy efficient and faster than other chips.]
SpiNNcloud’s supercomputer commercialises neuromorphic computing developed by researchers at both TU Dresden and the University of Manchester, under the umbrella of the EU’s Human Brain Project.
Those efforts have resulted in two research-purpose neuromorphic supercomputers: the SpiNNaker1 machine based at the University of Manchester consisting of over one billion neurons, and operational since 2018.
A second generation SpiNNaker2 machine at TU Dresden, which is currently in the process of being configured, has the capacity to emulate at least five billion neurons. The commercially available systems offered by SpiNNcloud can reach an even higher level of at least 10 billion neurons, says Mr Gonzalez.
The future will be one of different types of computing platforms - conventional, neuromorphic and quantum, which is another novel type of computing also on the horizon - all working together, says Prof Kenyon.

 
  • Like
  • Fire
  • Love
Reactions: 33 users

Diogenese

Top 20
This SWF/eetimes article discusses SensiML's automated model building - similar to Edge Impulse. So, for those who've been wondering why I've been banging on about Edge Impulse recently, this explains it much more clearly than I could.

Models are a keystone of any NN application.

Here are some extracts from the article:

SensiML Open-Sources TinyML Auto ML Tools - EE Times Podcast


The other is that this is a realm that's typically been the domain of firmware engineers and people who are more skilled in embedded development, not necessarily AI and machine learning skill sets. And we've seen that a lot in customer interactions we've had; often, as they admit, it is the first AI project that a team has undertaken. So they are going through the learning curve themselves on what it takes to get the project to be successful. And so there's that issue as well.

And when we’re looking at AI at the edge, it may or may not be those types of applications, because some of these are physical sensors where you’ve got to collect data that isn’t readily available in large quantity for training. So somebody has to go do that work in order to have something that’s an input to train the models. So those kinds of things together kind of make this a challenging space that, you know, we’re seeing some of those friction points now.

So Data Studio is all about capturing sensor data, labeling that data – sort of, if you think of your traditional supervised ML data set, you've got not only the source data that you're looking to train on, but you also have kind of the answers of what you're expecting that data to provide as the insight, and you're using that to generalize a model. So the Data Studio is all about collecting and basically curating those data sets. It's an ML data-ops tool as a utility.

The Analytic Studio is the one that we're open sourcing. And that is an AutoML tool. And by AutoML, for those that aren't familiar perhaps with that term, AutoML is basically using machine learning to develop machine learning. So taking the same schemes that you're using for the inference model, but also going through a training process of saying, you know, there are a variety of different approaches I could take to building the model itself. I could either put that in the hands of somebody who's knowing and wise in AI, and knows what kinds of models to use and how to configure those models, or I can use the power of computing in the cloud to do a search algorithm and come up with what those things should be. So AutoML tools I think really help empower the embedded developer to be able to do the kinds of things that data scientists could do, as well as act as a work aid for data scientists, who don't have to necessarily do all that work manually and can leverage the AutoML capabilities for productivity and for being able to look at a much broader space of models potentially. So that's what the Analytic Studio does. It basically takes the training data on the input side, and what it spits out on the other side is a functioning model. And in the case of Analytic Studio, it takes the model and reduces it to actual C source code that can be readily integrated into your firmware for that edge device.

I think there’s a lot of need for explainability, transparency, and understanding what these models are actually doing. If you’re going to build a product, you need to support that product. And so if customers come back and say this isn’t behaving the way we thought it was going to behave, if it’s a black box, how do I know how to support that and to remedy the situation? So you want as much explainability in the model as you can, and certainly having the ability to look under the hood and see what’s going on, not only in the model, but also the tools that are used to build the model have some benefit. That’s one piece of it.

For us, I think even the more interesting piece is that as a small development team that’s building these tools, there’s only so much that we personally can do if we’re working on this in sort of the traditional proprietary software sense of taking on a software roadmap and building features in at the rate we are able to do so. We looked at some of the other examples out there and saw that the open-source model as a development model just makes a lot of sense. You have to give something to get something. In our case, giving our foundation code for the Analytic Studio seemed like a reasonable trade* for the prospect of being able to build the community out there that could go after some of the new emerging technologies that are being talked about, that show a lot of promise for addressing some of the bottlenecks that we talked about earlier.


* [ As Sir Humphrey would have said: “That would be a very courageous move, Minister.”]
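For anyone who hasn't met the term, here is a minimal, generic sketch of what an AutoML search actually does: cross-validate a handful of candidate models and hyperparameter settings and keep the winner. This is only the idea in miniature, using scikit-learn on a stand-in dataset; it is not SensiML's Analytic Studio, which would additionally compress the chosen model and emit C code for the target device.

# Minimal AutoML-style search: score several candidate models and
# hyperparameter settings by cross-validation and keep the best one.
# Generic sketch only; not SensiML's tool or pipeline.
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_digits(return_X_y=True)  # stand-in for labelled sensor data

pipe = Pipeline([("scale", StandardScaler()),
                 ("model", LogisticRegression(max_iter=1000))])

# The "search space": two model families, each with a few settings.
search_space = [
    {"model": [LogisticRegression(max_iter=1000)], "model__C": [0.1, 1.0, 10.0]},
    {"model": [RandomForestClassifier()], "model__n_estimators": [50, 200]},
]

search = GridSearchCV(pipe, search_space, cv=5)  # "ML to build ML"
search.fit(X, y)
print(search.best_params_)
print(f"cross-validated accuracy: {search.best_score_:.3f}")
# An embedded AutoML tool would then quantise/compress the winning model and
# export it (e.g. as C source) for the target microcontroller.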
 
Last edited:
  • Like
  • Fire
  • Thinking
Reactions: 24 users

Diogenese

Top 20
Hi....first off I am not privy to any confidential agreements that have been made between Brainchip and partners.

MegaChips and Renesas both signed a licence agreement and paid the agreed sum to Brainchip, which has been sighted in recent
4C documents; as to whether both of these agreements were the same or similar, I wouldn't know.

It is well known that any company we deal with can choose whether or not to mention us within their own product; as far as I'm
aware, we have absolutely no control over that at all.

We are an IP company, as is ARM. Around 80% of all mobile smartphones have ARM technology embedded within them, BUT would
the public be aware of that? Just ask a group of your friends who ARM are; the odds are they have absolutely NO idea.

As frustrating as this current period is for shareholders and the company alike, we just have to wait and wait and.......... until the
current pattern changes. Progress is being made; just because we can't physically see it yet doesn't mean that it's not bubbling away
under the surface.

Ask yourself this question: why are investors buying BRN shares each and every day? Some, like me, can see the huge potential.

Kind regards...Tech. (y)
Hi tech,

I think there are differences between the agreements.

The initial Renesas agreement was for two nodes for "low end" applications for "single use", probably 28 nm. I thought this was intended for a specific Renesas MCU.

MegaChips was for FD-SOI, 22 nm*. I thought that MegaChips would produce Akida 1 to order for customers.

So I guess the MegaChips one would have carried a higher royalty per chip, as it would normally have more than two nodes.

Now we don't know if those licences have been varied with the advent of Akida 2, but you can be sure that both were informed of Akida 2 as soon as the patent was filed 2 years ago, shortly after the licences were announced.

In the "Akida generations" page, there are comments by Renesas referring specifically to TeNNs/ViT.

https://brainchip.com/akida-generations/

We see an increasing demand for real-time, on-device, intelligence in AI applications powered by our MCUs and the need to make sensors smarter for industrial and IoT devices. We licensed Akida neural processors because of their unique neuromorphic approach to bring hyper-efficient acceleration for today’s mainstream AI models at the edge. With the addition of advanced temporal convolution and vision transformers, we can see how low-power MCUs can revolutionize vision, perception, and predictive applications in a wide variety of markets like industrial and consumer IoT and personalized healthcare, just to name a few.

- Roger Wendelken, Senior Vice President at Renesas’ IoT and Infrastructure Business Unit



* Correction: MegaChips was for vanilla Akida 1, 28 nm. Akida 1500 FD-SOI was GlobalFoundries, but MegaChips did claim some involvement with the "backend" design.
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 38 users

manny100

Regular
Despite the setback, McDonald's still believes in the potential of voice recognition for food ordering. The company plans to find a new partner for more extensive research by the end of the year. IBM says it is also in talks with other fast-food chains. It appears that the two companies are not parting on good terms with this project.

... GMAC & BrainChip with the QSRBot 247 might help.

GMAC Intelligence Goes Big with BrainChip Partnership
Link: https://brainchip.com/gmac-intelligence-goes-big-with-brainchip-partnership/




A TENNs noise filter would be ideal for McDonald's and any pub I have ever been to.
 
  • Like
  • Haha
Reactions: 8 users

Diogenese

Top 20

[Quoting Bravo's article post above: "Could brain-like computers be a 'competition killer'?" by Zoe Corbyn, Technology Reporter]

The Nvidia killer:

The Scottish AI play:

Is this a neuron that I see before me?
The synapse towards my hand?
Come let me grasp thee.
I have thee not, and yet I see thee still.
Art thou not, pixeled vision, sensible to feeling as to sight?
Or art thou but a neuron of the analog, a false creation,
proceeding from the heat-oppressed 20 W brain?
I see thee yet, in form as palpable as this laser point cloud which I now draw.
 
  • Like
  • Haha
  • Fire
Reactions: 20 users

Tony Coles

Regular

[Quoting Bravo's article post above: "Could brain-like computers be a 'competition killer'?" by Zoe Corbyn, Technology Reporter]


Wow! That was a good read @Bravo, I remember FF had written about the Nvidia Killer.
 
  • Like
  • Fire
Reactions: 8 users

davidfitz

Regular
Good morning,

Our CFO, Ken Scarince, seems to be very active recently; he must be kicking some big goals. Correct me if I'm slightly off here,
but hasn't he been awarded/converted around 5 million shares recently, moving into the top 20 for the first time that I'm aware of?

What does he know that I don't? :unsure: ...sounds like he's in the prime spot, 1 out, 1 back on the track. :ROFLMAO::ROFLMAO:

The quarter is coming to an end. To be totally honest, I'm not expecting anything; the pattern has shown no signs of changing just yet, sadly.

Tech.
For those who have been around long enough, Ken was heavily involved with the investor side of things before his title changed to CFO. Whilst we have no idea of the impact of his tireless work in the past, we may just be a bit closer to finding out :unsure:. The respect shown to some of us earlier investors was admirable.


Attending these shows represents excellent opportunities to educate investment communities about how AI will impact markets into the future as well as evangelize BrainChip’s unique vision and capabilities in the technology space.

He was also instrumental in creating the ADR.


BrainChip Chief Financial officer, Ken Scarince commented, “The versatility and widespread appeal of an ADR program among US institutions will allow BrainChip to continue on its path of pursuing high accessibility to the US capital markets. We believe the strong demand for high-growth potential, Artificial Intelligence stocks in the US will result in an influx of US investment and ultimately an increase in shareholder value.”

He has continued to do so from time to time.


“From detailing how Edge AI is the compute model of the future to showcasing our leading position as the first and only commercial producer of neuromorphic AI IP solutions, we look forward to showing Oppenheimer investors how we are helping overcome the challenges in a high-potential AIoT market that will deploy more than a trillion intelligent Edge devices by 2030,” said Ken Scarince, CFO at BrainChip. “We’re using neuromorphic computing as a critical enabler to accelerate radically new intelligent services and applications. This is an excellent opportunity to learn more about this technological sea change and how BrainChip is leading the way with its Akida IP.”
 
  • Like
  • Love
  • Fire
Reactions: 30 users
Top Bottom