BRN Discussion Ongoing

Has anyone considered that the buying is quite possibly someone buying to control the vote at the next AGM? Hmmmm, just a thought.
Unless there are better financials $$$ or another ASX price-sensitive announcement, then it could be a possibility after the last AGM. But it would be a big gamble if you ask me, and I forgot you're a
 
  • Haha
  • Fire
  • Like
Reactions: 15 users

Diogenese

Top 20
SW-F put me onto Roger Levinson's analog adventure:

https://www.eetimes.com/blumind-harnesses-analog-for-ultra-low-power-intelligence/

Canadian startup Blumind recently developed an analog computing architecture for ultra-low power AI acceleration of sensor data, Blumind CEO Roger Levinson told EE Times. The company hopes to enable widespread intelligence in Internet of Things (IoT) devices.
Advanced process nodes aren’t cost-effective for tiny chips used in tens or hundreds of millions of units in the IoT. Combine this with the fragmentation of the IoT market, the need for application-specific silicon, and the requirement for zero additional power consumption and it’s easy to see why the IoT has been slow to adopt AI, Levinson said.

“The request from our customer was: I need to lower my existing power, add no cost, and add intelligence to my system,” he said. “That isn’t possible, but how close to zero [power consumption] can you get? We’re adding a piece to the system, so we have to add zero to the system power, and cost has to be negligible, and then we have a shot. Otherwise, they’re not going to add intelligence to the devices, they’re going to wait, and that’s what’s happening. People are waiting.”

This is the problem Blumind is taking on. Initial development work on efficient machine learning (ML) at ultra-low power by Blumind cofounder and CTO John Gosson forms the basis for the startup’s architecture today.


“John said, ‘What you need to do is move charge around as the information carrier, and not let it go between power supplies’,” Levinson said. “Energy [consumption] happens when charge moves from the power supply to ground, and heat is generated. So he built an architecture [around that idea] which is elegant in its simplicity and robustness.”

Like some of its competitors in the ultra-low power space, Blumind is focusing on analog computing.

“We’ve solved the system-level always-on problem by making it all analog,” he said. “We look like a compute in memory architecture because we use a single transistor to store coefficients for the network, and that device also does the multiplication.”

The transistor’s output is the product of the input and the stored weight; the signal integrates for a certain amount of time, which generates a charge proportional to that product. This charge is then accumulated on a capacitor. A proprietary scheme measures the resulting charge and generates an output proportional to it which represents the activation.

“Everything is time based, so we are not looking at absolute voltages or currents,” he said. “All our calculations are ratiometric, which makes us insensitive to process, voltage and temperature. To maintain analog dynamic range, we do have to compensate for temperature, so even though the ratios remain stable, the magnitudes of signals can change.”
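
To make the charge-domain arithmetic above concrete, here's a toy numerical sketch of one multiply-accumulate done the way Levinson describes it: a current proportional to input × weight is integrated for a fixed time onto a capacitor, then read out as a ratio rather than an absolute value. All the device constants below are invented for illustration and are not Blumind figures.

# Toy model of the charge-domain MAC described above (hypothetical constants).
INTEGRATION_TIME = 1e-6   # seconds the current is integrated (assumed)
I_UNIT = 1e-9             # amps per unit of (input * weight) (assumed)

def mac_charge(inputs, weights, t=INTEGRATION_TIME):
    """Charge (coulombs) accumulated on the summing capacitor for one neuron."""
    q = 0.0
    for x, w in zip(inputs, weights):
        current = I_UNIT * x * w   # transistor output ~ input * stored weight
        q += current * t           # Q = I * t, integrated onto the capacitor
    return q

def activation(inputs, weights, ref_inputs, ref_weights):
    """Ratiometric readout: only the ratio to a reference charge matters,
    so absolute device currents can drift with process/voltage/temperature."""
    return mac_charge(inputs, weights) / mac_charge(ref_inputs, ref_weights)

x = [0.2, 0.8, 0.5]
w = [0.9, 0.1, 0.4]
print(mac_charge(x, w))                          # proportional to dot(x, w)
print(activation(x, w, [1, 1, 1], [1, 1, 1]))    # normalised, PVT-insensitive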

Levinson said Blumind has chosen to focus on “use cases that are relevant to the world today”—keyword spotting and vision—partly in an effort to prove to the market analog implementations of neural networks are viable in selected use cases.


Blumind test silicon. (Source: Blumind)
“One of the biggest challenges has been, does it have to be software configurable, or not?” he said. “Our first architecture is not configurable in terms of the network—we build a model in silicon, which happens to be robust for the class of applications we’re going after, and is orders of magnitude more power and area efficient.”

Model weights are adjustable, but everything else is fixed. However, this is enough flexibility to cater for a class of problems, Levinson said.

“The layers are fixed, the neurons and synapses are fixed,” he said. “We’re starting with audio because our [customer] wants an always-on voice system. However, our silicon is capable of doing anything that can utilize a recurrent neural network.”

Blumind’s software stack supports customer training of the recurrent neural network (RNN) its silicon is designed for with customers’ own data.

This strategy helps minimize power consumption, but it means a separate tapeout for every new class of application Blumind wants to target. Levinson said that at legacy 22-nm nodes, the cost of an analog/mixed-signal tapeout is a little over $1 million, and requires a team of just five to eight people.

In tinyML today, the performance difference from changing models is minor, he argues.

“There is a hard limit at the edge, especially in sensors,” he said. “I have X amount of memory and X amount of compute power, and a battery. The data scientist has to fit the model within these constraints.”
Blumind has test chips for its first product, the RNN accelerator designed for keyword spotting, voice activity detection and similar time series data applications. This silicon achieves 10 nJ per inference; combined with feature extraction, it consumes a few microwatts during always-on operation. The chip also includes an audio data buffer (required for the Amazon Echo specification) within “single digit microwatts,” Levinson said.
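
As a sanity check on those numbers, energy per inference multiplied by inference rate gives the core power; only the 10 nJ figure comes from the article, the inference rate below is my assumption.

# Back-of-envelope power check. 10 nJ/inference is from the article;
# the inference rate is an assumed value for an always-on keyword spotter.
ENERGY_PER_INFERENCE = 10e-9      # joules
INFERENCES_PER_SECOND = 100       # assumed: one inference every 10 ms
core_power_uw = ENERGY_PER_INFERENCE * INFERENCES_PER_SECOND * 1e6
print(f"NN core: {core_power_uw:.1f} uW")   # 1.0 uW at this rate; feature
# extraction and the audio buffer add a few uW more, giving the quoted
# single-digit-microwatt always-on figure.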

Blumind’s chip connects directly to an analog microphone for input, and sends a wake up signal to an MCU when it detects a keyword. The current generation requires weight storage in external non-volatile memory, but Blumind plans to incorporate that in future devices.




Tapeout for the commercial version of the RNN accelerator is underway.

Blumind’s also currently bringing up test silicon of a convolutional neural network (CNN) accelerator it’s designed for vision applications in its lab, which it plans to demonstrate this summer. The target is object detection, such as person detection, at up to 10 fps using 5-20 µW, depending on configuration, Levinson said.

The company’s also working with an academic partner on a software-definable version of its analog architecture for future technology generations.

First samples of Blumind’s RNN accelerator are due in Q3.


Having fixed layers and synapses designed according to each customer's data means a new tapeout for each customer - a mere bagatelle according to Levinson: $1M a pop.

I wonder about accuracy. This is low-hanging fruit which is not safety-critical, so there may be a market for ultra-low-power near-enuf-is-good-enuf NNs.

PS: Roger's looking pretty ripped, so don't tell him I said this.
 
  • Like
  • Haha
  • Wow
Reactions: 14 users

Diogenese

Top 20
Yes, I believe it's still coming soon. I saw a conference (can't find it now) but the Renesas speaker blurbed 2023, then said 2024, so anytime now as it's 2024. Date, no idea. This was probably of no help to you lol 🤪
Hi MD,

A perfect example of information v knowledge :)
 
  • Like
  • Haha
  • Fire
Reactions: 5 users
SW-F put me onto Roger Levinson's analog adventure:

https://www.eetimes.com/blumind-harnesses-analog-for-ultra-low-power-intelligence/

Sounds like they only have "one" customer, at the moment, from what they say?...

Might be a "good" one though..

AKIDA could do all that, but if they are custom designing each chip for their customer's needs, that is an advantage with development time/costs etc., if my read is correct?
 
  • Like
Reactions: 4 users

wilzy123

Founding Member
AKIDA could do all that, but if they are custom designing each chip for their customer's needs, that is an advantage with development time/costs etc., if my read is correct?

Don't ask if your read is correct.

Start with asking yourself if you even understand what it is you are saying. If you cannot answer that, maybe @Galaxycar can help.
 
  • Haha
Reactions: 1 users
Don't ask if your read is correct.

Start with asking yourself if you even understand what it is you are saying. If you cannot answer that, maybe @Galaxycar can help.
Not enough antagonists back on the forum for you yet, Wilzy?
 
  • Like
  • Haha
  • Love
Reactions: 8 users

Galaxycar

Regular
Just sayin' the against vote was 160 million last time. This time it may be 200-plus million they own. Think it may be on the cards that some consortium may want a seat on the board, and that this may be what all the price rise is about. I admit Brainchip has made leaps and bounds behind the scenes lately, and at some stage that will bear fruit. But you have to look at all possibilities for the current buying.
 
  • Like
  • Thinking
Reactions: 5 users

Earlyrelease

Regular
Edge box’s.
My following post will demonstrate my lack of memory and retrieval skills from this site but here goes anyway.

My understanding was that when Akida 1000 was made, we paid TSMC for the wafers to be made, then various samples were used to validate the run, and various demo kits were provided to unis etc. This was part of the requirement to prove the idea worked in silicon, which it did, surpassing most expectations. There was obviously a quantity of chips left over (unknown number of wafers made), so the exact number of chips remaining after that first run remains a mystery. We also sold a very limited number of devices on our web page, with some TSE members even posting their purchase photos on the system.

So my belief is that the initial lull in take-up of our IP-only model resulted in a desire to show the product in a more user-friendly environment; the edge box opportunity presented itself and allowed the company to make use of the remaining AKD1000 chips.

Now I have not seen anything in the financial records or company releases indicating that we have gone back to TSMC and had another run of the AKD1000 chips made.

Thus my expectations of edge box sales are not huge.

What is huge is the addition of a tool, a cheap one too, that will let customers who are interested but maybe on the fence or cautious over funds try the tech and play before they commit.

OK, those with better research skills and memory, shoot me down in flames. (Be kind ❤️)
 
  • Like
  • Fire
  • Love
Reactions: 18 users

skutza

Regular
Just think though. In all of this, BRN is like a tightly wound spring. Here is the list of investors/watchers IMO.
  1. Some totally understand and are waiting for the confirmation.
  2. Some mostly understand the company and what it can achieve but are still a little cautious
  3. Some get that the company has something, they don't exactly know what, but with all the other companies talking and partnering with BRN - well....
  4. Some don't get the whole thing but invest for the hype
  5. Some don't get it and call it a meme stock but keep a close eye, just in case.
The one thing this group of people all have in common is that, if/when Intel, Nvidia, Mercedes or another really big name takes a bite and the link is confirmed, well, that spring will burst and the shorters will run for the hills, the investors will want more, and the ones sitting on the sidelines will smash it hard.

$2.50 in the first 2 days will be blown away and we'll still be trying to close our mouths at $3. Step by step.
 
  • Like
  • Fire
  • Wow
Reactions: 26 users
Hi All

Just something to weigh against those who criticise the information flow from Brainchip and mock the adherence to NDAs.

It appears Intel also tell their shareholders the same true story.

This being the case, why does a certain group, who claim a level of knowledge and sophistication about the markets, keep screaming the opposite?

I will leave it to you to decide what motivates them:

“Paul Alcorn: Can you address the persistent rumours that IFS is doing packaging work with Nvidia, or for Nvidia?


Stu Pann: I can't. I am aware of the rumour. In this business, especially now, our customers are asking for confidentiality; they don't want to disclose. They will decide when and where they want to disclose and what they'll disclose. I’m not speaking with Nvidia in mind, but just generically – all the customers feel that way. When they're ready to talk, they will talk, and they will let us know when they're going to talk. I would love to be more visible, but I also have to respect what they want to say.”


My opinion only DYOR
Fact Finder
 
  • Like
  • Fire
  • Love
Reactions: 71 users
Edge box’s.
My following post will demonstrate my lack of memory and retrieval skills from this site but here goes anyway.

Hi Earlyrelease

As you know there were two trips to the TSMC Foundry. The first one in 2020 involved fabricating the engineering samples as they were called and for this purpose to save money they shared wafers with others and it was thought that Brainchip possibly received between 3,000 and 7,000 AKD1000 engineering samples. These tested perfectly and in 2021 with some design tweaks to improve efficiency the commercial AKD1000 was produced. We were never advised as to the number of wafers that TSMC produced in this run and no numbers were given for the chips received.

There was a great deal of speculation around how many. Most speculation was centred on the statement by Anil Mankar that they had sent the commercial tapeout to TSMC for 'mass production' of the commercial version of AKD1000. One of the very enterprising members of HC/TSEx contacted TSMC by email asking what they called 'mass production' and received an email stating that a production run of 10 million chips or more was considered to be mass production.

This number was never confirmed by anyone at Brainchip that I am aware of, nor was it denied, nor was any other number stated.

It should be remembered that the commercial AKD1000 was produced at a time when Brainchip was still a chip and IP company and expecting to continue that way into the future.

Accordingly it is not impossible that Brainchip may have had a commercial quantity of AKD1000 produced to achieve economies of scale and to have plenty of product available to service expected demand.

It was also being touted by Peter van der Made at that time that some of this production run would be used to create an AKIDA USB stick for general sale. Anil Mankar spoke about this USB stick at the Ai Field Day and even suggested it would retail, when he was asked by someone in the audience, for about $50 to $60. It was clearly a real plan that was only displaced much later.

At the Meet & Greet I specifically asked Sean Hehir, CEO, in respect of the VVDN AKIDA Edge Box, whether they had sufficient AKD1000 chips or would need another production run, and he said words to the effect that no, they had plenty of chips; that was not an issue.

The above is all from my memory but I hope it assists.

My opinion only DYOR
Fact Finder
 
  • Like
  • Fire
  • Love
Reactions: 39 users
I wonder whose AI they are using :unsure:

Could've given us a plug at least :LOL:



17/2/2024

Circle8 Integrates the Power of AI to Solve the Global Plastic Crisis​

Perth, Western Australia – Feb. 6, 2024 –

The amount of plastic products that continue to be dumped in landfills is a global problem. Mark Grogan, owner of Circle8, has harnessed the power of artificial intelligence (AI) for an innovative solution to waste removal and management. It has the ability to disrupt the worldwide plastic crisis and deposit return systems.

Circle8’s unique solution employs smart bins and an integrated data platform to identify waste using ultra-low powered AI. It’s affordable, consumer-centered, and encourages everyone to participate in addressing the ever-growing plastics problem.

Circle8 has designed the next step in green solutions. Data is collected in real-time that provides insights that can then be used to improve sustainability. The company utilizes advanced cybersecurity and privacy protection for all data. The company’s advanced technology is particularly beneficial for container deposit systems.

Despite the social, economic and environmental benefits, many people simply throw plastics that carry a deposit or are recyclable in the trash. Only 13 percent of plastic packaging is recycled, and approximately 36 percent of plastic drink bottles. When plastics break down, they release microplastics that pollute soil, air and water. Microplastics have been found in humans, pets and wildlife. The company’s unique solution results in better environmental outcomes for people and the planet.

The company’s innovative technology is also an important element of environmental, social and corporate governance (ESG), an increasingly important aspect for businesses and organizations of all sizes and scopes. The data empowers businesses to manage their image and effectively plan and implement ESG policies. The innovative approach to waste and its management developed by Circle8 is changing the way people think about plastic trash. The combination of AI and an integrated data platform provides the means for everyone to aid in the reduction of plastic pollution in the environment for a healthier planet that also conserves valuable natural resources.
 
  • Like
  • Fire
Reactions: 15 users
I wonder whose AI they are using :unsure:

Could've given us a plug at least :LOL:

Mark Grogan and Megan Grogan — Founders of Circle.io

Whitehatseo · 2 min read · 6 days ago
In the quest for sustainability, every small step counts. Mark and Megan Grogan, the innovative minds behind Circle.io, are proving this with their revolutionary approach to waste management. Their smart bins are not just receptacles for garbage; they represent a paradigm shift in how we handle waste collection.

The Visionaries Behind Circle.io:

Mark and Megan Grogan, the founders of Circle.io, share a passion for environmental conservation and technological innovation. With backgrounds in engineering and entrepreneurship, they combined their expertise to tackle one of the world’s pressing challenges: waste management.

Introducing Circle.io Smart Bins:

Circle.io’s smart bins are not your average waste receptacles. Equipped with cutting-edge sensors and IoT technology, these bins are capable of monitoring waste levels in real-time, optimizing collection routes, and even alerting authorities when they need servicing. This intelligent system not only streamlines waste collection processes but also reduces operational costs and carbon emissions associated with inefficient collection routes.

The Environmental Impact:

By optimizing waste collection routes, Circle.io smart bins minimize the carbon footprint of traditional collection methods. Additionally, their real-time monitoring capabilities ensure that bins are emptied only when necessary, reducing unnecessary trips and fuel consumption. This efficiency not only benefits the environment but also contributes to cost savings for municipalities and waste management companies.

Empowering Communities:

Mark and Megan Grogan envision Circle.io smart bins as more than just a technological solution; they are tools for community empowerment. By providing municipalities and businesses with the means to manage waste more efficiently, Circle.io helps communities take control of their environmental impact. Furthermore, the data collected by these bins can inform policy decisions and drive sustainable initiatives at the local level.

A Sustainable Future:

In a world grappling with environmental challenges, solutions like Circle.io smart bins offer a glimmer of hope. By harnessing the power of technology, Mark Grogan and Megan Grogan are demonstrating that sustainability and innovation can go hand in hand. Their commitment to making a difference, one smart bin at a time, is inspiring individuals and organizations worldwide to rethink their approach to waste management.

Conclusion:

Mark and Megan Grogan, the masterminds behind Circle.io, are proving that small changes can have a big impact. Their smart bins are not only revolutionizing waste collection but also paving the way for a more sustainable future. By combining their passion for environmental conservation with innovative technology, they are empowering communities to take control of their waste management practices. As we continue to face environmental challenges, the work of visionaries like the Grogans serves as a beacon of hope, reminding us that with ingenuity and determination, we can build a better world for future generations.
 
  • Like
  • Love
  • Fire
Reactions: 17 users

Diogenese

Top 20
Hi Earlyrelease

As you know there were two trips to the TSMC Foundry. The first one in 2020 involved fabricating the engineering samples as they were called and for this purpose to save money they shared wafers with others and it was thought that Brainchip possibly received between 3,000 and 7,000 AKD1000 engineering samples. These tested perfectly and in 2021 with some design tweaks to improve efficiency the commercial AKD1000 was produced. We were never advised as to the number of wafers that TSMC produced in this run and no numbers were given for the chips received.

Hi FF,

If memory serves, a couple of those tweaks after customers had tested the engineering samples included a change from 1-bit to 4-bit weights and activations to improve accuracy, and CNN2SNN so customer "legacy" models could be converted for use on Akida, basically adding backward compatibility.

PvdM's white paper "4 bits are Enough" was produced after the big boys started talking about standardizing model size, settling on 8 bits. Hence Akida 2 uses 8 bits for the input and output interfaces, but runs 4 bits internally, and can step back to lower-power 1-bit or 2-bit operation if lower precision is acceptable.

https://brainchip.com/4-bits-are-enough/
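
For anyone curious what that mixed precision means in practice, here's a rough sketch (my own illustration, not BrainChip's actual quantisation scheme): 8-bit values at the interfaces, weights quantised to 4, 2 or 1 bits internally, with the accuracy cost you'd expect as precision drops.

# Illustrative mixed-precision sketch, not BrainChip's implementation.
import numpy as np

def quantize(x, bits):
    """Uniform symmetric quantisation to the given bit width."""
    if bits == 1:
        scale = np.mean(np.abs(x))                 # binary: sign * mean magnitude
        return np.sign(x).astype(np.int32), scale
    levels = 2 ** (bits - 1) - 1
    scale = np.max(np.abs(x)) / levels
    return np.round(x / scale).astype(np.int32), scale

rng = np.random.default_rng(0)
weights = rng.normal(size=(8, 8)).astype(np.float32)
inputs = rng.normal(size=8).astype(np.float32)

x_q, x_s = quantize(inputs, 8)                     # 8-bit activations at the interface
exact = weights @ inputs
for bits in (4, 2, 1):                             # internal precision can step down
    w_q, w_s = quantize(weights, bits)
    approx = (w_q @ x_q) * (w_s * x_s)             # dequantised matrix-vector product
    print(f"{bits}-bit weights: mean abs error {np.abs(approx - exact).mean():.3f}")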
 
  • Like
  • Fire
  • Love
Reactions: 37 users

Earlyrelease

Regular
FF
I thank you for the benefit of your gift, and I am now excited on both fronts, as the number available to sell, whilst still unknown, is greater than my expectations. So, happy days.
 
  • Like
  • Fire
  • Love
Reactions: 16 users
Looks like Huawei is on the bandwagon too.

Just need one of these tech behemoths out there to publicly say... you know what... BRN, you can save us a sheetload of R&D time and $, so here's a signature :)



HUAWEI RESEARCH CENTER ZÜRICH

Research Engineer in Neuromorphic and Neuro-inspired Computing Algorithms​

If you are enthusiastic in shaping Huawei’s European Research Institute together with a multicultural team of leading researchers, this is the right opportunity for you!​


Huawei is a leading global information and communications technology (ICT) solutions provider. Driven by a commitment to sound operations, ongoing innovation, and open collaboration, we have established a competitive ICT portfolio of end-to-end solutions in telecom and enterprise networks, devices, and cloud technology and services. Our ICT solutions, products, and services are used in more than 170 countries and regions, serving over one-third of the world's population. With 180,000 employees, Huawei is committed to enabling the future information society, and building a Better Connected World.
Our Huawei Zurich Research Center is embedded in the European network and plays a pivotal role in driving innovation. The research work of our Labs is carried out not only by Huawei's internal research staff but also by our academic research partners in universities across Europe.
The Neuromorphic Computing Research Lab is in charge of incubating and developing cutting-edge Computing and AI technologies. Neuromorphic and Brain-Inspired Computing will play a key role in the next generation AI.
More specifically, the Lab conducts research and development on cutting edge Neuro-inspired Computing Algorithms/Materials, including but not limited to brain-mimetic algorithms design, neuromorphic computing elements, spiking neural networks, their training and optimal topology, distributed intelligent agents, knowledge representation and learning algorithms, algorithm optimization and acceleration, multi-modal learning.
For our Neuromorphic and Neuro-inspired Computing Research Lab in Zurich, we are looking for a high caliber:
Research Engineer in Neuromorphic and Neuro-inspired Computing Algorithms

Responsibilities

  • Research on Neuro-inspired computing and learning frameworks, key technologies and industry best practices;
  • Research and develop architectures and learning algorithms for next generation AI tools and applications;
  • Keep up to date with the latest research literature, attend conferences, and learn continuously to stay at the forefront of research;
  • Work closely with a multidisciplinary team inside Huawei for integrated solutions;
  • Develop patents and publish academic articles in top venues (NeurIPS, ICLR, ICML, …);
 
  • Like
  • Fire
  • Thinking
Reactions: 19 users

cosors

👀
  • Fire
  • Like
  • Haha
Reactions: 9 users

Tothemoon24

Top 20


Space Machines Company: When the stars align​

A meeting between the CEO of a space start-up and a UTS Director led to a mutually beneficial collaboration that will forever alter the way the Australian space industry works.​

The commercialisation of the Australian space sector is light years away from where it was just three years ago.
It was around then that UTS Tech Lab Director of Business Development Roger Kermode first met with Australian high-tech start-up Space Machines Company (SMC) CEO Rajat Kulshrestha who shared a vision to build a business capable of inspecting, repairing, relocating, servicing, upgrading and disposing of spacecraft to create a more sustainable space industry.
The timing of the meeting coincided with the announcement by the previous Federal Government that it aimed to triple the national space economy from AU$4 billion to AU$12 billion and create 20,000 jobs by 2030. The State Government of the day argued NSW was best placed “economically, politically and geographically” to maximise the benefits to business from this new space era.

SPACE MACHINES COMPANY OPTIMUS SPACECRAFT

Yet while SMC’s pathway was clear, it became obvious that its dual goals of technological advancement and commercial viability would require a supporting partner with access to academic and technical expertise as well as cutting-edge facilities, equipment and workshop facilities.
The collaboration unfolded in sync with UTS’s strategic initiative to amplify investment in industry partnerships, particularly by establishing innovation precincts tailored to fulfill the requirements of NSW’s broader innovation strategy. The strategic initiative seeks to cultivate best-practice research models, fostering a dynamic culture of collaboration and innovation within the university and its partner networks.
Previously, siloed thinking paired with a reluctance to collaborate between universities and industry meant many industry players viewed Australia’s tertiary providers primarily as a source of talent rather than places to further develop and commercialise technology.
But UTS Tech Lab’s very purpose was to highlight what could be achieved when industry and universities partnered to push boundaries and achieve groundbreaking outcomes.
Having discerned SMC’s individual needs, UTS Tech Lab was able to develop an innovation ecosystem that saw SMC provided with access to state-of-the-art equipment that would otherwise have to be rented or purchased.
As a direct result of this new approach to collaboration with industry, UTS Tech Lab constructed a bespoke facility on campus where the design and fitout were detailed specifically to accommodate SMC’s next generation of satellites. Like-minded partners and suppliers were actively recruited to become neighbours and enable an innovation ecosystem built around the lab and clean room occupied by SMC.
UTS Tech Lab’s academic team was approached with the opportunity to work alongside SMC to undertake the advanced vibration testing of their satellites at Tech Lab necessary to qualify them for launch. Tech Lab’s Multi-Axial-Simulation-Table (MAST) was utilised to undertake a static load test on SMC’s ‘Optimus’ spacecraft where it successfully mimicked launch conditions ahead of the upcoming SpaceX Transporter 10 rideshare launch.
Typically designed to simulate earthquakes rather than simulating the constant load of 12-plus g-force that occurs during launch, the bespoke test was designed to simulate the launch load and can now be used to test satellites for future missions.

Image (L-R): Assoc Prof Ben Halkon, Peter Brown (UTS Senior Project Engineer), Paul Hilton (SMC Mission Manager), Murali Shan (SMC Mechanical Engineer), Kenny Ng (SMC Manufacturing Engineer)
Rajat Kulsrestha, Space Machines Company CEO, says having the opportunity to forge such a close and constructive collaboration with UTS Tech Lab put a rocket under SMC’s program to launch Optimus.
“It was fantastic to be able to develop our program out of UTS Tech Lab’s top-class, purpose-built facility and have the exposure to the exceptional expertise of their people at the heart of their innovation ecosystem,” Rajat says. “The collaboration was a critical factor in getting our spacecraft through the rigorous pre-launch testing to be ready for deployment.”
With the collaboration now in its third year, UTS Tech Lab and SMC are working together to complete a shared use facility for manufacturing and testing of larger satellites of up to 500kg, nearly twice the 270kg Optimus payload.
The facility will have the ability to be used by groups both inside UTS and out – including other universities and companies – as a means of sharing operating and maintenance costs and generating outcomes that would otherwise be impossible.
Dr Roger Kermode, UTS Faculty of Engineering and IT Director of Business Development, says there are other ways UTS Tech Lab is helping industry to push into new frontiers and achieve groundbreaking outcomes well beyond the use of loaned facilities and equipment.
In addition to creating new jobs and new research opportunities for researchers, forging closer relationships with industry partners also provides a host of internship and graduate opportunities for students, he says.
Currently, over 20 per cent of SMC employees come from UTS, and SMC is now on the cusp of effecting a critical step change to the Australian space industry which, at full capacity, will result in flying at least three to four missions per year and having a permanent presence in space, he says.
“We know that SMC needs the right sorts of employees, trained the right way to meet their needs. By working closely with the Space Machines team to create internships and capstone projects, we unlock the means for SMC to identify interested students and give them a trial run on a project that is low cost to [the business] and allows the student to work in interesting areas, gaining both academic credit and the chance to impress a future employer.”

This dramatically increases the opportunities for Australian space companies to fly their payloads. We will have a genuine sovereign space industry that our nascent space supply chain companies can use to deliver payloads into, and in which students can embark on space-related studies confident that they will have a meaningful, high-value job waiting for them upon graduation.
Roger Kermode, UTS Engineering & IT Director of Business Development.
UTS Tech Lab Director, Professor Robert Fitch says the partnership with SMC exemplifies the core principles guiding Tech Lab’s approach to industry collaboration and a prime example of what can be achieved when universities and industry are open to capturing the value from the IP they each generate while working together to achieve the best outcome.
“The whole of these sorts of collaborations is so much bigger than the sum of its parts,” Professor Fitch says. “The ripple effect is enormous, with positive outcomes cascading throughout Tech Lab and Space Machines Company in particular, and Australia’s fledgling space industry in general, which it is helping to establish for generations to come.”
 
  • Like
  • Fire
  • Love
Reactions: 55 users