BRN Discussion Ongoing

cosors

👀
Wow - it's still rising. There are still 2½ hours to go.
Already answered, but I won't delete it.
A one-day total for Germany of 5,866,407. I can't remember the last time that happened; it was a long time ago...
 
  • Like
  • Fire
  • Wow
Reactions: 21 users

IloveLamp

Top 20
  • Like
  • Sad
  • Fire
Reactions: 9 users

Deadpool

hyper-efficient Ai
Good morning. Guten Morgen. I must say, with all the BRN action this week, I've turned into a sitzpinkler..... just so I can keep my eye on the trading....
Had to google that, wish I hadn't.

 
Last edited:
  • Haha
Reactions: 9 users

IloveLamp

Top 20
Quoting Deadpool: "...with all the BRN action this week, I've turned into a sitzpinkler... just so I can keep my eye on the trading..." 🤣
I think they have a cream for that 🤔
 
  • Haha
Reactions: 6 users

charles2

Regular
Lots of mega-dollars with aspirations sloshing around the new Holy Grail of AI chips/energy savings/Cloud avoidance security with the magical black box (BRN) apparently holding many of the essential codes for Grail realization.

Convince me that this monster accumulation is merely the market playing catch up for an undervalued equity. And not the expression of a Holy Shit epiphany.

The major obstacle for shareholders may be for Brainchip to cling to its independence while our surreal/magic sauce has time to play out unfettered.

Keep running Bravo...we are cheering you on.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 44 users

Gemmax

Regular
Quoting charles2: "Lots of mega-dollars with aspirations sloshing around the new Holy Grail of AI chips... Keep running Bravo..."
Run Bravo Run!!
 
  • Haha
  • Like
Reactions: 13 users

Wags

Regular
Quoting charles2: "Lots of mega-dollars with aspirations sloshing around the new Holy Grail of AI chips... Keep running Bravo..."
I think she is out shopping for a much bigger spa and extra clothes racks. @Bravo
 
  • Like
  • Haha
  • Fire
Reactions: 10 users

SiDEvans

Regular
This is beginning to feel like the good ole days.🤣
Imagine how we'll fly when we get a name or two attached along with a few ongoing dollars.🤣
Soon it'll be time to start looking at cars again........then come the boats....... then the aircraft.🤣
Then the islands.......🤣
Something to browse whilst we are sitzpinkling today.
 
  • Haha
  • Like
  • Love
Reactions: 21 users

Esq.111

Fascinatingly Intuitive.
Good Morning Chippers ,

Some quick, loose global share price equivalents for 🧠 🍟 at close of trade.

Australia 🇦🇺: $0.36 AU, up 26.327%.

Exchange rate: USD $1.00 = AUD $1.5306755.

Germany 🇩🇪: €0.24, up 38.34%, converted = $0.39598319 AU.

America 🇺🇸
A, $0.245, up 23.12%, converted = $0.3750154975 AU
B, ADR $10.50, up 33.93%, converted = $16.072092 AU
* the above ADR ÷ 40 units per ADR = $0.4018023 AU equivalent.

Regards,
Esq.
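For anyone wanting to check Esq.'s figures, the conversions are straightforward arithmetic using the quoted USD/AUD rate and the stated 40 ordinary shares per ADR (variable names here are illustrative only):

```python
usd_aud = 1.5306755   # quoted USD -> AUD exchange rate

otc_usd = 0.245       # US listing price (A), in USD
adr_usd = 10.50       # ADR price (B), in USD
shares_per_adr = 40   # ordinary shares bundled into one ADR

print(otc_usd * usd_aud)                   # ≈ 0.3750 AUD per share
print(adr_usd * usd_aud)                   # ≈ 16.0721 AUD per ADR
print(adr_usd * usd_aud / shares_per_adr)  # ≈ 0.4018 AUD per-share equivalent
```

The per-share ADR equivalent sitting a few cents above the ASX close is what makes the ÷40 line worth computing.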
 
  • Like
  • Love
  • Fire
Reactions: 89 users

IloveLamp

Top 20
Waiting for market open on Monday like
 
  • Haha
  • Like
Reactions: 30 users

Tothemoon24

Top 20
Mainstream media is starting to wake up to the upcoming edge revolution.



 
  • Like
  • Fire
Reactions: 28 users

charles2

Regular





AI drives explosion in edge computing


AI is driving massive demand for edge computing infrastructure, as industrial and commercial users need to process more data locally to take advantage of AI's capabilities.


Why it matters: The shift follows years of big tech companies pushing organizations to migrate data to cloud computing services in remote data centers.


Hardware and software providers alike are concluding that edge computing — moving processing power closer to where data is being generated — provides a bridge between 5G networks and cloud data centers. They're starting to package all three services.


Driving the news: Intel, AWS, Nokia and Ericsson announced collaborations Feb. 12 to deliver edge AI services for manufacturing plants, transportation hubs such as ports, and other complicated sites — such as the ancient Chichen Itza temple in Mexico.


Labs are popping up — including one in St. Louis announced Feb. 15 by Intel, World Wide Technology and Federated Wireless — to help companies test and build customized private networks.


"The whole industry is figuring out how to trim these [AI] models to fit at the edge without loss of accuracy," Sameer Vuyyuru, AWS head of worldwide telecommunications business development, tells Axios.


Many of the world's biggest tech companies are preparing to launch new edge AI hardware and software at Mobile World Congress in Barcelona, Spain, starting Feb. 25.


Be smart: The growth of edge AI is evidence that bigger is not always better in AI.


Edge computing can enable faster data processing for time-sensitive applications and compliance with high security and privacy standards — suiting it to sectors such as health and finance.


What's happening: "75% of data compute is moving to the edge," Kirk Skaugen, head of Lenovo's infrastructure business, tells Axios.


"We're working with a large car company to get rid of 250,000 embedded PCs and replace them with probably 25,000 edge servers," he said.


All the edge computing providers Axios spoke to said they're seeing high demand from organizations that operate in remote locations or have special security needs — from financial services to hospitals.


Details: Many new edge services are driven by customer demand.


In the case of remote oil and gas rigs, owners found they were collecting massive amounts of data that became obsolete within seconds, so they needed ways to process that data quicker, Intel vice president Caroline Chan told Axios.


Lenovo sees a huge market for edge AI to support smart city technologies — ranging from using computer vision to trigger responses to fires to helping vision-impaired people navigate streets.


By the numbers: The global market for edge computing is already worth over $200 billion a year.


Yes, but: To make better use of all the data they are producing, many large organizations will need to look for combinations of cloud computing, edge computing and private 5G networks to ensure their AI-enabled services run seamlessly.


What they're saying: "We've created a marketplace for private networks," AWS' Vuyyuru says.


"Private networks are really thriving in four environments: Remote locations, rugged environments, places where resilience is needed — two networks at the same time — and restricted environments, in terms of who needs to have access to it," he says.


Chan sees the most potential in "personalized private networks" — ones that customize edge computing servers and private 5G network hardware with AI application software to meet the specific needs of an organization.


"Move AI to where the data is," Skaugen advises.
 
  • Like
  • Love
  • Fire
Reactions: 35 users

Diogenese

Top 20





Quoting charles2's Axios article, "AI drives explosion in edge computing" (posted in full above).
The mountain is coming to Akida.
 
  • Like
  • Love
  • Fire
Reactions: 48 users

BigDonger101

Founding Member
The people who are threatening the second strike are hilarious.

Share price goes up, and a lot of them go quiet. So weak.

You guys really are just a joke.
 
  • Like
  • Fire
  • Love
Reactions: 21 users

IloveLamp

Top 20

[two screenshots attached]
 
  • Like
  • Love
  • Fire
Reactions: 57 users

Tothemoon24

Top 20
More links to Renesas & hearing aids
[two screenshots attached]
 
  • Like
  • Fire
  • Love
Reactions: 25 users
Hi All

I have been giving some thought to what could be driving interest in Brainchip other than all the information that is known and shared here on TSEx. One thing which I don’t think has attracted much attention, even here on TSEx, is the stature of the new Chief Technology Officer, Dr. M. Anthony Lewis (Tony). I have no doubt that TSEx regulars know all about him, but for those who don’t or need a refresher, the following is a snapshot of his CV from Wikipedia:

“M. Anthony Lewis is an American robotics researcher and currently serves as a Vice President of Hewlett-Packard and the head of Hewlett-Packard's Compute Lab for disruptive edge technologies. Formerly, he was Senior Director of Technology at Qualcomm Technologies and was the creator of the Zeroth neural processing unit and its software API. He is past CEO of Iguana Robotics, a company specializing in the development of biomorphic robotics technologies.[1] Lewis received his Ph.D. at the University of Southern California under the guidance of Michael Arbib and George Bekey.

He has served on the faculty of the University of California, Los Angeles and the University of Illinois and is currently on the faculty of the University of Arizona.

He is known for his work in evolutionary and biomorphic robotics, formation control of robotic systems, and investigations into the basis of movement control in humans and robots. He collaborated on a project to help paralyzed people, using studies of an eel's nerve circuitry.[2] In recent work, Lewis and colleagues have demonstrated a robot claimed to be the most biologically accurate model of human locomotion to date.[3] This robot uses a muscle architecture much like a human being's, a simplified neural circuit meant to mimic neurons in the spinal cord, and sensory feedback mimicking the primary sensory pathways found in humans.”

Now keeping this in mind, anyone, even the man in the street, knows that the big story in 2023 and again in 2024 is Sam Altman and ChatGPT. The Sam Altman story of course includes his quest to make ChatGPT more user friendly from a power consumption perspective, and the general media as well as the financial media are awash with stories about his trillion-dollar deals. ChatGPT, Large Language Models and GenAI are building to be the technology story of this decade, perhaps this century. If you search for the market size for LLMs there are numerous reports, all largely in line and pushing the common theme of sensational growth through to 2030. Statista has this to say:
  • The Natural Language Processing market is projected to reach US$29.19bn in 2024.
  • The market size is expected to show an annual growth rate (CAGR 2024-2030) of 13.79%, resulting in a market volume of US$63.37bn by 2030.
  • In global comparison, the largest market size will be in the United States (US$10,430.00m in 2024).
https://www.statista.com/outlook/tmo/artificial-intelligence/natural-language-processing/worldwide
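As a sanity check, those Statista figures are internally consistent: compounding the 2024 base at the stated CAGR for the six years to 2030 reproduces the quoted 2030 volume.

```python
base_2024 = 29.19   # US$bn, projected 2024 market size
cagr = 0.1379       # 13.79% annual growth rate
years = 6           # 2024 -> 2030

projected_2030 = base_2024 * (1 + cagr) ** years
print(round(projected_2030, 2))  # 63.37, matching the quoted US$63.37bn
```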

Coming back to Dr. Lewis, it is clear that he is extremely well known in the semiconductor industry and the academic world, and if you take the time to look on Google Scholar you will discover his academic contacts span the globe, including China.

So it is safe to say that when he posts on his LinkedIn account, a lot of very intelligent, well-connected eyes are watching.

Putting the importance of Large Language Models together with Dr. Lewis’s reputation I do not feel I am going out on a limb to suggest that the following is potentially something of great significance and likely to have caused a number of very important players in academia and industry to sit up and take notice:

[screenshot of Dr. Lewis's LinkedIn post]




I would also suggest that you do not need to be a rocket scientist to understand the significance of Dr. Lewis telling the world
that Brainchip’s technology is to his knowledge the FIRST in the world to run tiny LLMs at the EDGE with very low power with
STATE OF THE ART (SOTA) performance.

The other significant feature of his post is that he is putting the world on notice that Brainchip is not satisfied with just the
Edge but is also looking to scale up AKIDA technology into the data centre/Cloud for the same purpose.

I do not need to tell long term followers of AKIDA technology that it has been the case since day one that Peter van der Made and Anil Mankar have continuously stated that they chose to design in digital because it allowed AKIDA to be scaled for eventual data centre applications.

Of course, I may well be wrong, and this is just my anonymous opinion but the more I think about it the less I doubt my conclusion
that this is a significant driver of the current resurgence by Brainchip on the Australian, German and US markets.

My opinion only DYOR

Fact Finder
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 137 users

IloveLamp

Top 20

[screenshot attached]
 
  • Like
  • Love
Reactions: 18 users

hotty4040

Regular
Quoting Fact Finder's post on Dr. M. Anthony Lewis and edge LLMs (posted in full above).

True to form FF, much food for thought in abundance, IMHO.

I wonder what stage this scaling is at currently? We may learn lots at the upcoming session with Sean shortly.
I'll be "all ears" at this promising get-together with other chippers from the forum, i.e. if I can work out how to connect to it without too much hassle, hopefully.

Thnx for keeping me engaged. Love your input.

regards,

Akida Ballista >>>>> what a ride so far, ( exhausting though ) <<<<<

hotty...
 
  • Like
  • Love
  • Fire
Reactions: 28 users

Diogenese

Top 20
Quoting Fact Finder's post on Dr. M. Anthony Lewis and edge LLMs (posted in full above).
Edge LLMs are another area where we have a not-so-secret sauce: CNN2SNN for existing CNN LLMs, or simply native Akida SNN if you are starting from scratch. Most (all) AI companies have a major investment in their CNN models. These legacy CNN models can be refurbished as SNN models, and BRN has the tools to do it. If you want to add a little French polish to your LLMs, or double your Dutch LLMs' performance, try Akida MetaTF; it comes in 3 flavours, so there's something for everyone.
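For anyone curious what "refurbishing" a CNN as an SNN means in principle: in rate-based conversion, a ReLU activation is replaced by an integrate-and-fire neuron whose spike rate over a time window approximates the original activation. The toy below illustrates only that principle; it is a hypothetical sketch, not BrainChip's CNN2SNN tooling or its API.

```python
def if_neuron_rate(activation: float, timesteps: int = 100) -> float:
    """Integrate-and-fire neuron with soft reset: the fraction of
    timesteps on which it spikes approximates the ReLU activation
    it replaces (for activations in [0, 1])."""
    v, spikes = 0.0, 0
    for _ in range(timesteps):
        v += activation          # integrate the input each timestep
        if v >= 1.0:             # threshold crossed: emit a spike
            v -= 1.0             # soft reset keeps the residual charge
            spikes += 1
    return spikes / timesteps

for a in (0.0, 0.25, 0.8):
    print(a, if_neuron_rate(a))  # spike rate tracks the activation
```

This is why converted SNNs can match CNN accuracy: the spiking dynamics reproduce the same numbers, just spread over time, with events (spikes) only where there is signal.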
 
  • Like
  • Love
  • Fire
Reactions: 74 users