
Nintendo Switch 2 to Launch Q1 2025, Report Claims
In a new report, it has been claimed that the Nintendo Switch 2 will be released in Q1 2025 and that games are in development for it now.

Good morning. Guten Morgen. I must say, with all the BRN action this week, I've turned into a sitzpinkler... just so I can keep my eye on the trading...
Had to google that, wish I hadn't.
I think they have a cream for that.
Run Bravo Run!!
Lots of mega-dollars with aspirations sloshing around the new Holy Grail of AI chips/energy savings/cloud-avoidance security, with the magical black box (BRN) apparently holding many of the essential codes for Grail realization.
Convince me that this monster accumulation is merely the market playing catch-up on an undervalued equity, and not the expression of a Holy Shit epiphany.
The major obstacle for shareholders may be whether Brainchip can cling to independence while our surreal/magic sauce has time to play out unfettered.
Keep running Bravo...we are cheering for you.
I think she is out shopping for a much bigger spa and extra clothes racks. @Bravo
Something to browse whilst we are sitzpinkling today.
This is beginning to feel like the good ole days.
Imagine how we'll fly when we get a name or two attached along with a few ongoing dollars.
Soon it'll be time to start looking at cars again........then come the boats....... then the aircraft.
Then the islands.......
The mountain is coming to Akida.
AI drives explosion in edge computing — Axios
AI is driving massive demand for edge computing infrastructure, as industrial and commercial users need to process more data locally to take advantage of AI's capabilities.
Why it matters: The shift follows years of big tech companies pushing organizations to migrate data to cloud computing services in remote data centers.
Hardware and software providers alike are concluding that edge computing — moving processing power closer to where data is being generated — provides a bridge between 5G networks and cloud data centers. They're starting to package all three services.
Driving the news: Intel, AWS, Nokia and Ericsson announced collaborations Feb. 12 to deliver edge AI services for manufacturing plants, transportation hubs such as ports, and other complicated sites — such as the ancient Chichen Itza temple in Mexico.
Labs are popping up — including one in St. Louis announced Feb. 15 by Intel, World Wide Technology and Federated Wireless — to help companies test and build customized private networks.
"The whole industry is figuring out how to trim these [AI] models to fit at the edge without loss of accuracy," Sameer Vuyyuru, AWS head of worldwide telecommunications business development, tells Axios.
Many of the world's biggest tech companies are preparing to launch new edge AI hardware and software at Mobile World Congress in Barcelona, Spain, starting Feb. 25.
Be smart: The growth of edge AI is evidence that bigger is not always better in AI.
Edge computing can enable faster data processing for time-sensitive applications and compliance with high security and privacy standards — suiting it to sectors such as health and finance.
What's happening: "75% of data compute is moving to the edge," Kirk Skaugen, head of Lenovo's infrastructure business, tells Axios.
"We're working with a large car company to get rid of 250,000 embedded PCs and replace them with probably 25,000 edge servers," he said.
All the edge computing providers Axios spoke to said they're seeing high demand from organizations that operate in remote locations or have special security needs — from financial services to hospitals.
Details: Many new edge services are driven by customer demand.
In the case of remote oil and gas rigs, owners found they were collecting massive amounts of data that became obsolete within seconds, so they needed ways to process that data quicker, Intel vice president Caroline Chan told Axios.
Lenovo sees a huge market for edge AI to support smart city technologies — ranging from using computer vision to trigger responses to fires to helping vision-impaired people navigate streets.
By the numbers: The global market for edge computing is already worth over $200 billion a year.
Yes, but: To make better use of all the data they are producing, many large organizations will need to look for combinations of cloud computing, edge computing and private 5G networks to ensure their AI-enabled services run seamlessly.
What they're saying: "We've created a marketplace for private networks," AWS' Vuyyuru says.
"Private networks are really thriving in four environments: Remote locations, rugged environments, places where resilience is needed — two networks at the same time — and restricted environments, in terms of who needs to have access to it," he says.
Chan sees the most potential in "personalized private networks" — ones that customize edge computing servers and private 5G network hardware with AI application software to meet the specific needs of an organization.
"Move AI to where the data is," Skaugen advises.
Hi All
I have been giving some thought to what could be driving interest in Brainchip beyond all the information that is already known and shared here on TSEx. One thing which I don't think has attracted much attention, even here on TSEx, is the stature of the new Chief Technology Officer, Dr. M. Anthony Lewis (Tony). I have no doubt that TSEx regulars know all about him, but for those who don't, or who need a refresher, the following is a snapshot of his CV from Wikipedia:
“M. Anthony Lewis is an American robotics researcher and currently serves as the Vice President of Hewlett-Packard and the head of Hewlett-Packard's Compute Lab for disruptive edge technologies. Formerly, he served as Senior Director of Technology at Qualcomm Technologies and was the creator of the Zeroth neural processing unit and its software API. He is past CEO of Iguana Robotics, a company specializing in the development of biomorphic robotics technologies.[1] Lewis received his Ph.D. at the University of Southern California under the guidance of Michael Arbib and George Bekey.
He has served on the faculty of the University of California, Los Angeles and the University of Illinois and is currently on the faculty of the University of Arizona.
He is known for his work in evolutionary and biomorphic robotics, formation control of robotic systems, and investigations into the basis of movement control in humans and robots. He collaborated on a project to help paralyzed people, using studies of an eel's nerve circuitry.[2] In recent work, Lewis and colleagues have demonstrated a robot claimed to be the most biologically accurate model of human locomotion to date.[3] This robot uses a muscle architecture much like a human being's, a simplified neural circuit meant to mimic neurons in the spinal cord, and sensory feedback mimicking the primary sensory pathways found in humans.”
Now, keeping this in mind, anyone, even the man in the street, knows that the big story in 2023, and again in 2024, is Sam Altman and ChatGPT. The Sam Altman story of course includes his quest to make ChatGPT more viable from a power-consumption perspective, and the general media as well as the financial media are awash with stories about his trillion-dollar deals. ChatGPT, Large Language Models and GenAI are building to be the technology story of this decade, perhaps this century. If you search for the market size for LLMs, there are numerous reports, all largely in line and pushing the common theme of sensational growth through to 2030. Statista has this to say:
https://www.statista.com/outlook/tmo/artificial-intelligence/natural-language-processing/worldwide
- The Natural Language Processing market is projected to reach US$29.19bn in 2024.
- The market is expected to grow at an annual rate (CAGR 2024-2030) of 13.79%, resulting in a market volume of US$63.37bn by 2030 (the quick compounding check below shows how these two figures fit together).
- In global comparison, the largest market will be in the United States (US$10.43bn in 2024).
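As a quick sanity check on those Statista numbers (my own back-of-the-envelope arithmetic, not from the report), compounding US$29.19bn at 13.79% per year over the six years from 2024 to 2030 lands almost exactly on the quoted US$63.37bn:

```python
# Back-of-the-envelope check of the Statista figures quoted above.
start_2024 = 29.19   # US$ billions, projected 2024 market size
cagr = 0.1379        # 13.79% compound annual growth rate
years = 6            # 2024 -> 2030

projected_2030 = start_2024 * (1 + cagr) ** years
print(f"Projected 2030 market size: US${projected_2030:.2f}bn")  # ~US$63.37bn
```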
Coming back to Dr. Lewis, it is clear that he is extremely well known in the semiconductor industry and the academic world, and if you take the time to look on Google Scholar you will discover that his academic contacts span the globe, including China.
So it almost goes without saying that when he posts on his LinkedIn account, a lot of very intelligent, well-connected eyes are watching.
Putting the importance of Large Language Models together with Dr. Lewis's reputation, I do not feel I am going out on a limb to suggest that the following is potentially of great significance, and likely to have caused a number of very important players in academia and industry to sit up and take notice:
View attachment 57068 (screenshot of Dr. Lewis's LinkedIn post)
I would also suggest that you do not need to be a rocket scientist to understand the significance of Dr. Lewis telling the world that Brainchip's technology is, to his knowledge, the FIRST in the world to run tiny LLMs at the EDGE with very low power and STATE OF THE ART (SOTA) performance.
The other significant feature of his post is that he is putting the world on notice that Brainchip is not satisfied with just the Edge but is also looking to scale up AKIDA technology into the data centre/Cloud for the same purpose.
I do not need to tell long-term followers of AKIDA technology that, since day one, Peter van der Made and Anil Mankar have continuously stated that they chose to design in digital because it allowed AKIDA to be scaled for eventual data centre applications.
Of course, I may well be wrong, and this is just my anonymous opinion, but the more I think about it the less I doubt my conclusion that this is a significant driver of the current resurgence in Brainchip on the Australian, German and US markets.
My opinion only DYOR
Fact Finder
Edge LLMs are another area where we have a not-so-secret sauce: CNN2SNN for existing CNN LLMs, or simply native Akida SNN if you are starting from scratch. Most (all) AI companies have a major investment in their CNN models. These legacy CNN models can be refurbished as SNN models, and BRN has the tools to do it. If you want to add a little French Polish to your LLMs, or double your Dutch LLMs' performance, try Akida MetaTF - it comes in 3 flavours, so there's something for everyone.
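For anyone curious what that CNN-to-SNN "refurbishment" looks like in practice, here is a minimal sketch using BrainChip's MetaTF tooling. It is illustrative only: the toy Keras model is a placeholder, the quantization step and its arguments vary between MetaTF versions (newer releases moved quantization into the separate quantizeml package), and this is my reading of the public docs rather than anything confirmed by the company.

```python
# Illustrative sketch only: converting an existing Keras CNN into an Akida SNN
# with BrainChip's MetaTF tooling (pip install cnn2snn). Layer choices and the
# quantize() arguments below are assumptions based on public docs and vary by
# MetaTF version, so check the current documentation before relying on this.
import tensorflow as tf
from cnn2snn import quantize, convert

# Placeholder CNN standing in for a company's existing trained model.
keras_model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(16, (3, 3), padding="same", activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10),
])

# Step 1: quantize weights and activations to the low bit-widths Akida expects.
quantized_model = quantize(keras_model,
                           input_weight_quantization=8,
                           weight_quantization=4,
                           activ_quantization=4)

# Step 2: convert the quantized Keras model into a native Akida (SNN) model.
akida_model = convert(quantized_model)
akida_model.summary()
```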
Comment from Mark Grogan of our partner Circle8