BRN Discussion Ongoing



buena suerte :-)

BOB Bank of Brainchip
Is this the first time we have publicly stated that the company intends to "enable radical innovation required to bring LLMs to the edge"?
Although I must admit my disappointment whenever I see another CR, I was very pleased to read what this one was for.
Overall, happy days.
I agree, Boab... I think this is going to be our much-needed big step up moving forward!!!!

“The company will also bolster the CTO function, enabling radical innovation required to bring large language models, multi-modal operation and other state of art AI to the edge and ensure we remain the industry leaders in hyper-efficient Edge AI” SH
 

AARONASX

Holding onto what I've got
Nice find. Do you think this is us?

Not mentioned in the product description, but lots of familiar phrases… "Fanless", "supports neural networks"



It mentions that the CPU is an Intel® Atom® x6414RE Processor SoC with an Intel® Gaussian & Neural Accelerator 1.0.


Possibly not, as it has been around since 2022, unless they work together.


Diogenese

Top 20
I need help from an expert here, but isn’t Vision Transformers at the Edge an Akida 2 speciality?

ViT is a new feature of Akida 2, but, according to Prof. Wiki, the idea of transformers has been around for several years, although the concept was only applied to vision a couple of years ago.
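
For anyone new to the idea, the front end of a vision transformer is simple enough to sketch: the image is cut into fixed-size patches, each patch is flattened and linearly projected, and the resulting sequence of patch tokens is what the transformer encoder attends over. Below is a minimal NumPy illustration of that patching step; the sizes and the random projection are invented for the example, and it says nothing about how Akida 2 actually implements ViT.

```python
# Minimal sketch of the vision-transformer (ViT) front end: cut an image
# into patches, flatten each patch, and linearly project it into a token.
# Purely illustrative -- the projection below is random, where a trained
# model would use learned weights.
import numpy as np

def patchify(image, patch=16):
    """Split an (H, W, C) image into flattened (patch*patch*C) vectors."""
    h, w, c = image.shape
    rows, cols = h // patch, w // patch
    return (image[:rows * patch, :cols * patch]
            .reshape(rows, patch, cols, patch, c)
            .transpose(0, 2, 1, 3, 4)
            .reshape(rows * cols, patch * patch * c))

rng = np.random.default_rng(0)
img = rng.random((224, 224, 3))         # dummy 224x224 RGB image
tokens = patchify(img)                  # (196, 768): 14x14 patches
W = rng.normal(0, 0.02, (768, 384))     # stand-in for a learned projection
embeddings = tokens @ W                 # (196, 384) patch embeddings
print(embeddings.shape)
```

From there, the attention layers treat the 196 patch tokens exactly like words in a sentence, which is why the jump from language to vision took so little architectural change.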
 

Bravo

If ARM was an arm, BRN would be its biceps💪!

Sony Semiconductor Brings Inference Close To The Edge​

Steve McDowell
Contributor
Chief Analyst & CEO, NAND Research.


https://www.forbes.com/sites/stevem...close-to-the-edge/?sh=660e80bb34f9#open-web-0
Mar 27, 2024, 04:12pm EDT
Sony Semiconductor Solutions Group (NurPhoto via Getty Images)

AI only matters to businesses if the technology enables competitive differentiation or drives increased efficiencies. The past year has seen technology companies focus on training models that promise to change enterprises across industries. While training has been the focus, the recent NVIDIA GTC event showcased a rapid transition towards inference, where the actual business value lies.


AI at the Retail Edge​

Retail is one of the industries that promises to benefit most from AI. Generative AI and large language models aside, retail organizations are already deploying image recognition systems for diverse tasks, from inventory control and loss prevention to customer service.

Earlier this year, Nvidia published its 2024 State of AI in Retail and CPG report, which takes a survey-based approach to understanding the use of AI in the retail sector. Nvidia found that 42% of retailers already use AI, with an additional 34% assessing or piloting AI programs. Narrowing the aperture to large retailers with revenues of more than $500 million, AI adoption stretches to 64%. That's a massive market.


The challenge for retailers and the array of ecosystem partners catering to them is that AI can be complex. Large language models and generative AI require infrastructure that scales beyond the capabilities of many retail locations. Using the cloud to solve those problems isn't always practical, either, as applications like vision processing need to be done at the edge, where the data lives.


Sony’s Platform Approach to On-Device Inference​

Sony Semiconductor Solutions Corporation took on the challenge of simplifying vision processing and inference, resulting in the introduction of its AITRIOS edge AI sensing platform. AITRIOS addresses six significant challenges of cloud-based IoT systems: handling large data volumes, enhancing data privacy, reducing latency, conserving energy, ensuring service continuity, and securing data.


AITRIOS accelerates the deployment of edge AI-powered sensing solutions across industries, enabling a comprehensive ecosystem for creating solutions that blend edge computing and cloud technologies.


 

TheDrooben

Pretty Pretty Pretty Pretty Good
Nice find. Do you think this is us?

Not mentioned in the product description, but lots of familiar phrases… "Fanless", "supports neural networks"



I saw that it was updated to Cupcake "2" and is fanless, so Larry has his fingers crossed... although there's no specific mention that we are in the new configuration of Cupcake edge AI servers.
 

DJM263

LTH - 2015
Can I ask you, please, about the Arm Cortex M55, and do we know if BRN has been, or can be, integrated into it yet?
Hi @Smoothsailing

I actually sent the company an email back when the Arm Cortex M55 was announced (23/11/23) asking if Akida had been used in the same way as in the Cortex M85!

The following is the only response I got. Understanding that PVDM is very busy, I thought I would just sit back and let things play out...

23/11/2023

Thank you for your email.

I have forwarded your email to Peter van der Made, and asked him to provide a response in regard to your question about Cortex M-52.

When Peter sends his response, I will forward it to you.

Regards


Tony Dawe
Director, Global Investor Relations

+61 (0)405 989 743
 

stuart888

Regular
The Nvidia YouTube channel is unreal. I also like the Broadcom YouTube channel, to see the open system they are building to compete with Nvidia.

Note: Nvidia is Broadcom's largest customer!

Meta and Google buy custom neural chips from Broadcom, and a third customer has been announced: Amazon, Apple, or ByteDance? Broadcom builds their custom chips start to finish in 18 months or less. They have a process perfected over 10 years. Very interesting.



NVIDIA Holodeck is a virtual reality (VR) innovation platform that brings designers, peers, and stakeholders together from anywhere in the world to build and explore creations in a highly realistic, collaborative, and physically simulated VR environment.

See how NVIDIA Holodeck can bring teams together, speed up productivity, and improve your entire creative process.
 
Smoothsailing

Thanks @DJM263. My reason for this question relates to this latest announcement regarding Ambiq using the M55, and whether we could be involved. 🫤
 

stuart888

Regular

What did you say, Akida SNN power saver? Low-power AI MD!
"Artificial Intelligence may never be able to fully replace a doctor 👨‍⚕️, but it can certainly help supplement their work, speed up diagnostic processes, or offer data-driven 🧠 analyses to assist with decision making."
Go AI!
 

Bravo

If ARM was an arm, BRN would be its biceps💪!
Here's an article about Sony's LEV-2 lunar robot with its low-power IoT board computer, SPRESENSE.

BTW, a few months ago Rob Telson "liked" a post about SPRESENSE in terms of its intelligent HVAC capabilities.

Let's hope Sony is keeping abreast of our impressive inroads in the space vertical! We'd make great partners!

Sony's Technology Leading to Space:​

Behind the scenes of LEV-2 lunar exploration robot development​

March 7, 2024

On January 25, 2024, a photograph of Japan's first successful pinpoint lunar landing by the Smart Lander for Investigating Moon (SLIM) was released. The historic photo was captured by the Transformable Lunar Robot LEV-2 (nicknamed "SORA-Q"), jointly developed by Sony Group Corporation, the Japan Aerospace Exploration Agency (JAXA), TOMY Company, Ltd. (TOMY), and Doshisha University. We spoke with Masaharu Nagata (Exploratory Deployment Group, Technology Platform, Sony Group Corporation), who led the Sony team, about what became the world's first fully autonomous robot lunar exploration project, and about the Sony technologies that made it possible.


World's first fully autonomous robot​

that captured SLIM landing on the moon​

LEV-2, jointly developed by the four parties, is the world's smallest and lightest fully autonomous lunar exploration robot at approx. 80mm in diameter (before transformation) and weighing around 250g. TOMY's knowledge and ideas from toy development have been put to use in the endearing spherical form and the frame design which allows it to travel across the lunar surface in a motion inspired by sea turtles.

LEV-2 was launched into space on board the SLIM lander and was released onto the lunar surface together with LEV-1, an ultra-compact Lunar Excursion Vehicle, on January 20, 2024. It then autonomously found the SLIM, taking pictures as it moved along, and transmitted high-quality image data of the lunar lander and its surroundings to LEV-1, thereby delivering photos of the lunar surface to Earth.

©JAXA/TOMY/Sony Group Corporation/Doshisha University

IoT board computer "SPRESENSE" and Sony's technology combine​

to make a super compact and lightweight robot​

The development of the lunar exploration robot was begun in 2016 by JAXA's Space Exploration Innovation Hub and TOMY. Sony joined in 2019, and Doshisha University followed in 2021. Sony's engineers were assigned the mission of realizing a fully autonomous robot from the concept created by JAXA and TOMY for "a sphere-shaped robot that transforms to move around on uneven terrain." In addition, Sony led the selection of components as well as the technological development for the robot's motion control and imaging.

During the early stage of development, the team singled out "SPRESENSE™", a low-power IoT board computer developed and commercialized by Sony Semiconductor Solutions, which they determined to be the most suitable technology for what would be the world's smallest and lightest lunar exploration robot. Despite SPRESENSE's small size and low power consumption, it boasts high performance and can be programmed to work with camera and sensing devices, enabling the robot to drive autonomously, capture images and perform other operations. After repeated environmental tests to verify its resistance to radiation and external shocks, SPRESENSE was selected as the core processor controlling all the robot's movements. The development environment and source code are open to the public, which was also an advantage in working together with partners.

Operation board installed on LEV-2: the SPRESENSE mainboard mounted with components such as "ISX012" image sensors
However, minimizing the system load would be necessary to allow all tasks, including autonomous mobility, image capturing, and wireless communication, to be operated by SPRESENSE. To this end, engineers who had honed their skills in device design and development worked steadily on configuring the circuits and boards.
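
To make the system-load point concrete, the standard trick on a single small processor is cooperative scheduling: every task is broken into short, non-blocking steps that run in one loop. The sketch below is purely hypothetical Python to illustrate the pattern; the task names and timings are invented and are not from the LEV-2 firmware.

```python
# Hypothetical illustration of cooperative scheduling on one processor:
# each task runs as a short, non-blocking step whenever its period elapses.
# Task names and periods are invented for the example.
import time

def drive_step():    pass   # one small increment of motion control
def capture_step():  pass   # grab or process one camera frame
def transmit_step(): pass   # push one packet toward the relay

TASKS = [
    (drive_step,    0.02),  # (task, period in seconds)
    (capture_step,  0.50),
    (transmit_step, 0.10),
]

def run(duration_s=1.0):
    """Round-robin loop: run each task whenever its period has elapsed."""
    next_due = [0.0] * len(TASKS)
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        now = time.monotonic()
        for i, (task, period) in enumerate(TASKS):
            if now >= next_due[i]:
                task()              # must return quickly -- no blocking
                next_due[i] = now + period

run()
```

The design constraint is the same one the article describes: no single task may hog the processor, so every subsystem has to be written to do a little work and yield.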

A further challenge was image processing, which is a vital element in enabling the robot to travel to the proper position as well as capture images. It was not possible to perfectly replicate the space environment and the conditions of the lunar landing for the tests on Earth. "Without knowing the correct answer, we considered how to reliably detect and photograph the SLIM, and finally devised a system by which the LEV-2 could recognize the SLIM from the gold color of its exterior insulation. We then repeatedly experimented with every conceivable situation, including various light and shadow contrasts and reflections, to improve the accuracy of the technology."
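
As an illustration of how a colour cue like that can work, here is a rough, hypothetical sketch of gold-pixel detection with a centroid target; the thresholds are invented for the example, and this is not Sony's actual algorithm.

```python
# Hypothetical sketch of colour-based target detection: flag pixels whose
# channel ratios look 'gold' (strong red and green relative to blue), then
# report the centroid of those pixels as a steering target.
# Thresholds are illustrative guesses, not Sony's values.
import numpy as np

def gold_mask(rgb):
    """Boolean mask of gold-ish pixels in a float RGB image in [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    brightness = rgb.mean(axis=-1)
    return (r > 1.2 * b) & (g > 1.1 * b) & (brightness > 0.25)

def target_centroid(rgb):
    """Mean (row, col) of gold pixels, or None if too few are found."""
    ys, xs = np.nonzero(gold_mask(rgb))
    if len(ys) < 50:                    # ignore specks and specular noise
        return None
    return ys.mean(), xs.mean()

frame = np.zeros((120, 160, 3))
frame[40:80, 60:100] = [0.8, 0.6, 0.1]  # synthetic gold-coloured lander
print(target_centroid(frame))           # ~ (59.5, 79.5)
```

The hard part, as the article says, is not the happy path but the edge cases: shadows, glare and reflections all push pixels across any fixed threshold, which is why the team had to test so many lighting situations.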

Image data of the lunar surface captured and transmitted by LEV-2
(Test image received through data transmission by test radio from LEV-2 to the radio station via LEV-1)

Expertise in manufacturing​

enabling a new approach in space exploration​

Nagata is also involved in Sony's "Earth MIMAMORI Platform" project, which aims to contribute to the sustainability of the Earth and society through Sony's technologies, and he has been working on new communication technologies that leverage outer space. Through this experience, he is aware of how challenging space development is: you face risks and harsh environments completely different from those on Earth, including the impact of launch, temperature changes and cosmic rays.
"Every single component needs to be specialized for space, which requires a great deal of time and effort to develop. With that in mind, I feel that the development of the robot by these four parties proceeded quite quickly," he says.

In fact, an automated robot prototype was completed within about a year after Sony's participation in 2019. This allowed ample time for field testing and environmental testing by JAXA. This speedy development was made possible by a new approach taken by TOMY and Sony, both with roots in manufacturing, though in different fields.

"LEV-2 utilizes consumer devices for its components, and its frame is simple enough to be assembled by hand. Moreover, both companies excel at mass production of consistent and high-quality products. We were able to efficiently improve the overall quality of LEV-2 during its development by building multiple prototypes equipped with the same technology and testing them in parallel to verify the technologies each party was responsible for."

This was a novel approach from a new perspective; in conventional development, experiments were repeated with the single actual vehicle to be launched into space.

Prototype of LEV-2 with the same core technology as the actual robot that completed lunar exploration
Recently, more companies have shown an interest in space, entering the field by developing new technologies such as satellites or launching new businesses. Looking back on the joint research, Nagata says, "We were able to prove with our own hands that the technologies and skills we have honed at Sony can contribute to space, which is very gratifying. I also realized firsthand that the fundamental principles of manufacturing remain the same, whether on the ground or in space. I would like to continue exploring the potential of Sony's technologies and development expertise in new fields such as outer space."

Sony's R&D team who worked on the joint research project
 

Bravo

If ARM was an arm, BRN would be its biceps💪!
"Renesas Electronics Corporation has announced what it said is the industry’s first general-purpose 32-bit RISC-V-based microcontrollers (MCUs) built with an internally developed CPU core."

Could be something. Could be nothing.



 

JB49

Regular
OK this is a longgggg dot connection.

A few weeks ago, Circle8 posted on LinkedIn that they are using BrainChip, and this bloke commented on the post.



Schneider Electric just teamed up with NTT Data for edge AI: https://datacenternews.asia/story/ntt-data-schneider-electric-team-up-for-ai-edge-computing. And they have also teamed up with Nvidia: https://www.se.com/uk/en/about-us/n...-for-ai-data-centres-65f957253d3c1a3da2093ee8.

NTT Data has experimented with neuromorphic computing and spiking neural networks in the past: https://group.ntt/en/newsrelease/2021/04/23/210423a.html.

I wouldn't usually make anything of this, but the comment from Stephen McGurk that BrainChip "are linked in" may mean we are somehow involved in all of this.
 

jtardif999

Regular
You will note that I didn't mention the day-to-day share price. I referred to the value of the company. Whether we like it or not, that is measured in the wider market by commercial success and a share price that attaches to it.

As I have said many times, I am a true believer in the technology but am concerned about how management is parlaying the technology into commercial success.

Pre-revenue valuations for publicly listed companies are based on something that may or may not happen in the future. Patents have an intrinsic value, but only if they attach to a product that the market believes will be able to generate revenue. Likewise partnerships. In my view, management has not been transparent enough to convince the market that commercial success is inevitable, and consequently the value of the company is depressed. The current state of the enterprise may speak value to you, but clearly it does not do so to the market. And to be quite frank, it's what the market perceives that dictates enterprise value, not what you or I would like to believe.
When BrainChip starts generating consistent revenue, the market will adjust the valuation. Management can't improve that circumstance, imo. Unfortunately it takes time to turn all the work being done into actual revenue. They are building a top-down revenue stream through partners who have the capability to scale, many of whom have NDAs for competitive advantage. It's a recipe which tends to keep us in the dark more than any of us would be comfortable with, but it is what it is. Eventually we will appreciate why the company has had to travel this path, I think.
 

Bravo

If ARM was an arm, BRN would be its biceps💪!
Anyone know anything about the Azalea Satellite Cluster? Definitely worth looking into IMO.




DSEI 2023: BAE Systems Says First Azalea Satellite Cluster to be Launched in Early 2025​

Peter Felstead
25 September 2023


BAE Systems announced at DSEI 2023 on 14 September that it is set to launch its first Azalea multi-satellite cluster into low Earth orbit at the beginning of 2025. The group of satellites will use a range of sensors to collect visual, radar and radio frequency (RF) data to deliver high-quality information and intelligence from space in real time to military customers.
The venture follows BAE’s acquisition of UK firm In-Space Missions last year and is being delivered in partnership with Finland’s ICEYE, which will provide the Azalea cluster’s synthetic aperture radar (SAR) capabilities.
Each Azalea cluster will feature four satellites, combining an ICEYE SAR satellite with three others that provide RF data gathering, optical imaging and cluster computing to combine and analyse the data in space before delivering an assured intelligence product directly to the user back on Earth.
Speaking at DSEI 2023, Andy Challen, UK missions sales lead for ICEYE, explained that the data from the Azalea cluster “will be fused in space and then come down as a more complete picture”.
Each Azalea cluster will feature a SAR satellite, an optical imaging satellite, an RF-gathering satellite and a processing satellite to fuse all of the gathered data in space, all of which will be enabled and updatable by software-defined radio. (Image: BAE Systems)
The in-space processing of the data gathered by the Azalea satellites will provide much more timely intelligence, as existing space-based sensors require multiple terabytes of raw data to be transferred to Earth before that data can be processed and distributed.
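A back-of-the-envelope calculation shows why fusing in orbit matters. The data volumes and link rate below are assumptions invented for illustration; they are not figures from BAE Systems or ICEYE.

```python
# Illustrative-only downlink arithmetic with assumed numbers: compare
# shipping raw multi-sensor data to Earth against shipping a small fused
# intelligence product. None of these figures come from the article.
RAW_PER_PASS_GB = 2000.0    # assumed raw sensor data per pass (2 TB)
FUSED_PER_PASS_GB = 0.5     # assumed fused intelligence product
DOWNLINK_MBPS = 500.0       # assumed downlink rate (megabits per second)

def downlink_hours(volume_gb, rate_mbps=DOWNLINK_MBPS):
    """Hours needed to downlink volume_gb at rate_mbps (1 GB = 8000 Mbit)."""
    return volume_gb * 8000.0 / rate_mbps / 3600.0

print(f"raw data:      {downlink_hours(RAW_PER_PASS_GB):6.2f} h")    # ~8.9 h
print(f"fused product: {downlink_hours(FUSED_PER_PASS_GB):6.4f} h")  # ~8 s
```

Whatever the real numbers, the ratio is the point: processing in space turns a downlink problem measured in hours into one measured in seconds, which is what makes real-time intelligence plausible.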
Each of the satellites will be enabled by software-defined radio (SDR), allowing them to be fully reconfigured and updated while in orbit. Doug Little, CEO of In-Space Missions, said at DSEI 2023 of the Azalea constellation, "We say we have a software-defined satellite capability."
Also speaking during the Azalea briefing at DSEI 2023, Elizabeth Seward, head of space strategy and future business for BAE Systems' Digital Intelligence sector, noted that the initial cluster will be launched by US satellite launch provider SpaceX. Seward said that, following the launch of the initial four-satellite Azalea cluster, which will provide an initial operational capability, subsequent additions to the constellation would be rolled out according to customer requirements. She added that a 12-satellite constellation could provide comprehensive coverage, but that, as the constellation grows, BAE would look to put satellites into multiple orbital planes.
ICEYE already has 27 SAR satellites in orbit, which has significantly de-risked the Azalea system’s SAR capabilities and already provided the team with SAR data to work with, but the ICEYE SAR satellite launched as part of the initial Azalea cluster will be its first SDR-enabled satellite.
The Azalea programme supports the UK government’s Defence Space Strategy, published in February this year, which identified Earth observation as a priority area to help protect and defend UK interests. BAE has stated that this is a sovereign capability that the Azalea programme could provide.
As well as providing military intelligence, such as the location of hostile platforms and weapon systems, the Azalea constellation will also be able to assist during natural disasters, for example by locating people at risk.

 


7für7

Regular
This also sounds very promising. Could be something… could be nothing.

“Researchers trained artificial intelligence (AI) to detect and scrutinise the sound of excretion. Someday, it will be useful for the diagnosis of high-fatality infectious diseases such as cholera, and to prevent a potential outbreak (explosive spread of infectious diseases).”

 

7für7

Regular
Fujitsu and TOTO are running a collaboration trial for a safer toilet experience: recognising when older people have an accident in the toilet, and so on.


Fujitsu's AI website… since they are Japanese, maybe they are a partner of one of our ecosystem partners? I haven't researched it yet.

 
