BRN Discussion Ongoing

Diogenese

Top 20
Whatever you're selling, I'm not interested in buying, thanks.

Someone keeps ringing me at least 50 times a day trying to sell me solar panels and that's annoying enough for me to contend with at the moment.
... you'll need a battery with that!
 

ArtGonzo

Emerged
https://x.com/NueralNews/status/1915311018504327329
 

Frangipani

Top 20


This poster titled GRAIN - Radiation-Tolerant Edge AI, presented by Kenneth Östberg, one of its two co-authors (the other being Daniel Andersson), during the “RISC-V in Space” workshop in Gothenburg on Thursday…


View attachment 81573
View attachment 81577

…finally reveals what NEURAVIS stands for - the name of that R&T project, which ESA awarded to the five consortium partners Airbus Toulouse, Airbus Ottobrunn, BrainChip, Frontgrade Gaisler and Neurobus in mid-2024 (see the July 2024 LinkedIn post by Airbus Space Project Manager Jérémy Lebreton below):

Neuromorphic Evaluation of Ultra-low-power Rad-hard Acceleration for Vision Inferences in Space.

The poster also provides more information regarding the use cases currently being explored in the NEURAVIS project, although I’m afraid I couldn’t decipher everything due to the small print - maybe someone with eagle eyes or a magic tool to blow up the photo and unblur the small print can add to or correct what I’ve gathered so far:

1. Moon landing
Use Case #1: Vision-Based Navigation for Lunar Lander

Also see Alf Kuchenbuch’s recent comment on Argonaut, ESA’s lunar lander programme:

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-452257

View attachment 81579

2. Debris detection/collect (?)
Use Case #2: Monitoring (?) Building Block for In-orbit Maintenance


3. Docking


4. Object ? (looks like “simulation”, but appears to be a longer word?)



In addition, the poster lists four “Application scenarios” for GRAIN’s Radiation-Tolerant Edge AI:

1. Remote Terminal Unit
2. Stand-alone Controller
3. Near-edge processing unit
4. Auxiliary data-processing module

Lots of small print to decipher here as well! 🔍





View attachment 81578


If I understand the above post correctly, we have yet to hear what the NEURAVIS proposal envisages for the AKD1500 (“BrainChip is proud that Airbus selected Akida for both COTS chips and IP in their proposal. ESA awarded the Airbus “NEURAVIS” proposal, including Akida in the Akida 1500 chip and on an FPGA together with Frontgrade Gaisler’s NOEL-V processor.”).

Whereas the underlined passage appears to refer to Frontgrade Gaisler’s newly revealed GR801 SoC that will incorporate Akida 1.0 IP - greatly benefitting the work of Airbus Toulouse computer vision experts such as Jérémy Lebreton (project lead) and Roland Brochard, as can be inferred from the GRAIN poster’s four listed use cases - there must then be another specific proposal by Airbus for how to utilise our COTS chip AKD1500.

So I presume Airbus Ottobrunn and Neurobus might be the consortium partners currently collaborating on that second part of the NEURAVIS proposal?

View attachment 81582 View attachment 81583




View attachment 81584

Here is a close-up of the GRAIN - Radiation-Tolerant Edge AI poster Kenneth Östberg presented at the “RISC-V in Space” workshop last month - so no more deciphering of the small print needed 😊:

As I already mentioned the other day 👆🏻, it finally reveals what NEURAVIS stands for: Neuromorphic Evaluation of Ultra-low-power Rad-hard Acceleration for Vision Inferences in Space.



 


MDhere

Top 20
Unbelievable!

I just got a warning from Dreddb0t because I replied to 7fur7's EXTREMELY WEIRD post which showed a hand slapping a woman's butt, with this gif below.

So, this little girl's weirded-out look is more offensive than someone randomly replying to a serious post of mine with a slapping butt gif?

I note that my gif of the little girl has been removed but 7fur7's "slapped butt" gif still remains.



View attachment 83062


View attachment 83065
His post WAS removed thank goodness, but now it's back here lol
 

MDhere

Top 20
Moving back onto a better subject called BrainChip. I am intrigued by Andre van Schaik of BrainChip's Scientific Advisory Board (SAB), and by Gregory Cohen as well. I am hoping that one day our technology will be in amateur telescopes, which these two are interested in too.

I would love to buy a telescope and know exactly what I'm looking at, and at the same time track the event-based image, still know I am looking at the same thing, and then when something else comes into the visual, know it's something new. It would be super cool and educational. I will be keeping an eye on Andre in Manchester - I think BrainChip may pay a visit there in July?
 

MDhere

Top 20
ok back to Akida now.... :)
 

MDhere

Top 20
Bernd Westhoff, love this man at Renesas.




🙏
 
Also curious who the "Other" (NDA?) MCU SDKs are on the slides @Frangipani posted.

 

Frangipani

Top 20
Uploaded a couple of hours ago with Doug McLelland via Frontgrade channel.

You can find the presentation slides from Douglas McLelland’s talk that @Fullmoonfever shared on the conference material webpage:



Think Frontgrade got the month wrong .. should be April. What is it with tech companies and simple errors :ROFLMAO:


The RISC-V in Space Workshop, organized by the European Space Agency in collaboration with RISC-V International, was held on May 2–3, 2025, in Gothenburg, Sweden.





In addition, they managed to upload the wrong paper (totally unrelated to Akida) with regards to Douglas McLelland’s presentation:



It should of course have been this one:


Low-power Ship Detection in Satellite Images Using Neuromorphic Hardware by the presenter himself and his co-author Gregor Lenz, who was the CTO of Neurobus at the time of publication in June 2024.

As I shared the other day, Gregor Lenz recently stepped down from his position to join Paddington Robotics in London, where he lives: https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-456183 and https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-456824.
Good to know somebody with hands-on experience of Akida is on board.

Slightly more (but still very little) information on their website now:

https://paddington-robotics.com/




Here is an article published earlier today that explains what is meant by Embodied AI:





After the Humanoids: The Real Future of Embodied AI​

UK’s Embodied AI revolution redefines robotics beyond humanoids. Paddington Robotics and Imagine Health tackle hardware chaos and edge cases, driving practical innovation with £1.3B support. Discover the future now!

by Paul Morrison

April 24, 2025

The UK is a global leader in Embodied AI (EAI), where AI powers physical systems to learn and adapt in real-world settings, from self-driving cars to healthcare diagnostics. With over 3,000 AI firms and £1.3 billion in government backing, the UK is shaping the future of robotics beyond sci-fi fantasies. I recently attended a TechAI panel featuring Zehan Wang (founder of Paddington Robotics) and Mohan Mahadevan (Co-Founder and CEO of Imagine Health). Their insights revealed where EAI is really heading, and it’s not about the humanoids.

About the Companies​

Paddington Robotics
Founded in 2024 in London, Paddington Robotics is a startup tackling AI for general-purpose robotics. Led by Zehan Wang, a former Twitter AI executive with a PhD from Imperial College London, the company aims to enable widespread robotics adoption by solving real-world data challenges. With $2.26 million in funding from 7percent Ventures, their work on adaptive, non-humanoid systems aligns with EAI’s focus on practical, environment-driven intelligence, starting with applications in spaces like supermarkets.



Imagine Health
Imagine Health, a London-based startup founded in 2025, automates ultrasound diagnostics using embodied AI and robotic arms guided by computer vision. Co-Founder and CEO Mohan Mahadevan, with 20+ years in AI and robotics from Amazon and Tractable, drives its mission to deliver safe, scalable healthcare solutions, reducing practitioner strain.


EAI and Robotics – What’s the Difference?​

Embodied AI (EAI) and robotics are related but distinct. EAI focuses on AI systems integrated into physical forms that learn and adapt through environmental interaction, emphasizing intelligent, adaptive behavior (e.g., Wayve’s self-driving cars). Robotics is a broader engineering field designing physical machines for tasks, often with pre-programmed or basic automation, not always requiring advanced AI (e.g., factory robotic arms). While EAI prioritizes cognitive adaptability, robotics centers on mechanical precision and task execution, with EAI being a subset that leverages embodiment for smarter, more flexible systems.

Three Takeaways That Challenged My Thinking​

  1. The humanoid form is overrated. “We don’t need a humanoid standing at the sink doing dishes,” Mohan argued, emphasizing function over sci-fi aesthetics. Zehan advocated for designs like “soft core with multiple limbs on wheels” that prioritize efficiency. This resonates with UK EAI research, where academics draw from nature—like the 1940s cybernetic tortoise that navigated without a human-like form. Paddington Robotics’ focus on general-purpose, non-humanoid systems and Wayve’s autonomous vehicles embody this ethos. The lesson? Let the task dictate the design, not Hollywood.
  2. The hardware ecosystem lacks standardisation. Mohan and Zehan highlighted a critical bottleneck: robotics hardware is a mess of fragmented components, with “firmware in Chinese and no standards.” Unlike the PC industry’s plug-and-play revolution, robotics innovators wrestle with incompatible parts, pushing them toward suboptimal humanoid platforms. The UK’s AI sector echoes this—53% of firms in a 2023 study flagged resource access as a barrier. Yet, the 2025 AI Opportunities Action Plan promises a 20x compute capacity boost and AI Growth Zones like Culham, which could drive standardization. For now, companies like Paddington Robotics must navigate this chaos strategically.
  3. Edge cases are the real battleground. Mohan nailed the data challenge: it’s not just volume but diversity. “Once 50 robots are in the field and one is in a bright sunny room not working right, how quickly can I fix that robot in an IP-controlled client environment?” This is where EAI lives or dies. UK researchers tackle this with frameworks like Deep Evolutionary Reinforcement Learning, helping robots adapt to unpredictable settings. Imagine’s work on diverse sensory data and Wayve’s real-world autonomous testing show how EAI must evolve. For founders, this means prioritising adaptability — your robot’s smarts need to keep up with the world.

A Note for Would-be EAI Founders​

For founders diving into Embodied AI, the UK is a goldmine: tap into DSIT’s £1.3 billion AI investment or the AI Playbook to find high-impact use cases, from healthcare to infrastructure. Maintenance is your secret weapon—robots aren’t built for easy field repairs, so plan for it. Start small, solve real problems, and leverage the UK’s vibrant ecosystem to scale your vision.

Shaping the Future of EAI​

The UK’s EAI revolution, fueled by £1.3 billion in investment opportunity and innovators like Paddington Robotics and Imagine Health, is redefining robotics as adaptive, intelligent systems far beyond humanoid fantasies. Zehan and Mohan’s insights—ditching outdated designs, tackling hardware chaos, and mastering edge cases—point to a future where EAI solves real-world challenges with precision and flexibility. As the UK leads with initiatives like the Culham AI Growth Zone, the question remains: how will we balance UK leadership and innovation with ethical and practical hurdles to make EAI a cornerstone of tomorrow’s world?

 

HopalongPetrovski

I'm Spartacus!
Unbelievable!

I just got a warning from Dreddb0t because I replied to 7fur7's EXTREMELY WEIRD post which showed a hand slapping a woman's butt, with this gif below.

So, this little girl's weirded-out look is more offensive than someone randomly replying to a serious post of mine with a slapping butt gif?

I note that my gif of the little girl has been removed but 7fur7's "slapped butt" gif still remains.



View attachment 83062


View attachment 83065
Hi Bravo.
I have had a few run ins with the dumb mechanism too.
It has very limited intelligence and virtually zero understanding when it comes to an appreciation of subtlety or ironic humour.
It has either been designed that way intentionally as a hammer, or more likely in my view, what we are seeing is the limitations in programming inherent in current AI moderation.
Whoever is currently running this forum either does not have the will or the capacity to monitor it more effectively and perhaps more fairly.
But even where human moderation is used, such as over on the crapper, or in times past on Facebook etc, it is still rife with both ineptitude and obvious bias.
Beyond that, differing people will have differing sensitivities and flashpoints in regard to various topics due to their particular lived experience, and so one's sense of humour, like one's appreciation of art, or of what is and isn't appropriate, is very subjective.
We tend to like people with similar predilections and it takes a degree of emotional maturity to maintain both tolerance and equanimity in the face of brute extremism.
Hang in there B.
We'll all be happier when BrainChip's success is better reflected in its share price. 🤣
 

7für7

Top 20
Since my posts were deleted but Bravo’s weren’t, let me just say something in my defense… I just find it astonishing how someone can get so emotionally worked up over a simple illustration. It’s just an illustration… a joke… not meant to be taken seriously at all. Like I already said… some people here are acting like I personally slapped her... Come back down to reality. This is ridiculous… honestly. I can’t believe that this is a serious topic for some people…
 

Cardpro

Regular
Since my posts were deleted but Bravo’s weren’t, let me just say something in my defense… I just find it astonishing how someone can get so emotionally worked up over a simple illustration. It’s just an illustration… a joke… not meant to be taken seriously at all. Like I already said… some people here are acting like I personally slapped her... Come back down to reality. This is ridiculous… honestly. I can’t believe that this is a serious topic for some people…
You should think about it... if I punch you in the face as a joke, is it acceptable? No

Stop trying to defend yourself, you fked up, and the joke was inappropriate, you should go sit in the corner and think about it. Ask ur wife, friends, etc, or even chatGPT.
 

HopalongPetrovski

I'm Spartacus!
Since my posts were deleted but Bravo’s weren’t, let me just say something in my defense… I just find it astonishing how someone can get so emotionally worked up over a simple illustration. It’s just an illustration… a joke… not meant to be taken seriously at all. Like I already said… some people here are acting like I personally slapped her... Come back down to reality. This is ridiculous… honestly. I can’t believe that this is a serious topic for some people…
As I recall, you got pretty exercised when I posted Musk doing a nazi salute a month or so ago.
And seemed happy to be provoked when I posted a clip of an economist giving his opinion regarding the state of world affairs a few weeks ago.
We all have our hot buttons.
 

7für7

Top 20
After realizing how some users here compare real people with a direct impact on human lives and actual interactions, to something as harmless as an illustration that was never meant to be malicious (which I mentioned before), I’ve decided to say goodbye.

Good luck to everyone else, and have fun. My fun is gone today.
 

HopalongPetrovski

I'm Spartacus!
After realizing how some users here compare real people with a direct impact on human lives and actual interactions, to something as harmless as an illustration that was never meant to be malicious (which I mentioned before), I’ve decided to say goodbye.

Good luck to everyone else, and have fun. My fun is gone today.
So long, and Thanks for all the fish. 🤣
 
Though it's unlikely to hit this past qtr (we will see), it appears the DOD has outlaid about $77k to BRN as part of the recent contract awarded under COVID spending. Whether that's already been physically paid previously and the record is retrospective, we won't know till the relevant qtrly, I guess.

Screenshot_2025-04-25-12-56-23-22_4641ebc0df1485bf6b47ebd018b5ee76.jpg
Screenshot_2025-04-25-12-56-47-80_4641ebc0df1485bf6b47ebd018b5ee76.jpg
Screenshot_2025-04-25-12-58-39-53_4641ebc0df1485bf6b47ebd018b5ee76.jpg
 

Diogenese

Top 20
Jensen Huang says that autonomous driving needs machine learning:

12:00 >

But he gets the answer wrong:

ML needs massive compute = Blackwell!!!!


https://wccftech.com/nvidia-fully-e...ompletely-different-architecture-from-hopper/

NVIDIA’s Fully-Enabled Blackwell B200 GPUs Consume Up To 1200W, Completely Different Architecture From Hopper​

Hassan Mujtaba•Mar 22, 2024 at 09:32am EDT

I'm not sure how much of that power is needed for inference and ML, but even if it's only 10%, that's more than 100 Akidas.
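The back-of-envelope arithmetic can be sketched as follows - note the 10% inference share is just the guess from above, and the ~1 W per-Akida figure is my own assumed order-of-magnitude number, not a published spec:

```python
# Back-of-envelope comparison of a B200's inference power budget vs Akida chips.
# Assumptions (hypothetical): 10% of the B200's 1200 W goes to inference/ML,
# and a single Akida draws on the order of 1 W.
B200_POWER_W = 1200.0
INFERENCE_SHARE = 0.10      # guessed share of the power budget
AKIDA_POWER_W = 1.0         # assumed order-of-magnitude draw per Akida

inference_power_w = B200_POWER_W * INFERENCE_SHARE
akida_equivalents = inference_power_w / AKIDA_POWER_W

print(f"{inference_power_w:.0f} W ~ {akida_equivalents:.0f} Akidas")  # 120 W ~ 120 Akidas
```

Even with these rough assumptions, the 10% slice alone works out to more than 100 Akidas' worth of power.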
 

Diogenese

Top 20
Jensen Huang says that autonomous driving needs machine learning:

12:00 >

But he gets the answer wrong:

ML needs massive compute = Blackwell!!!!


https://wccftech.com/nvidia-fully-e...ompletely-different-architecture-from-hopper/

NVIDIA’s Fully-Enabled Blackwell B200 GPUs Consume Up To 1200W, Completely Different Architecture From Hopper​

Hassan Mujtaba•Mar 22, 2024 at 09:32am EDT

I'm not sure how much of that power is needed for inference and ML, but even if it's only 10%, that's more than 100 Akidas.

Jensen on Inferencing:

 

miaeffect

Oat latte lover
After realizing how some users here compare real people with a direct impact on human lives and actual interactions, to something as harmless as an illustration that was never meant to be malicious (which I mentioned before), I’ve decided to say goodbye.

Good luck to everyone else, and have fun. My fun is gone today.
Ms7YJF.gif
 