BRN Discussion Ongoing

Deadpool

hyper-efficient Ai
I just assumed if you bought a Dutch Oven that the chimney would come free? Silly me!😘

One would hope so

dutch oven GIF
 
  • Haha
Reactions: 10 users

dippY22

Regular
I don't remember BrainChip using the word collaboration before.

"Our collaboration with Arm is set to unlock the next generation of intelligent edge devices with unprecedented levels of performance and functionality," said Rob Telson, BrainChip Vice President of Ecosystem and Partnerships. "We look forward to joining Arm on their Tech Talk presentation and educate listeners how Akida and the Arm Cortex-M85 will deliver capabilities not previously possible."

View attachment 35722


Just saying 😊😊😊 or am I reading too much in this media release.

Learning 🏖
Cheers to a green Friday 🍺🍺🍺

Hmmmm,........from using the word partner to now using a word like collaboration. Interesting.


Well, it's good I suppose that collaboration starts with the letter C which is in the word that I want to see more of,......CUSTOMER.
 
  • Like
Reactions: 8 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Hmmmm,........from using the word partner to now using a word like collaboration. Interesting.


Well, it's good I suppose that collaboration starts with the letter C which is in the word that I want to see more of,......CUSTOMER.

ARM + BrainChip! It's as easy as ABC!

Screen Shot 2023-05-05 at 9.48.44 pm.png


 
  • Like
  • Fire
Reactions: 14 users
  • Like
  • Fire
  • Love
Reactions: 10 users

manny100

Regular
I don't remember BrainChip using the word collaboration before.

"Our collaboration with Arm is set to unlock the next generation of intelligent edge devices with unprecedented levels of performance and functionality," said Rob Telson, BrainChip Vice President of Ecosystem and Partnerships. "We look forward to joining Arm on their Tech Talk presentation and educate listeners how Akida and the Arm Cortex-M85 will deliver capabilities not previously possible."

View attachment 35722


Just saying 😊😊😊 or am I reading too much in this media release.

Learning 🏖
Cheers to a green Friday 🍺🍺🍺
Pretty clear from that talk that ARM will be a customer. Likely the reason for the SP move today.
Having said that, the SP was just waiting for an excuse to break out after a few indecisive days on low volume.
 
  • Like
  • Fire
Reactions: 13 users

Deadpool

hyper-efficient Ai
Hmmmm,........from using the word partner to now using a word like collaboration. Interesting.


Well, it's good I suppose that collaboration starts with the letter C which is in the word that I want to see more of,......CUSTOMER.
Not to mention the other C word.






Waynes World Reaction GIF
 
  • Haha
  • Like
Reactions: 8 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Haha
  • Like
  • Fire
Reactions: 17 users
A new post from our friend Magnus at Mercedes discussing the MBUX screen safety features to reduce driver distraction.

1683290492214.png
 
  • Like
  • Fire
  • Wow
Reactions: 15 users
:unsure::unsure::unsure::unsure:

Some excerpts...full article in link.



1683295666878.png


Jan 23, 2023

12 min read

Unleashing the full potential of IoE with Edge Computing Technology


The Internet of Things (IoT) is exploding on such a scale that many are already redefining terms to rather call it the Internet of Everything (IoE) - as a better illustration of how today’s connected world is shaping. Still, as often seen in many contexts, massive growth puts unsustainable pressure on resources. In this case, a pervasively connected world requires more and more hardware units - which in turn require power - as well as increasing data bandwidth - to deal with the massive amounts of data being transmitted and processed. In this article, we will try to shed a little light on how edge computing, particularly in ultra-low-power edge devices, will contribute to solving these constraints and unleashing the full potential of the Internet of Everything.


New approaches

ONiO is playing with new approaches integrating edge computing and neuromorphic concepts - namely to implement neural networks in ultra-low-power integrated circuits. The end goal is to implement extremely power-efficient hardware accelerators which can perform algorithmic operations based on peripheral/sensor data without the intervention of the CPU. This will allow offloading heavy software tasks onto hardware, thus saving power by waking up the CPU only when needed.

But what does this mean exactly? In practical terms, it means that we can create wireless, batteryless smart edge nodes/devices capable of processing data and making simple decisions locally - without having to send data back and forth to the cloud, thus saving on data transmission (bandwidth) and computing loads in data centres. Obviously, we are bound by the constraints of ultra-low-power systems - so not really running full chess games there! So, what is it good for?...

...In a nutshell, power-aware neural network solutions shall be capable of mixing inputs from several sources, detecting relevant events (when something happens) without CPU intervention, filtering off non-relevant data points, and reporting/transmitting relevant data/events only, as opposed to continuously sending data to the backend/cloud.
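The event-driven idea ONiO describes - transmit only when something happens - can be sketched in a few lines of Python. This is a toy illustration under my own assumptions, not ONiO's implementation; the threshold and sample values are made up:

```python
def filter_events(samples, threshold):
    """Keep only the readings that cross the threshold (the 'relevant events').
    Everything below it is dropped on-node instead of being streamed to the cloud."""
    return [(i, s) for i, s in enumerate(samples) if s >= threshold]

# Hypothetical vibration readings from a sensor node; only spikes matter.
samples = [0.1, 0.2, 4.8, 0.1, 0.3, 5.2, 0.2]
events = filter_events(samples, threshold=3.0)
print(events)  # [(2, 4.8), (5, 5.2)] - 2 of 7 samples transmitted
```

The saving is in what never leaves the device: five of seven readings are discarded locally, so neither the radio nor the backend has to touch them.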
 
Last edited:
  • Like
  • Love
  • Thinking
Reactions: 47 users

RobjHunt

Regular
BIOTRONIK's link to CardiaCare - could this possibly be the unconventional telco that Sean was alluding to????

CardiaCare is a clinical stage Digital Therapeutic company that develops the world’s first closed loop wearable for non-invasive treatment of cardiac and other Central Nervous System related clinical symptoms. Providing a non-invasive, nerve stimulation treatment with continuous monitoring and AI based alerts capabilities, CardiaCare’s device is the only platform that enables complete remote patient management and personalized treatment. The device stems from many years of treatment of real-world patients and integrates know-how and innovation from diverse disciplines. CardiaCare has a functional prototype and successfully completed a FIM study. CardiaCare is a 2022 MedTech Innovator cohort company.

It's still my opinion that the telco could very well be Tesla/Space X.
 
  • Like
  • Wow
  • Love
Reactions: 13 users

FKE

Regular
  • Like
  • Fire
  • Love
Reactions: 37 users
D

Deleted member 118

Guest
  • Haha
  • Fire
Reactions: 5 users

charles2

Regular
  • Like
  • Fire
Reactions: 11 users

charles2

Regular
In the US, BRCHF up >9% 45 minutes before the close.

And I thought all the Black Swans were in Australia.

Have a good weekend, everyone.

Edit: Last-minute spurt. Closed up 11.3% at (a measly) $0.3009 US.

First time that I've seen aggressive buying (chasing the price up) in forever.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 36 users
Seeing that BrainChip mentions drones & agriculture as a key target market, the article below seems quite promising:


"Precision AI is accelerating artificial intelligence (AI) through enhancing farming practices and creating the world’s first AI-powered drones for plant-level herbicide applications at a broad-acre scale. Specifically, their Precision Spray Drone System with ZeroDrift can detect weeds from crop in real time and target spray the weeds in a single pass. These drones achieve this feat by utilizing edge computing, which allows faster and more accurate onboard processing, removing the need for broadband internet and creating near instant identification at record speeds – 8x faster than the industry average, in fact."

"Recently, they were chosen for the prestigious John Deere 2023 Startup Collaborator and are excited about its potential to speed up their entry into the commercial market and for the unique collaboration opportunities with like-minded startups."

"In the next five years, the Precision AI team is preparing for the Precision Spray Drone System to hit the market (anticipated in 2024) and continuing to focus on expanding their unique technology to additional industries, cropping systems and farms."

Here is another article Precision AI has released where they talk about taking the technology to the far edge https://www.precision.ai/resources/654/edge-computing

Here is a most recent article about them https://www.precision.ai/resources/837/coming-for-the-mega-farms

"Precision AI says its approach can reduce herbicide use by as much as 90% compared to traditional methods. The startup was one of a dozen winners of BloombergNEF’s 2023 Pioneers award, which aims to spotlight early-stage climate tech innovators with game-changing potential."

For now, Precision AI’s drone is operated with supervision from a human pilot. But McCann says his company is poised to introduce a fully autonomous spraying drone that can take off, fly and land by itself – as long as regulators grant permission.

"The startup plans to commercialize its on-demand spraying service next year, allowing farmers to book as needed — not dissimilar to how consumers order an Uber. It will also sell the spraying drone to farmers who want more control over their crop management and charge a fee for its AI operating software on a pay-as-you-go basis."
 
  • Like
  • Fire
  • Love
Reactions: 23 users

wilzy123

Founding Member


 
  • Like
  • Fire
  • Love
Reactions: 45 users

Tothemoon24

Top 20

SNAPSHOT

A team of NASA personnel and contractors has prototyped a new set of algorithms that will enable instruments in space to process data more efficiently. Using these algorithms, space-based remote sensors will be able to provide the most important data to scientists on the ground more quickly and may also be able to autonomously determine which Earth phenomena are the most important to observe.

Photo of the International Space Station in orbit with the Earth in the background

The International Space Station, where Steve Chien and his team prototyped a new set of AI algorithms that will reduce data latency and improve dynamic targeting capabilities for satellites. (Credit: NASA/ISS)
Earth-observing instruments can gather a world’s worth of information each day. But transforming that raw data into actionable knowledge is a challenging task, especially when instruments have to decide for themselves which data points are most important.

“There are volcanic eruptions, wildfires, flooding, harmful algal blooms, dramatic snowfalls, and if we could automatically react to them, we could observe them better and help make the world safer for humans,” said Steve Chien, a JPL Fellow and Head of Artificial Intelligence at NASA’s Jet Propulsion Laboratory.

Engineers and researchers from JPL and the companies Qualcomm and Ubotica are developing a set of AI algorithms that could help future space missions process raw data more efficiently. These AI algorithms allow instruments to identify, process, and downlink prioritized information automatically, reducing the amount of time it would take to get information about events like a volcanic eruption from space-based instruments to scientists on the ground.

These AI algorithms could help space-based remote sensors make independent decisions about which Earth phenomena are most important to observe, such as wildfires.

“It’s very difficult to direct a spacecraft when we’re not in contact with it, which is the vast majority of the time. We want these instruments to respond to interesting features automatically,” said Chien.

Chien prototyped the algorithms using commercially available advanced computers onboard the International Space Station (ISS). During several different experiments, Chien and his team investigated how well the algorithms ran on Hewlett Packard Enterprise’s Spaceborne Computer-2 (SBC-2), a traditional rack server computer, as well as on embedded computers.

These embedded computers include the Snapdragon 855 processor, previously used in cell phones and cars, and the Myriad X processor, which has been used in terrestrial drones and low Earth orbit satellites.

Including ground tests using PPC-750 and Sabertooth processors – which are traditional spacecraft processors – these experiments validated more than 50 image processing, image analysis, and response scheduling AI software modules.

The experiments showed these embedded commercial processors are very suitable for space-based remote sensing, which will make it much easier for other scientists and engineers to integrate the processors and AI algorithms into new missions.

The full results of these experiments were published in a series of three papers at the 2022 IEEE Geoscience and Remote Sensing Symposium, which can be accessed through the links below.

Chien explains that while it is easiest to deploy AI algorithms from ground computers to larger, rack-mounted servers like the SBC-2, satellites and rovers have less space and power, which means they would need to use smaller, low-power, embedded processors similar to the Snapdragon or Myriad units.

By processing the data onboard, these AI algorithms prevent important or urgent information from being buried within larger data transmissions. A researcher wouldn’t have to downlink and process an entire transmission to see that a hurricane is intensifying or a harmful algal bloom has formed.

“A large image could have gigabytes of data, so it might take a day to get it to the ground and process it. But you don’t need to process all that data to identify a wildfire. These algorithms pre-process data onboard so that researchers get the most important information first,” said Chien.
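The onboard-prioritisation idea Chien describes can be sketched as a toy priority queue. The observation names and priority scores below are hypothetical and for illustration only, not NASA's actual software:

```python
import heapq

def downlink_order(observations):
    """Return observation names highest-priority first, so an urgent event
    (wildfire, intensifying hurricane) is downlinked before routine imagery.
    `observations` is a list of (name, priority) pairs; larger = more urgent."""
    heap = [(-priority, name) for name, priority in observations]  # max-heap via negation
    heapq.heapify(heap)
    return [heapq.heappop(heap)[1] for _ in range(len(heap))]

# Hypothetical onboard queue after pre-processing a pass of imagery.
obs = [("routine ocean tile", 1), ("wildfire front", 9), ("algal bloom", 5)]
print(downlink_order(obs))  # ['wildfire front', 'algal bloom', 'routine ocean tile']
```

The point of the sketch is the ordering, not the scores: whatever model assigns urgency onboard, a queue like this ensures the gigabytes of routine data never delay the one tile a researcher needs first.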

These algorithms could be useful not only for Earth-observing instruments, but also for instruments observing other planets as well. The proposed Europa Lander mission, for example, could use Chien’s algorithms to help search for life on the Jovian moon.

“There are several missions that are in concept development right now that could use this technology. They’re still in the early phases of development, but these are missions that need the kind of onboard analysis, understanding, and response these algorithms enable,” said Chien.

The team is also testing neural network models to interpret Mars satellite imagery. “Someday such a neural net could enable a satellite to detect a new impact ejecta, evidence of a meteorite impact, and alert other spacecraft or take follow-up images,” said JPL Data Scientist Emily Dunkel. “Rovers could also use these processors with neural networks to determine where it is safe for the rover to drive,” Dr. Dunkel added.

“We used the CogniSat framework to deploy models to the Myriad X, reducing the effort to develop deep learning models for onboard use. This experience helps prove that this advanced hardware and software system is ready now for space missions,” said Léonie Buckley, Senior Engineer at Ubotica.

As climate change continues to alter the world we live in, information systems like Chien’s allow scientific instruments to be as dynamic as the Earth systems they observe.

“We don’t often think about the fact that we’re walking around with more computing power in our cell phones than supercomputers had forty years ago. It’s an amazing world we live in, and we’re trying to incorporate those advancements into NASA missions,” said Chien.
E1F47BE8-200B-4035-8D32-E9F583BC2C4C.jpeg
 
Last edited:
  • Like
  • Fire
  • Wow
Reactions: 20 users

Andy38

The hope of potential generational wealth is real
I am excited that Rob is excited that I’m excited about what he calls the tip of the iceberg 😂😂.
Anyone else feel that momentum is building nicely?!
Have a great weekend Chippers
 

Attachments

  • IMG_0700.jpeg
    IMG_0700.jpeg
    481.8 KB · Views: 222
  • Like
  • Love
  • Fire
Reactions: 36 users

MDhere

Regular
I think this is some Arm vector acceleration called Helium; he briefly mentions it in the video. Sadly not BrainChip...
As @Diogenese mentioned, Helium is a TM of Arm. Also in the display is a miniature motor vibration-detection demo for predictive maintenance using the M85 😀 Now who is into predictive maintenance? 😀 And I love it when the commentator asks whether the M85 is the star of the show - the Renesas guy says, at the drop of a hat with sheer utter confidence, yeah OF COURSE IT IS 😀 and he knows it too. He has our IP, and BrainChip integrates into the top-of-the-range M85, so if someone asks me, are you happy right now MDhere, I say yeah OF COURSE (with sheer utter confidence).
And I think I heard him say the word BILLIONS.

Have a great weekend fellow BRNers, I'm off till Tues at a Byron surfing reunion 😀🤙 I've pencilled in the May 9 9am Tech Talk with Arm but can't work out if that's 1am or 2am on the 10th, south-east QLD time?
Screenshot_20230506-080101_Chrome.jpg
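On the time-zone question above: a quick Python sketch with the standard-library `zoneinfo` module. The 9am US Pacific start time is my own assumption (the thread doesn't state the Tech Talk's time zone), so treat the result accordingly:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # stdlib, Python 3.9+

# Hypothetical assumption: the Tech Talk starts 9:00am US Pacific, May 9, 2023.
talk_pt = datetime(2023, 5, 9, 9, 0, tzinfo=ZoneInfo("America/Los_Angeles"))

# Queensland (Australia/Brisbane) is UTC+10 with no daylight saving.
talk_qld = talk_pt.astimezone(ZoneInfo("Australia/Brisbane"))
print(talk_qld.strftime("%Y-%m-%d %H:%M %Z"))  # 2023-05-10 02:00 AEST
```

Under that assumption it would be 2am on the 10th in south-east QLD; if the talk is actually anchored to a different US zone, just swap the `ZoneInfo` name.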
 
  • Like
  • Love
  • Fire
Reactions: 24 users

Salena

Member
I don't remember BrainChip using the word collaboration before.

"Our collaboration with Arm is set to unlock the next generation of intelligent edge devices with unprecedented levels of performance and functionality," said Rob Telson, BrainChip Vice President of Ecosystem and Partnerships. "We look forward to joining Arm on their Tech Talk presentation and educate listeners how Akida and the Arm Cortex-M85 will deliver capabilities not previously possible."

View attachment 35722


Just saying 😊😊😊 or am I reading too much in this media release.

Learning 🏖
Cheers to a green Friday 🍺🍺🍺
Interesting how Arm is also in collaboration with Intel. Just saying but could be nothing 😊
 

Attachments

  • Screenshot_20230506-100450_LinkedIn.jpg
    Screenshot_20230506-100450_LinkedIn.jpg
    761.2 KB · Views: 100
  • Like
  • Thinking
  • Fire
Reactions: 9 users