BRN Discussion Ongoing

Cartagena

Regular
So thank you for posting this.
I read it today in the news from the IAA 2023 in Munich.
More information I found:

https://www.iaa-transportation.com/en/newsroom/iaa-voices-interview-apex-ai

Interview Apex.AI​

".......

Before joining Apex.AI, he held various positions with companies, including Nvidia, Daimler Group, and Delphi.

..."


I'd love to see this membership... like these "body members" and "voting members" from SOAFEE.

https://www.soafee.io/about/members

https://www.prnewswire.com/news-releases/autonomous-driving-moia-counts-on-apexai-software-for-passenger-management-development-301834296.html

"....
The partnership pays towards MOIA's goal of working with Volkswagen Commercial Vehicles (VWCV) to develop Europe's first type-certified AD-MaaS system and successfully launch an integrated autonomous, scalable ridepooling system on the road in Hamburg after 2025.
...."


...and who recently filed an improvement to their patent concerning "semantic segmentation" :)
(https://cdn-api.markitdigital.com/apiman-gateway/ASX/asx-research/1.0/file/2924-02702229-2A1468998?access_token=83ff96335c2d45a094df02a206a39ff4)

Well, Apex.AI seems to know what they are talking about...

https://www.apex.ai/wir

Have you ever faced system freezes and shutdowns when processing large amounts of data? Specifically, image processing algorithms like deep neural network based object detection and semantic segmentation in autonomous driving applications are very demanding in terms of data transfer rate and processing power.


In this talk, we show how to efficiently implement a computer vision pipeline using Apex.OS (a safety certified fork of ROS2) which utilizes zero-copy optimizations on the middleware level to reduce bandwidth requirements. In addition, we use hardware accelerated versions of the algorithms to increase the throughput. We also explain how to abstract this hardware acceleration in the application code to decouple it from the underlying SoC.

I have no clue if it has anything to do with BrainChip IP (uuuuu) :cry::p
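The zero-copy idea in the quoted talk abstract is about letting publisher and subscriber share one memory region instead of serialising and copying each camera frame through the middleware. As a rough illustration only (this is a generic Python shared-memory sketch, not the Apex.OS or ROS 2 API), the pattern looks like this:

```python
import numpy as np
from multiprocessing import shared_memory

# "Producer": allocate one shared-memory block and build the image in place.
shm = shared_memory.SharedMemory(create=True, size=640 * 480 * 3)
frame = np.ndarray((480, 640, 3), dtype=np.uint8, buffer=shm.buf)
frame[:] = 128  # write pixels directly into shared memory -- no send/copy step

# "Consumer": attach to the same block by name and view the very same bytes.
peer = shared_memory.SharedMemory(name=shm.name)
view = np.ndarray((480, 640, 3), dtype=np.uint8, buffer=peer.buf)
received = int(view[0, 0, 0])  # reads the producer's pixel without any copy

# Release the NumPy views before closing, then free the block.
del frame, view
peer.close()
shm.close()
shm.unlink()
```

Only a handle (the block's name) crosses between the two sides; the pixel data is written once and read in place, which is why bandwidth requirements drop for large frames.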

Hi Learning and Stockduck,

Volkswagen Commercial Vehicles sounds like one hell of a client to have onboard, and interestingly Apex.AI focuses on the main things BrainChip is in, like object detection and machine learning. Yes, we sure hope that "we are" the hardware in this application, and that we're not going for a joyride with our brains/tech secrets being freely shared rather than being used to create future revenue contracts.

In my view shouldn't we be signing IP licensing contracts with our partners if we are collaborating with them to safeguard our IP and bring products to market? Emotion3D is not under an NDA so why have we not heard of such contracts yet?

I remain positive about this and hope we hear an announcement of a license or IP contract with Emotion 3D or any of our other partners in the not too distant future. Meanwhile we remain patient on Akida Gen 2 release, only around 3 weeks to go before end of this quarter. 😑
 
  • Like
  • Fire
Reactions: 9 users

cosors

👀
So thank you for posting this.
I read it today in the news from the IAA 2023 in Munich.
More information I found:

https://www.iaa-transportation.com/en/newsroom/iaa-voices-interview-apex-ai

Interview Apex.AI​

".......

Before joining Apex.AI, he held various positions with companies, including Nvidia, Daimler Group, and Delphi.

..."


I'd love to see this membership... like these "body members" and "voting members" from SOAFEE.

https://www.soafee.io/about/members

https://www.prnewswire.com/news-releases/autonomous-driving-moia-counts-on-apexai-software-for-passenger-management-development-301834296.html

"....
The partnership pays towards MOIA's goal of working with Volkswagen Commercial Vehicles (VWCV) to develop Europe's first type-certified AD-MaaS system and successfully launch an integrated autonomous, scalable ridepooling system on the road in Hamburg after 2025.
...."


...and who recently filed an improvement to their patent concerning "semantic segmentation" :)
(https://cdn-api.markitdigital.com/apiman-gateway/ASX/asx-research/1.0/file/2924-02702229-2A1468998?access_token=83ff96335c2d45a094df02a206a39ff4)

Well, Apex.AI seems to know what they are talking about...

https://www.apex.ai/wir

Have you ever faced system freezes and shutdowns when processing large amounts of data? Specifically, image processing algorithms like deep neural network based object detection and semantic segmentation in autonomous driving applications are very demanding in terms of data transfer rate and processing power.


In this talk, we show how to efficiently implement a computer vision pipeline using Apex.OS (a safety certified fork of ROS2) which utilizes zero-copy optimizations on the middleware level to reduce bandwidth requirements. In addition, we use hardware accelerated versions of the algorithms to increase the throughput. We also explain how to abstract this hardware acceleration in the application code to decouple it from the underlying SoC.

I have no clue if it has anything to do with BrainChip IP (uuuuu) :cry::p
They will expand their cooperation with China and are already doing so in order to survive.
I see the change of CEO as critical and negative. They are fighting not to go under, even if we can't quite understand and believe it yet.
So better to look for VW in China, maybe SiFive?

They don't really have their own know-how that works as the market matures, so they buy it in from China.
The opposite of MB.
But that is just my assumption.
 
Last edited:
  • Like
  • Thinking
Reactions: 4 users

stockduck

Regular
Bravo, I respect your work so much, but I think we have to stop digging and looking for dots.
You/we've found so many connections over the years, but nothing at all has eventuated into something material.
It's just time that the company finally delivers. The initial date for revenue was end of 2022. Now everyone acts like it has always been 2024–2025.

Over the last months I received a lot of backlash. I turned from an excited investor who believes in the technology (which I still do, btw; it's the management that I find highly incapable) into someone who's very critical in his posts.
The responses are always the same. Either the shorters are blamed for everything, or it's just fine losing 80%+ of a company's value.
What I'm saying is that the share price is a consequence of the capability of our management. And they failed massively.
Someone on this forum even said he'd bet his life that we are involved with Valeo. This claim was backed up by the presenter's reactions during a presentation when asked if Akida was inside. Or something similar; I don't know what the correct wording was.
Well, now we know we're not, as literally everyone has moved away from our failed first-gen product, seemingly even Mercedes-Benz, who were very outspoken.

Now it's time for the management to finally achieve something material. If that happens, we can start looking for dots again. Until then I'll never take any dot seriously, no matter how convincing it might be.
What if the collaboration between Valeo and Mobileye was born from a positive partner program between Intel and BrainChip?
So there is a possibility that everything is fine... but also maybe not.

Here is information from the IAA 2023 on Mobileye and a future goal:

https://www.iaa-mobility.com/en/newsroom/news/future-technology/the-evolution-of-vehicle-sensor-systems

"....

Lidar systems for real-time image recognition​

Lidar systems are based on optical signals. As a result, obstacles, apart from metal objects, can be better detected than when using radar. Simpler short-distance lidar systems are already being used in emergency braking assists, but the truly high-performance devices are still in development. Alongside a number of smaller manufacturers such as the Intel subsidiary Mobileye, for a few years now the German sectoral giant Continental has also been active in this market. With its High Resolution 3D Flash Lidar, in future Continental is looking to enable real-time 3D monitoring of the surroundings, with image interpretation. The developers at Mobileye are planning something similar, and in addition to the sensor system they are also researching new data processing hardware. For 2025, the specialist is planning the market launch of a silicon-based “system-on-chip”, capable of better processing the huge data volumes generated with lidar systems.
....."
 
  • Like
Reactions: 6 users

stockduck

Regular
Fasten your seatbelts......! :LOL:

https://www.iaa-mobility.com/en/newsroom/news/autonomous-driving/the-road-to-fully-autonomous-vehicles

"....
For example, a recent OTA provided to Zeekr, an electric vehicle brand owned by Geely that uses SuperVision, allowed the company to provide an advanced update to the vehicle's adaptive cruise control and highway assistance systems. Now, instead of looking only at the vehicle immediately ahead, the updated system takes into account the entire scene around the vehicle, much like a human driver.
For example, it can detect a traffic jam ahead, even if the car immediately ahead has not yet begun to brake. The system can also react to various objects and situations, such as a vehicle on the side of the road with its door open, or a pedestrian on the side of the road. The update also enables the car to drive on any road with clear lane markings at speeds of up to 130 km/h.

...."
 
  • Like
  • Fire
Reactions: 6 users

Foxdog

Regular
Talking about neuromorphic, isn't this what BrainChip does?


Australian Defence Magazine

Sydney Nano develops neuromorphic sensor for RAAF

24 May 2021
The Jericho Smart Sensing Lab at the University of Sydney Nano Institute has developed a prototype sensor system for the RAAF that mimics the brain’s neural architecture to deliver high-tech sensing technology.
Dubbed MANTIS, the system under development integrates a traditional camera with a visual system inspired by neurobiological architectures. This ‘neuromorphic’ sensor works at 'incredibly high' speeds, which will allow aircraft, ships and vehicles to identify fast moving objects, such as drones.
“Combining traditional visual input with the neuromorphic sensor is inspired by nature. The praying mantis has five eyes – three simple eyes and two composite. Our prototype works a bit like this, too,” Professor Ben Eggleton, Director of Sydney Nano, said.
MANTIS combines two visual sensing modes in a lightweight portable unit, operated via dashboard interface, with a processing system that is on board the craft, ship or vehicle. The aim is to enable a direct comparison of images and allow for the rapid exploration of the neuromorphic sensor capabilities.
Sydney Nano worked with the School of Architecture, Design and Planning to develop the prototype in just three months.
“There are many things that excite me about MANTIS. The level of detail that it provides and its ability to track high-speed events is very impressive," Air Vice-Marshal Cath Roberts, Head of Air Force Capability, said. “It's a promising sensor fusion that has really strong potential across Defence.”

Professor Eggleton leads the Jericho Lab team that saw delivery of the prototype. The four-kilogram small-form design will allow the camera to be easily used on aircraft, ships and vehicles to detect challenging targets.

“The neuromorphic sensor has exquisite sensing capabilities and can see what can't be seen with traditional cameras,” he said. “It invokes the idea of the eye in animals but has leading-edge technology built into it.”

Whereas a traditional camera is constrained by frame rates, each pixel in a neuromorphic camera functions independently and is always ‘on’. This means the imaging system is triggered by events. If it’s monitoring a static scene, the sensor sees nothing and no data is generated.

“When there is an event, the sensor has incredible sensitivity, dynamic range and speed,” Professor Eggleton said. “The data generated is elegantly interfaced with an IT platform allowing us to extract features using machine-learning artificial intelligence.”

“We look forward to developing this device further and collaborating with other experts in this area, including Western Sydney University’s International Centre for Neuromorphic Systems, which are the leaders in neuromorphic research in Australia.

MANTIS is the result of the partnership between the University of Sydney Nano Institute and Air Force’s Jericho Disruptive Innovation.

“A rapid prototype of this type and scale happening in three months, during COVID, is remarkable,” said Wing Commander Paul Hay, head of advanced sensing at RAAF Jericho Disruptive Innovation.

The Defence Science and Technology Group (DSTG) was also involved in the collaboration, providing early guidance and input.
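The article's point that each pixel works independently and only produces data when something changes can be sketched in a few lines. This is a hypothetical frame-differencing simulation of DVS-style event output, not the MANTIS implementation; the function name and threshold are made up for illustration:

```python
import numpy as np

def events_from_frames(prev, curr, threshold=15):
    """Emit DVS-style events: one (y, x, polarity) tuple per pixel whose
    intensity changed by more than `threshold` between two frames."""
    diff = curr.astype(np.int16) - prev.astype(np.int16)
    ys, xs = np.nonzero(np.abs(diff) > threshold)      # only changed pixels fire
    polarity = np.sign(diff[ys, xs])                   # +1 brighter, -1 darker
    return list(zip(ys.tolist(), xs.tolist(), polarity.tolist()))

# A static scene produces no events at all -- no data is generated.
static = np.full((4, 4), 100, dtype=np.uint8)
assert events_from_frames(static, static) == []

# A single brightening pixel produces a single ON event.
moved = static.copy()
moved[2, 3] = 160
assert events_from_frames(static, moved) == [(2, 3, 1)]
```

The sparsity is the point: a mostly-static scene yields an event list proportional to the motion in it, rather than full frames at a fixed rate.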
Can anyone confirm this article is over two years old?
 
  • Like
Reactions: 2 users

manny100

Top 20
BRN have no control over when demand for the Edge arrives and grows.
BRN has control over setting up the company to be in the best possible position to take advantage of the demand when it arrives. I think BRN are doing a great job in preparing the business for when this occurs.
I am patient and not worried about my investment, which has a medium/long-term outlook.
 
  • Like
  • Fire
  • Love
Reactions: 32 users

Cartagena

Regular
  • Like
Reactions: 2 users

Damo4

Regular
Akida’s neuromorphic architecture delivers high performance with extreme energy efficiency enabling AI solutions previously not possible on battery-operated or fan-less embedded Edge devices.

Interesting that RT would like the below, found by @IloveLamp, and then release a PR statement that includes such specific wording about battery-powered (wearables) and fan-less devices.

https://www.bosch-sensortec.com/news/worlds-smallest-particulate-matter-sensor-bmv080.html

View attachment 43522
 
  • Like
  • Fire
  • Thinking
Reactions: 12 users
  • Like
  • Fire
Reactions: 15 users

IloveLamp

Top 20
I'm just gonna leave this here.........
Screenshot_20230907_074316_LinkedIn.jpg


Screenshot_20230907_074355_LinkedIn.jpg


She also commented on this...

Screenshot_20230907_075258_LinkedIn.jpg


And Sougata from SAMSUNG reciprocated....

Screenshot_20230907_075328_LinkedIn.jpg



WAYMO too......

Screenshot_20230907_075842_LinkedIn.jpg
Screenshot_20230907_075854_LinkedIn.jpg
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 38 users

Learning

Learning to the Top 🕵‍♂️
Hi Learning and Stockduck,

Volkswagen Commercial Vehicles sounds like one hell of a client to have onboard, and interestingly Apex.AI focuses on the main things BrainChip is in, like object detection and machine learning. Yes, we sure hope that "we are" the hardware in this application, and that we're not going for a joyride with our brains/tech secrets being freely shared rather than being used to create future revenue contracts.

In my view shouldn't we be signing IP licensing contracts with our partners if we are collaborating with them to safeguard our IP and bring products to market? Emotion3D is not under an NDA so why have we not heard of such contracts yet?

I remain positive about this and hope we hear an announcement of a license or IP contract with Emotion 3D or any of our other partners in the not too distant future. Meanwhile we remain patient on Akida Gen 2 release, only around 3 weeks to go before end of this quarter. 😑
Hi Cartagena

I too believe that in the near future, BrainChip will have new licensing and IP deals signed.

With regard to Emotion3D:

"Laguna Hills, Calif. – February 26, 2023 – BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of ultra-low power, fully digital, event-based, neuromorphic AI IP, today announced that it has entered into a partnership with emotion3D to demonstrate in-cabin analysis that makes driving safer and enables next level user experience"



Hence, I don't believe we will see a licensing or IP contract announcement for this partnership. From my understanding it's a joint partnership for the project: BrainChip and Emotion3D will collaborate to develop and most likely will profit-share from the result.

I don't remember which podcast, but I'm confident that every time BrainChip has entered into a partnership with their partners, there have been contractual arrangements between the parties to protect their IP.

Learning 🪴
 
  • Like
  • Fire
  • Love
Reactions: 15 users

Tony Coles

Regular
  • Haha
Reactions: 14 users
Some interesting comments made here by an Nvidia exec:


"According to Das, the total addressable market for AI will consist of $300 billion in chips and systems, $150 billion in generative AI software, and $150 billion in omniverse enterprise software. These figures represent growth over the "long term," Das said, though he did not specify a target date."

"If you think about the traditional computing systems based on CPU computing, what has changed over the decades is simply the location — you're in the cloud, you're doing it on your phone, but it's essentially the same style of computing," Das told the audience. "And more and more, the functions of companies are being done in computing. That means you need more and more computing in the world, which means you need more data centers. You need more energy. You need more horsepower, and it's just not sustainable."

Maybe contact the Brainchip Helpline!
 
  • Like
  • Fire
  • Love
Reactions: 21 users

DK6161

Regular
  • Haha
  • Like
  • Love
Reactions: 6 users

Balliwood

Member
The article posted by Cartagena, "Sydney Nano develops neuromorphic sensor for RAAF", about an Australian group developing an impressive neuromorphic detector in three months, is a bit depressing.

Nearly three years ago, Akida chips were available on boards. I had thought bright nerds would buy them, run to their garages and start soldering. But where are the YouTube videos of brilliant devices? It is not that there are no commercial examples of Akida – there seem to be no working examples of any sort.

We have bragged of being three years ahead, but the peloton has caught us. What has gone wrong?
 
  • Sad
  • Like
  • Haha
Reactions: 10 users

Cartagena

Regular
Hi Cartagena

I too believe that in the near future, BrainChip will have new licensing and IP deals signed.

With regard to Emotion3D:

"Laguna Hills, Calif. – February 26, 2023 – BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of ultra-low power, fully digital, event-based, neuromorphic AI IP, today announced that it has entered into a partnership with emotion3D to demonstrate in-cabin analysis that makes driving safer and enables next level user experience"


Hence, I don't believe we will see a licensing or IP contract announcement for this partnership. From my understanding it's a joint partnership for the project: BrainChip and Emotion3D will collaborate to develop and most likely will profit-share from the result.

I don't remember which podcast, but I'm confident that every time BrainChip has entered into a partnership with their partners, there have been contractual arrangements between the parties to protect their IP.

Learning 🪴


Hi Learning,

Appreciate your response. As you say, if there are contractual arrangements being made (well, I hope), why aren't they being announced? These are surely material developments and warrant some form of proper announcement to shareholders.

That said, profit sharing from a joint venture is a commercial contractual agreement, and from my business experience this definitely needs to be solidified into a contract if BrainChip and a partner such as Emotion3D are working together on our IP.
 
  • Like
Reactions: 4 users

HopalongPetrovski

I'm Spartacus!
  • Haha
Reactions: 6 users

Dijon101

Regular
The article posted by Cartagena, "Sydney Nano develops neuromorphic sensor for RAAF", about an Australian group developing an impressive neuromorphic detector in three months, is a bit depressing.

Nearly three years ago, Akida chips were available on boards. I had thought bright nerds would buy them, run to their garages and start soldering. But where are the YouTube videos of brilliant devices? It is not that there are no commercial examples of Akida – there seem to be no working examples of any sort.

We have bragged of being three years ahead, but the peloton has caught us. What has gone wrong?

Nice try, dipshit.
 
  • Like
  • Haha
  • Love
Reactions: 33 users