BRN Discussion Ongoing

IloveLamp

Top 20
@DingoBorat @Proga
 
  • Haha
Reactions: 14 users

Tothemoon24

Top 20


Integrated visual communication is entering a new dimension with #MBUX Surround Navigation. It marries route guidance and driving assistance for a seamless real-time experience brought to life by powerful Unity 3D game engine graphics.

Facilitated by integration with MB.OS, this advanced feature pairs a real-world view of the car’s surroundings using the vehicle sensors. The driver benefits from significantly enhanced situational awareness and “seeing what the car sees.” Incorporation of information into the driver display provides everything the driver needs to know in a single glance. For instance, it shows other cars, vans, trucks, cyclists and even pedestrians and potential road hazards.

It superimposes route guidance into a realistic representation of the surroundings, which is very helpful in busy urban environments. The customer clearly sees exactly where the next turn will take them with easily recognizable buildings and infrastructure. The system also provides integrated visualizations of the car’s current status, including activated indicators, headlights and vehicle movement depicted by spinning wheels.

What cities would you like to test the upcoming MBUX Surround Navigation in first?

 
  • Like
  • Wow
Reactions: 17 users

7für7

Top 20
Did you just put some pictures together with your own text? 🤔
 
  • Haha
Reactions: 2 users
Excellent recent EE article espousing the need for, and use of, in-memory computing and SNN AI.



Revolutionizing AI Inference: Unveiling the Future of Neural Processing​

January 12, 2024 Virgile Javerliac
To overcome CPU and GPU limitations, hardware accelerators have been designed specifically for AI inference workloads, enabling highly efficient and optimized processing while minimizing energy consumption.

The AI industry encompasses a dynamic environment influenced by technological advancements, societal needs and regulatory considerations. Technological progress in machine learning, natural-language processing and computer vision has accelerated AI’s development and adoption. Societal demands for automation, personalization and efficiency across various sectors, including healthcare, finance and manufacturing, have further propelled the integration of AI technologies.

Additionally, the evolving regulatory landscape emphasizes the importance of ethical AI deployment, data privacy and algorithmic transparency, guiding the responsible development and application of AI systems.

The AI industry combines both training and inference processes to create and deploy AI solutions effectively. Both AI inference and AI training are integral components of the overall AI lifecycle, and their significance depends on the specific context and application. While AI training is crucial for developing and fine-tuning models by learning patterns and extracting insights from data, AI inference plays a vital role in utilizing these trained models to make real-time predictions and decisions. The growing importance of AI inference—more than 80% of AI tasks today—lies in its pivotal role in driving data-driven decision-making, personalized user experiences and operational efficiency across diverse industries.

Efficient AI inference implementation faces challenges concerning data availability, computational resources, algorithmic complexity, interpretability and regulatory compliance. Adapting to dynamic environments and managing scalability while controlling costs pose additional hurdles. Overcoming these challenges requires comprehensive strategies, including robust data management practices, advancements in hardware capabilities and algorithmic refinements. Developing explainable AI models and adhering to ethical and regulatory guidelines are crucial for building user trust and ensuring compliance. Furthermore, balancing resource allocation and cost management through efficient operational practices and technological innovations is essential for achieving sustainable and effective AI inference solutions across diverse industry sectors.

The pivotal role of AI inference

By automating tasks, enhancing predictive maintenance and enabling advanced analytics, AI inference optimizes processes, reduces errors and improves resource allocation. AI inference powers natural-language processing, improving communication and comprehension between humans and machines. Its impact on manufacturing includes predictive maintenance, quality control and supply chain management, fostering efficiency, reduced waste and enhanced product quality, highlighting its transformative influence on industry operations.

Industry challenges in sustainable AI inference

AI inference faces challenges concerning high energy consumption, intensive computational demands and real-time processing constraints, leading to increased operational costs and environmental impact. More than 60% of total AI power consumption comes from inference, and the increase of inference demands led to a 2.5× increase in data center capacity over two years (GAFA data). For servers, heat generation during intensive computations necessitates sophisticated cooling systems that further contribute to the overall energy consumption of AI processes.

Furthermore, balancing the need for efficient real-time processing with low-latency requirements, mandatory for servers, advanced driver-assistance systems (ADAS) or manufacturing applications, poses a significant challenge, requiring advanced hardware designs and optimized computational strategies. Prioritizing energy-efficient solutions—without compromising accuracy—with renewable energy sources and eco-friendly initiatives is crucial for mitigating the environmental impact of AI inference processes.

Classical AI inference hardware designs, using a CPU or GPU, face limitations in achieving energy efficiency due to the complexity and specificity of AI algorithms, leading to high power consumption (hundreds of watts per multi-core unit for servers). Inefficient data movement between processing units and memory further impacts energy efficiency and throughput; for instance, an access to external DRAM consumes 200× more energy than an access to local registers. Ultimately, due to higher computational demands, next-gen servers using a CPU and GPU could consume up to 1,000 W by 2025. Deploying AI inference on resource-constrained, battery-powered devices is even more challenging, as the most efficient CPU- and GPU-based designs, at 10 mW to a few watts, suffer from severe throughput limitations, constraining AI complexity and the final user experience. Balancing energy efficiency with performance and accuracy requirements necessitates careful tradeoffs during the design process, calling for comprehensive optimization strategies. Inadequate hardware support for complex AI workloads can hinder energy efficiency and performance.
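To make the data-movement point concrete, here is a back-of-the-envelope sketch in Python. Only the 200× DRAM-to-register ratio comes from the article; the absolute picojoule figure and the two-fetches-per-MAC model are illustrative assumptions.

```python
# Back-of-the-envelope fetch-energy model for AI inference.
# ASSUMPTIONS: 1 pJ per register access (illustrative) and two operand
# fetches per multiply-accumulate; the 200x DRAM penalty is from the article.
REGISTER_PJ = 1.0
DRAM_PJ = 200 * REGISTER_PJ

def fetch_energy_pj(mac_ops, dram_fraction):
    """Energy spent fetching operands for `mac_ops` MACs, where
    `dram_fraction` of fetches hit external DRAM and the rest stay local."""
    fetches = 2 * mac_ops
    return fetches * (dram_fraction * DRAM_PJ + (1 - dram_fraction) * REGISTER_PJ)

# Keeping operands near the compute units (the goal of near-/in-memory
# designs) cuts fetch energy dramatically:
naive = fetch_energy_pj(1_000_000, dram_fraction=0.5)   # half the fetches go to DRAM
local = fetch_energy_pj(1_000_000, dram_fraction=0.01)  # almost everything stays on-chip
print(f"{naive / local:.1f}x less fetch energy")
```

Under these assumed numbers, keeping 99% of fetches local is worth over 30× in fetch energy, which is the intuition behind the near-/in-memory techniques discussed below.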

The search for energy-efficient solutions

The industry’s escalating demand for energy-efficient AI inference solutions is driven by sustainability goals, cost-reduction objectives and new usages.

Businesses seek scalable and high-performance solutions to manage complex AI workloads without incurring excessive energy costs. On the other hand, energy-efficient AI inference would enable mobile and resource-constrained devices to perform complex tasks without draining the battery quickly while reducing the reliance on cloud-based processing, minimizing data transmission and latency issues. It will contribute to enhanced user experiences through new usages with advanced features, such as real-time language translation, personalized recommendations and accurate image recognition, fostering greater engagement and satisfaction.

Innovative contributions in AI inference

To overcome CPU and GPU limitations, innovative hardware accelerators have been designed specifically for AI inference workloads, enabling highly efficient and optimized processing while minimizing energy consumption. Such accelerators implement an optimized dataflow with dedicated operators (pooling, activation functions, normalization, etc.) used in AI applications. The core of the dataflow engine is the matrix-multiply unit, a large array of processing elements able to efficiently handle large matrix-vector multiplications, convolutions and other complex operations, since the majority of neural networks are built on matrix-multiply operations.
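As a toy illustration of why accelerators center on the matrix-multiply unit: a single dense layer of inference reduces to multiply-accumulates plus an activation. This pure-Python sketch (names are mine, for illustration) spells out the MACs that a processing-element array would execute in parallel.

```python
# One dense layer of inference: y = act(W @ x + b), written as explicit
# multiply-accumulates -- the operation a matrix-multiply unit's array of
# processing elements carries out in parallel.
def dense_layer(weights, bias, x, act=lambda v: max(v, 0.0)):
    out = []
    for row, b in zip(weights, bias):
        acc = b
        for w, xi in zip(row, x):
            acc += w * xi              # one MAC per (weight, input) pair
        out.append(act(acc))           # ReLU by default
    return out

W = [[0.5, -1.0],
     [2.0,  0.25]]
b = [0.0, -0.5]
print(dense_layer(W, b, [1.0, 2.0]))   # → [0.0, 2.0]
```

Convolutions and attention layers decompose into the same primitive, which is why one hardware block can serve most network architectures.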

To further optimize energy efficiency, AI accelerators have implemented new techniques, such as near-memory computing. Near-memory computing integrates processing elements within the memory subsystem, enabling faster data processing near the memory, thus reducing the energy consumption associated with data transfer. More recently, new approaches using “non-standard” techniques, such as in-memory computing or spiking neural networks (SNNs), are the most aggressive solutions to achieve highly energy-efficient AI inferences.

In-memory computing conducts computations directly within the memory, at circuit level, eliminating the need for data transfer and enhancing processing speed. The processing can be either performed in an analog or a digital way and implement different memory technologies, such as SRAM, flash or new NVM (RRAM, MRAM, PCRAM, FeFET, etc.). This approach is particularly beneficial for complex AI tasks involving large datasets. SNNs also represent an innovative approach to AI inference: They typically consist of interconnected nodes that communicate through spikes, enabling the simulation of complex temporal processes and event-based computations, which can be useful for tasks like processing time-sensitive data or simulating brain-like behavior.
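A minimal sketch of the event-driven idea behind SNNs, using a leaky integrate-and-fire neuron; the leak, weight and threshold values are illustrative and not tied to any particular chip.

```python
# Leaky integrate-and-fire neuron: the membrane potential integrates weighted
# input spikes, leaks between events, and fires (then resets) on crossing
# the threshold. Parameters are illustrative.
def lif_run(input_spikes, leak=0.9, weight=0.6, threshold=1.0):
    v = 0.0
    out = []
    for s in input_spikes:
        v = v * leak + weight * s      # decay, then integrate the event
        if v >= threshold:
            out.append(1)              # emit an output spike...
            v = 0.0                    # ...and reset the membrane
        else:
            out.append(0)
    return out

print(lif_run([1, 1, 0, 0, 1, 1]))     # → [0, 1, 0, 0, 0, 1]
print(lif_run([1, 0, 0, 0, 0, 1]))     # sparse input: the potential leaks away, no spike
```

Because computation happens only when spikes arrive, sparse inputs cost almost nothing, which is the source of the energy-efficiency claim for event-based hardware.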

Shaping the future of AI inference

AI accelerators leveraging near-/in-memory computing or SNNs offer significant impacts for the AI industry, including enhanced energy efficiency, improved processing speed and advanced pattern-recognition capabilities. These accelerators drive the optimization of hardware design, leading to the creation of specialized architectures tailored for specific AI workloads.

Additionally, they promote advancements in edge computing, facilitating efficient AI processing directly on edge devices and reducing latency. The transformative potential of these technologies highlights their crucial role in revolutionizing diverse industries, from healthcare and manufacturing to automotive and consumer electronics.

The integration of highly energy-efficient AI inference in healthcare and automotive sectors yields transformative impacts. In healthcare, it facilitates faster diagnostics and personalized patient care through rapid data analysis, leading to improved treatment outcomes and tailored medical interventions. Additionally, it enables the development of remote patient-monitoring systems, ensuring continuous health tracking and proactive intervention for individuals with chronic conditions. Moreover, in the realm of drug discovery, energy-efficient AI inference expedites the identification of potential drug candidates and accelerates pharmaceutical research and development processes, fostering innovation in medical treatments and therapies.

In the automotive industry, energy-efficient AI inference plays a crucial role in advancing safety features and autonomous-driving capabilities. It empowers vehicles with ADAS and real-time collision detection, enhancing overall road safety. Furthermore, it contributes to the development of self-driving technologies, enabling vehicles to make informed decisions based on real-time data analysis, thereby improving navigation systems and autonomous-driving functionalities. Additionally, the implementation of predictive maintenance solutions based on energy-efficient AI inference enables early detection of potential vehicle issues, optimizing performance, reducing downtime and extending vehicle lifespan.

Conclusion

The industry’s critical demand for energy-efficient AI inference solutions is driven by the need to promote sustainable operations, optimize resource utilization and extend device battery life. These solutions play a vital role in fostering eco-friendly practices, reducing operational costs and enhancing competitive advantages. By facilitating edge computing applications and minimizing energy consumption, energy-efficient AI inference solutions enable businesses to improve profitability, streamline processes and ensure uninterrupted functionality in mobile and IoT devices. Addressing this demand necessitates the development of energy-efficient algorithms and optimized hardware architectures heavily based on smart near-/in-memory computing techniques. Many new players come into the market with innovative computing solutions and the promise of running AI everywhere, from sensors to data centers, with the ambition of offering a completely new user experience.
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 24 users

Diogenese

Top 20
"ARM has released its newest batch of processors, Intel has fallen behind, and SiFive is the new boy. We know BRN's ecosystem encompasses all three, but I haven't seen anything to suggest MB is working with any of these."

@Diogenese

The CEO of ARM, Rene Haas, told an interviewer at CES in one of the videos I posted that every EV at CES had ARM chips in it, so there is that.

Apologies, I can't remember which post it was.

Edit: found it

Hi ILL,

Yes - I remember that. However, if I recall correctly, the water-cooled ADAS processor in the CLA concept car was said to be Nvidia's. As has been discussed above, there are 1,000 chips in a car, so there's plenty of room for ARM outside the ADAS processor.

I think MB marketing were making a virtue of necessity by highlighting the water-cooled processor with the sci-fi blue light.

It would be of advantage to MB to have a processor which does not need water cooling because that is just wasted energy. MB have said the ADAS processor is still in development, so they had to use what they had to hand for the CLA concept. I reckon they will be happy to see the back of the water-cooled processor by off-loading the heavy lifting to Akida.
 
  • Like
  • Fire
  • Love
Reactions: 25 users

wilzy123

Founding Member
Did you just put some pictures together with your own text? 🤔
Yeah... that's it.............

 
  • Haha
  • Like
Reactions: 9 users

IloveLamp

Top 20
Yeah... that's it.............

 
  • Haha
  • Like
Reactions: 10 users
Hi FF, see the link below from the BRN website. Mercedes gets a mention. I had not seen this link before today. It's titled:

Designing Smarter and Safer Cars with Essential AI​

Under the SoC chip drawing it says "AKIDA integrated into SoC".
I presume this may mean the SoC may be provided by a 3rd party, e.g. NVIDIA, Renesas, Microchip etc.
If this is the case, it will be a matter of watching the financials?
Correct me if I am wrong.
I am thinking that BRN may get involved in the design of AI applications as an extra source of revenue and a way of retaining customer loyalty?
Hi Manny
That link is one I have either missed or as you say may have just popped up.

Thank you very much for sharing it really needs to be read by everyone.

I am not sure regarding your last point, but it was a theme at CES that Brainchip and Edge Impulse were stressing making adoption easy because of the novelty of AKIDA.

Obviously the more sensible practical assistance Brainchip is able to offer the less intimidating adoption becomes.

Maybe @Diogenese has something to offer on this question.

My opinion only DYOR
Fact Finder
 
  • Like
  • Fire
Reactions: 20 users
I have been working my way through past posts trying to find all the predictions I’ve made that have not come true and misled others as alleged by someone here and repeatedly over at HC in recent weeks.

I have been motivated to do so as I must admit to being concerned that these comments might be taken as true if not rebuffed.

I am old fashioned which makes me value personal integrity and boring as hell on that subject.

It has been my view that I had avoided making predictions and always encouraged others to do their own research and never believe anything I or anyone else has to say until they have done so.

I have multiple times suggested that individuals always read the document or watch the interviews relied upon before accepting someone else’s anonymous opinion as to its meaning.

The recent Rob Telson interview and his comments regarding Mercedes Benz being a case in point.

There was a clear and immediate attempt here and at HC to mislead as to what was actually said and intended by Rob Telson.

This review has been made difficult because the words 'I predict' are not habitually used by me, so it is the vibe I have been looking for, as the absence of these words does not rule out that I may have used words that create the same impression. In other words, I have tried to be as objective as I can be.

So in 2019 - 2020 I think it reasonable to say the vibe I projected was that AKD1000 would be considered amazing technology. I was particularly drawn to the absence of heat generation and the lack of external cooling.

I was not the only one of course, so I am not claiming anything special about this prediction, but to my mind it has been vindicated by Edge Impulse describing AKD1000 as science fiction and all the present use cases being revealed by partners and researchers.

The technology reviewer in Forbes Magazine using 'mind boggling' to describe AKIDA 2.0 probably nails my prediction as having been proven.

I did make positive comments in 2020 around the announced relationships with Socionext, Ford, Valeo & NASA as well as the IP licence sale to Renesas. I personally do not consider I made predictions but if it were argued I did then as these relationships are ongoing and still in play and the results are not known it logically cannot be claimed that these were false predictions or mistakes.

In 2019 - 2020 following the 14 December, 2019 joint presentation by Tata Consulting Services (TCS) and Brainchip of gesture recognition after a great deal of research I think it could be said I predicted big things from this relationship. Others may disagree of course but the below reference in Tata Elxsi’s Quarterly Report seems to vindicate that prediction when added to the ongoing research results published by TCS.

Going into price predictions, I made three; the first two were reasonably close to the mark. The third was wrong, but some might consider that, as it was made a month before the world went to hell with an invasion of Ukraine, another minor global event and chip shortages, it was one which could be forgiven. I will remind all that my prediction of $2.75 by Christmas of that year was well below the highest prediction of 50-odd dollars as part of a guessing game being run among shareholders. Some might think that the descriptor 'Share Price Guessing Game' has some relevance as to the seriousness with which to take such predictions, unaccompanied as they were by any detailed explanation of how they were formulated.

Otherwise, in 2021 the closest things to predictions were firstly my comments around the significance of being aligned with NASA. I still maintain that NASA is a great endorsement of AKIDA technology; the relationship is of course still ongoing and has seen a number of other partnerships in the aerospace industry emerge: ISL, Intellisense Systems, ANT61, EDGX and Vorago, for example.

The second was my very positive comments about the IP sale to MegaChips. Again, the best part of $5 million in receipts from MegaChips and the ongoing commercial relationship would justify rejecting any claim that this prediction about its significance was a failure on my part.

This brings us to 2022 and 2023. My memory serves me well enough to say I have made no supported predictions around revenue or the share price during this period. I have of course been highly complimentary of partnerships such as ARM, SiFive, Intel, Edge Impulse, Tata Elxsi, Mercedes Benz and others; however, even if these complimentary statements could be interpreted as some form of prediction, the fact that they are still in play, and that sales pipelines for IP run out to years, means it is too early to judge if my predictions are flawed.

In 2023 I did post a list of companies and educational institutions that are engaged with Brainchip; however, if this list proves not to be correct, it would not be any error on my part, as I had a fairly aggressive email trail with Mr. Tony Dawe to obtain confirmation of the list's accuracy before I revealed it on TSEx.

Accordingly this list cannot be the source of any angst and indeed is now probably in need of an update with the inclusion of Microchip, Infineon and OnSemi.

Anyway taking into account my review of what I have posted I can call out all these claims as lies.

I have no idea why some find the need to adopt anonymous identities and lie about others online to disparage their character.

Then again there are lots of things people do which I do not understand like being cruel to animals, abusing children and the elderly and which also defy logical explanation.

I suspect it will have to remain a mystery unless they feel inclined to come out of the shadows and explain their motivations and detail with particularity the egregious and misleading posts I have made that they are so offended by and have compelled them to publish their false claims.

My opinion only DYOR
Fact Finder

Greetings FF.

Well said. Just wondering if this is the VIBE you refer to:ROFLMAO:

 
  • Haha
  • Love
Reactions: 6 users

TopCat

Regular
 
  • Like
  • Love
  • Fire
Reactions: 39 users

Diogenese

Top 20
Hi Manny,

I'll take "AI applications" to refer to models which act as the memory of the NN, equivalent to past experience or learning. At one level, you can think of the NN as comparing the input signals with the stored images, sounds, etc, in the model.

The models are used in configuring the NN, such as the number of layers, the number of nodes in a layer, the weights, and the connections between NPUs in adjacent layers (or in later layers, via long-range skip connections).

BRN has developed MetaTF to simplify adoption of Akida. It includes a software simulator for Akida and also has a system for CNN2SNN conversion so customers can convert their CNN models to SNN.

BRN also has a model library "zoo" which contains databases of images, speech, text, volatile organics, ...

There are many open source model libraries. In some cases, the user can use Akida's 1-shot learning to augment a pre-existing model to adapt it to the user's particular use case, or they may compile an entire model for their specific purpose. Assisting with this may be part of the after-sales service which BRN provides.
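As a toy analogue of the one-shot augmentation described above, nearest-prototype classification lets a single example define a brand-new class without retraining. The function names and feature vectors here are invented for illustration; this is not BrainChip's MetaTF API.

```python
# Toy one-shot learning via nearest-prototype classification: a single
# example's feature vector defines a new class, no retraining needed.
import math

prototypes = {"cat": [0.9, 0.1], "dog": [0.2, 0.8]}  # from a pre-trained base model

def learn_one_shot(label, feature):
    prototypes[label] = feature        # one example becomes the class prototype

def classify(feature):
    # Nearest prototype by Euclidean distance wins.
    return min(prototypes, key=lambda k: math.dist(prototypes[k], feature))

learn_one_shot("hamster", [0.5, 0.5])  # add a user-specific class on the fly
print(classify([0.48, 0.52]))          # → hamster
print(classify([0.85, 0.20]))          # → cat
```

The appeal for edge devices is that adding a class is a single memory write rather than a training run, which matches the "adapt a pre-existing model to the user's particular use case" idea above.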
 
Last edited:
  • Like
  • Fire
Reactions: 26 users

7für7

Top 20
This guy with the red shirt looks like Georgios Tsoukalos… “and WE HAVE the evidence that extraterrestrial life used akida THOUSANDS OF YEARS AGO… its PROOFED by ancient Sumerian texts which are still not discovered… but we know it!”
 
  • Haha
  • Like
Reactions: 5 users

IloveLamp

Top 20
This guy with the red shirt looks like Georgios Tsoukalos… “and WE HAVE the evidence that extraterrestrial life used akida THOUSANDS OF YEARS AGO… its PROOFED by ancient Sumerian texts which are still not discovered… but we know it!”


 
  • Like
Reactions: 2 users

GazDix

Regular
For the last year I have been looking forward to analysing the changes to the top 20 holders every quarter... Didn't arrive this time.

Sent an email to TD anyway. Either way, the answer back can't be good in my opinion.
 
  • Like
  • Fire
  • Thinking
Reactions: 5 users
Jesus Christ that was quite an outrage. Where should I even begin?

Regarding the excitement:
For example “Brilliant quarterly update” comes to my mind.

Fact Finder the bankruptcy and insolvency issue is noted and something that I learned today. I was in the wrong.

How you concluded that I posted something criminal, though, I have no idea.
A company with a -14,000% net profit margin in the penultimate 4C is *closer*, I repeat, closer! to insolvency than it is to being profitable. That's not rocket science and in no way something against the law.
Even our CEO stated that this year makes or breaks the company. Do you want to file a lawsuit because our own CEO acknowledges the situation we're in?

Altering what I've written, changing the context, and claiming that I said the company is *close*, close! to bankruptcy, hence insolvency: that's unlawful. That's spreading misinformation at its finest.

Do I live closer to Egypt than you do? Yes I do. Do I live close to Egypt? No I don’t.

How you concluded that we're partnered with Samsung is something you have to explain to us.
That’s what I’ve written “The issue is. If Samsung is not using us through ARM or whoever else it might be we compete against a company with its own foundries.”

Tell me, where do I claim a partnership with Samsung? Sounds made up to me.
Hell, I’ll even link it for your convenience.

If we’re not involved through a third party that uses Akida (Arm, Megachips or whoever else it might be), not partnership with Samsung, a third party, we’re gonna have some serious issues.
Stop claiming I said things I didn't. And for god's sake, stop altering my sentences to make them fit your narrative and changing the context of what I've written. That really pissed me off!

Seeing all the cheerleaders mindlessly agreeing with everything you say and lashing out against anyone who thinks otherwise makes me question why I’m even here. This place has turned into an echo chamber that just allows one single point of view. I hope that’s a wake up call for some.

I always regarded you highly, since you've been there since day one with great research, but threatening me, altering my sentences and the context of what I've written, and then suggesting I not continue in this vein has left an enormous dent.
 

  • Like
  • Fire
  • Love
Reactions: 10 users

stockduck

Regular
This is one of the most interesting article...

the more I read it, the more ideas comes in my mind in how many cases could be akida IP built in.

What if Valeo hasn't "lost the presumed business" on lidar systems with MB? Perhaps only the technology will be different, and they are now working together "to drive thermal management strategies of EVs for extended driving range and extended battery lifespan" with software from Valeo.

It is only speculation, but 2028 will be exciting I guess.

Read more at:
https://auto.economictimes.indiatim...ff0&utm_medium=smarpshare&utm_source=linkedin
 
Last edited:
  • Like
  • Fire
Reactions: 3 users

Tothemoon24

Top 20


January 2024 Newsletter​

BrainChip went all in at CES in Las Vegas, delivering demos that showcased the power of Akida™ and its partner ecosystem. The variety of use cases demonstrated, the partnership announcements, and collaboration at the event, all resulted in significant interest from the AI community and in highly productive meetings with new prospects and partners. The momentum is rapidly building for an exciting 2024. Read on!

Innovation Leadership​

"All Things AI" Podcast Captures Candid Insights from Leaders at CES​

Live at CES, the BrainChip team recorded exclusive interviews with AI industry leaders from Global Foundries, Teksun, Onsemi, Infineon Technologies, Tirias Research, Edge Impulse, and other top organizations. Learn about developments that are transforming the AI tech landscape from these insightful discussions.

BrainChip Successes in 2023​

2023 has wrapped, and we are proud to share our accomplishments. Commemorate with us some of the highlights of a banner year that saw outstanding product releases, multiple patent awards, expanded partner relationships, and a broader global presence that grew to include EMEA (Europe, Middle East and Africa).

BrainChip Achieves Another Patent Grant​

In December 2023, another BrainChip patent was granted, this time recognizing event-based pattern detection that enables learning in a digital hardware implementation of a spiking network. The U.S. grant further strengthens BrainChip's IP portfolio and reinforces the value of our innovative technology, which continues to push the boundaries of neuromorphic AI solutions.

Integration Partnerships​

Microchip’s 32-bit MPU Does More AI with BrainChip’s Akida, a Hit at CES​

BrainChip geared up at CES to spotlight the revolutionary capabilities of our Akida IP in conjunction with Microchip’s SAMv71 Ultra Board and SAMA7G54-EK Board in a specialized demonstration. This unique showcase shed light on widely used, always-on machine learning tasks and how MCUs and MPUs can be extended to scale the intelligence at the Sensor Edge.

BrainChip and NVISO Group Demonstrated AI-Enabled Human Behavioral Analysis at CES​

NVISO Group’s AI Human Behavioral Software and BrainChip’s Akida neuromorphic compute are an ideal combination for a system that monitors the state of the users through real-time perception and observation of head and body pose, eye tracking, and gaze. The result is an application that can accurately gauge emotion and provide insights into better decision-making in use cases like in-cabin safety. The live demo of this capability was a crowd-pleaser at CES 2024.

BrainChip and MYWAI Partner to Deliver Next-Generation Edge AI Solutions​

MYWAI, whose AIoT Platform for EaaS (Equipment as a Service) can stream, process, and manage multimodal sensor data, plans to leverage the real-time, energy-efficient AI computation of BrainChip’s Akida to improve response and scale. This strategic partnership is expected to accelerate the adoption of Edge AI in the industrial and robotic sectors.

Upcoming Events​

We hope you can join us at an industry event.
February 21, 2024 - IFS Direct Connect (San Jose, CA)
At this all-day Intel Foundry Services event in the San Jose McEnery Convention Center, we'll exchange information about how Edge AI can work with Intel’s Open System Foundry and share how Akida can benefit organizations’ supply chains and meet other business-critical challenges.
That’s just the beginning of this new year! We are filling up our events calendar with participation in numerous global industry events in 2024, including Embedded World Conference (Nuremberg), tinyML 2024 and many more.
Want to schedule a meeting with our team while you're at one of these events? Email sales@brainchip.com or meet with us.
 
  • Like
  • Fire
  • Love
Reactions: 33 users

IloveLamp

Top 20
  • Like
  • Fire
  • Love
Reactions: 18 users