BRN Discussion Ongoing

Unfortunately, that same exact sentence has been regurgitated quite a few times over the last two years. It's nothing new. Example from a Feb '23 article below:
View attachment 69465
 
  • Haha
  • Like
  • Thinking
Reactions: 8 users

Rach2512

Regular
Unfortunately, that same exact sentence has been regurgitated quite a few times over the last two years. It's nothing new. Example from a Feb '23 article below:
View attachment 69465
So if it's over two years old, we must be getting very close. Thanks for sharing 👍
 
  • Like
  • Fire
  • Love
Reactions: 23 users

IloveLamp

Top 20
  • Like
  • Fire
  • Love
Reactions: 48 users
Now why would Mr Lewis like this???
 

Attachments

  • IMG_4565.png
  • Like
  • Wow
  • Fire
Reactions: 12 users

Diogenese

Top 20
Brainchip redacted?

All the indications are that we are still working with Mercedes, but this Mercedes EQXX page no longer mentions Akida:

https://www.mercedes-benz.com/en/innovation/concept-cars/vision-eqxx-the-new-benchmark-of-effiency

In addition, the contemporaneous Motortrend article from January 2022 is now dated March 2024:

https://www.motortrend.com/events/m...c_tw_social_MT_220103_sf252559662#sf252559662

Mercedes-Benz Vision EQXX Concept: Did You See This Coming, Elon?​

The establishment strikes back, as the stunning EV shows what a legacy automaker is capable of.
Angus MacKenzie, Writer | Mar 06, 2024

One sentence from that article (originally published in 2022) contains a reference to Akida hardware AND software. At the time, I took "software" to be a reference to speech models, but, in light of the more recent speculation that MB could be using Akida 2/TeNNs simulation software, I see this as adding weight to that hypothesis.

Mercedes engineers worked with California-based artificial-intelligence developer BrainChip to create systems based on the company's Akida hardware and software.

That led to the share price spike and LdN's "Nothing to see here!", so maybe there was an agreement to revert to stealth mode ...

If a company, all alone on the internet, says "We are using Akida" and no one hears it ...

I stumbled across this as I was compiling a chronology of MB's AI involvement:

2020 NVIDIA Orin:




June 23, 2020 – Mercedes-Benz, and NVIDIA, the global leader in accelerated computing, plan to enter into a cooperation to create a revolutionary in-vehicle computing system and AI computing infrastructure. Starting in 2024, this will be rolled out across the fleet of next-generation Mercedes-Benz vehicles, enabling them with upgradable automated driving functions.
...
Automated driving functions in future Mercedes-Benz cars will be powered by NVIDIA’s next generation DRIVE platform. The computer system-on-chip (SoC), called Orin, is based on the recently announced NVIDIA Ampere supercomputing architecture.


2022 Akida

Akida's association with MB appears 18 months after NVIDIA's.

2023 ChatGPT


June 16, 2023 – Mercedes-Benz is further expanding the use of artificial intelligence and integrating it into the voice control of its vehicles as the next step. By adding ChatGPT, voice control via the MBUX Voice Assistant's Hey Mercedes will become even more intuitive. An optional beta programme will start June 16, 2023 in the U.S. for over 900,000 vehicles equipped with the MBUX infotainment system.

Mercedes-Benz MBUX Voice Assistant has already set industry standards and is known for its intuitive operation and a large command portfolio. Driver and passengers can receive sports and weather updates, have questions answered about their surroundings or even control their smart homes.

ChatGPT complements the existing intuitive voice control via Hey Mercedes.

The words used in relation to ChatGPT suggest it is augmenting "Hey Mercedes!", not replacing it.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 41 users

Flenton

Regular
  • Like
  • Thinking
Reactions: 4 users

Monkeymandan

Regular
Unfortunately, that same exact sentence has been regurgitated quite a few times over the last two years. It's nothing new. Example from a Feb '23 article below:
View attachment 69465
And there was me thinking I could finally make a worthwhile contribution after lurking in the background the last 18 months with nothing valuable to say. 😬

While I’m here though - thanks to all the regular contributors and in particular those who share technical insight. I am convinced holders patience will be rewarded within the next 12 months.
 
  • Like
  • Fire
  • Love
Reactions: 33 users

manny100

Regular
Brainchip redacted?

All the indications are that we are still working with Mercedes, but this Mercedes EQXX page no longer mentions Akida ...
For interest, see below a presentation from about six months ago in which Sean talks about the Mercedes involvement, subject to an NDA, from around the 6-minute mark.
He also talks about the 1000 and 1500 chips and the reasons for producing them in silicon.
Overall a very interesting presentation, well worth a watch.
I think we can be comfortable that Merc is still involved with BRN. My impression is that it is for the futuristic Merc and not, at this stage, for production-line cars.
Brainchip (ASX:BRN): All systems go in 2024! (youtube.com)
 
  • Like
  • Fire
  • Love
Reactions: 19 users

DK6161

Regular
  • Haha
Reactions: 1 users

Diogenese

Top 20
For interest, see below a presentation from about six months ago in which Sean talks about the Mercedes involvement, subject to an NDA, from around the 6-minute mark ...
Thanks manny,

Like most of us, I keep wondering "Are we nearly there yet?"
 
  • Like
  • Love
Reactions: 20 users

Evermont

Stealth Mode
Brainchip redacted?

All the indications are that we are still working with Mercedes, but this Mercedes EQXX page no longer mentions Akida ...
Still in the original release, @Diogenese.

 
  • Like
  • Love
  • Fire
Reactions: 12 users

Diogenese

Top 20
Still in the original release, @Diogenese.

Thanks,

I checked a different page.

Afterthought: Electronic bookmarks aren't a patch on the good old Post-it note, although they do tend to clutter up the screen.
 
Last edited:
  • Like
Reactions: 5 users

Tothemoon24

Top 20
IMG_9580.jpeg
IMG_9581.jpeg
IMG_9582.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 47 users

genyl

Member
Now why would Mr Lewis like this???
Because he thinks it is exciting? The guy simply likes posts that interest him. It's got nothing to do with brainfart. God damn, you need some sense of reality. And if you don't believe me, time will tell you that this has nothing to do with Brainchip.
 
  • Wow
  • Like
  • Sad
Reactions: 4 users

Slade

Top 20
I don’t like putting people on ignore but some make the list with a single post.
 
  • Like
  • Haha
  • Fire
Reactions: 28 users

Labsy

Regular
I don’t like putting people on ignore but some make the list with a single post.
Ditto...I squished that blow fly
 
  • Like
  • Haha
  • Fire
Reactions: 13 users

Quercuskid

Regular
  • Haha
  • Like
Reactions: 14 users

Gazzafish

Regular
Great read, and it really confirms the future in my mind. Right product, right time 👍


Generative AI and Edge Computing: Unleashing LLMs at the Edge​

In the rapidly evolving landscape of artificial intelligence, two transformative technologies are converging to reshape the future of computing: generative AI and edge computing. For C-suite executives, understanding this intersection is crucial to staying ahead in an increasingly AI-driven world.

The Power of Generative AI Meets the Agility of Edge Computing​

Generative AI, particularly Large Language Models (LLMs), has demonstrated unprecedented capabilities in natural language processing, content creation, and problem-solving. However, these models have traditionally required substantial computational resources, limiting their deployment to cloud-based infrastructures.

Enter edge computing, a paradigm that brings computation and data storage closer to the point of need. By combining generative AI with edge computing, we're on the cusp of a revolution that could democratize access to advanced AI capabilities.

The Challenge: Bringing LLMs to Resource-Constrained Devices​

Deploying LLMs on edge devices presents significant challenges:

  1. Computational Constraints: Edge devices often have limited processing power and memory.
  2. Energy Efficiency: Many edge devices operate on battery power, requiring energy-efficient AI solutions.
  3. Model Size: LLMs can be several gigabytes in size, far exceeding the storage capacity of many edge devices (see the rough footprint calculation after this list).
  4. Real-time Performance: Edge applications often require low-latency responses, which is challenging for complex AI models.
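
To put the model-size point in perspective, here is a rough back-of-the-envelope sketch. The 7-billion-parameter figure is purely an illustrative assumption, not a reference to any particular model or product.

Code:
# Rough weight-memory estimate for an LLM at different numeric precisions.
# The 7-billion-parameter count is an illustrative assumption only.
PARAMS = 7_000_000_000
BYTES_PER_PARAM = {
    "fp32": 4.0,   # full precision
    "fp16": 2.0,   # half precision
    "int8": 1.0,   # 8-bit quantized
    "int4": 0.5,   # 4-bit quantized
}
for fmt, nbytes in BYTES_PER_PARAM.items():
    print(f"{fmt}: ~{PARAMS * nbytes / 2**30:.1f} GiB of weights")
# fp32 ~26.1 GiB, fp16 ~13.0 GiB, int8 ~6.5 GiB, int4 ~3.3 GiB:
# only the aggressively quantized variants come close to fitting a typical edge device.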

Innovative Solutions: Making the Impossible Possible​

Despite these challenges, innovative approaches are emerging to bring the power of LLMs to the edge:

1. Model Compression Techniques​

  • Quantization: Reducing the precision of model parameters without significant loss in accuracy (see the sketch after this list, which also illustrates pruning).
  • Pruning: Removing unnecessary connections in neural networks to reduce model size.
  • Knowledge Distillation: Creating smaller, faster models that mimic the behavior of larger ones.
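
To make the first two ideas concrete, here is a minimal, framework-agnostic Python/NumPy sketch of symmetric int8 quantization and magnitude pruning applied to a random weight matrix. It illustrates the general techniques only, not any vendor's implementation.

Code:
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(scale=0.1, size=(256, 256)).astype(np.float32)

# Quantization: map float32 weights to int8 with a single per-tensor scale.
scale = np.abs(weights).max() / 127.0
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
dequantized = q.astype(np.float32) * scale
print("quantization mean abs error:", np.abs(weights - dequantized).mean())
print("memory:", weights.nbytes, "bytes ->", q.nbytes, "bytes")

# Pruning: zero out the smallest-magnitude 80% of weights.
threshold = np.quantile(np.abs(weights), 0.80)
pruned = np.where(np.abs(weights) >= threshold, weights, 0.0)
print("sparsity after pruning:", (pruned == 0).mean())

# In practice the compressed model is fine-tuned afterwards to recover accuracy;
# knowledge distillation goes further and trains a small student to mimic a large teacher.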

2. Specialized Hardware​

  • AI Accelerators: Custom chips designed for efficient AI computations on edge devices.
  • Neuromorphic Computing: Brain-inspired architectures that promise higher energy efficiency for AI tasks (a toy spiking-neuron illustration follows this list).
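
For a flavour of why event-driven (neuromorphic) designs can be so frugal, below is a toy leaky integrate-and-fire neuron: work only happens when input events arrive, and the neuron stays silent otherwise. This is the generic textbook model, not a description of any specific chip or architecture.

Code:
# Toy leaky integrate-and-fire (LIF) neuron, purely illustrative.
def lif_neuron(input_spikes, weight=0.6, leak=0.9, threshold=1.0):
    potential = 0.0
    output = []
    for s in input_spikes:
        potential = potential * leak + weight * s   # integrate incoming events, leak over time
        if potential >= threshold:                  # fire and reset once the threshold is crossed
            output.append(1)
            potential = 0.0
        else:
            output.append(0)
    return output

events = [0, 1, 0, 1, 1, 0, 0, 1, 1, 1]             # sparse, event-based input
print(lif_neuron(events))                           # -> [0, 0, 0, 1, 0, 0, 0, 1, 0, 1]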

3. Distributed AI Architectures​

  • Federated Learning: Enabling edge devices to collaboratively learn a shared model while keeping data local (a minimal averaging sketch follows this list).
  • Split Inference: Dividing model layers between edge devices and the cloud to balance computation.
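
To illustrate the federated idea, here is a minimal sketch of FedAvg-style aggregation: each device trains locally and shares only its weights, which the server averages in proportion to local dataset size. The device count, weight vectors and sample counts below are made-up illustrative values; real deployments add secure aggregation, compression and much more.

Code:
import numpy as np

def federated_average(client_weights, client_sizes):
    # Weighted average of client model parameters, weighted by local dataset size.
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Three hypothetical edge devices with locally trained weight vectors of the same shape.
clients = [np.array([0.10, 0.90]), np.array([0.20, 0.70]), np.array([0.15, 0.80])]
sizes = [1000, 4000, 5000]                 # local training samples per device (illustrative)
print(federated_average(clients, sizes))   # global model, learned without pooling raw data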

4. Adaptive AI Models​

The Business Impact: Why C-Suite Executives Should Care​

The convergence of generative AI and edge computing isn't just a technological marvel—it's a game-changer for businesses across industries:

  1. Enhanced Privacy and Security: Processing sensitive data locally reduces the risk of data breaches and complies with data regulations.
  2. Reduced Latency: Real-time AI responses enable new use cases in robotics, autonomous vehicles, and IoT.
  3. Cost Efficiency: Decreasing reliance on cloud infrastructure can significantly reduce operational costs.
  4. Improved Reliability: Edge AI continues to function even with unreliable network connections.
  5. New Market Opportunities: Enabling AI on resource-constrained devices opens up new product categories and markets.

Industry Applications: The Future is Now​

The impact of edge-based generative AI is already being felt across various sectors:

  1. Healthcare: AI-powered diagnostic tools running on handheld devices, bringing advanced healthcare to remote areas.
  2. Manufacturing: Real-time quality control and predictive maintenance powered by on-device AI.
  3. Retail: Personalized shopping experiences delivered through AI-enabled point-of-sale systems.
  4. Automotive: Advanced driver assistance systems (ADAS) with on-board natural language processing.
  5. Smart Cities: Intelligent traffic management and public safety systems operating at the edge.

Strategies for C-Suite Executives​

To capitalize on this technological convergence, consider the following strategies:

  1. Invest in R&D: Allocate resources to explore edge AI solutions tailored to your industry.
  2. Foster Partnerships: Collaborate with tech leaders and startups specializing in edge AI and hardware acceleration.
  3. Rethink Data Strategy: Develop a comprehensive edge data strategy that balances centralized and decentralized approaches.
  4. Upskill Your Workforce: Invest in training programs to build internal capabilities in edge AI development and deployment.
  5. Pilot Projects: Initiate small-scale edge AI projects to gain practical insights and demonstrate value.
  6. Prioritize Security: Implement robust security measures for edge devices running AI models.
  7. Stay Informed: Keep abreast of advancements in model compression and edge AI hardware.

The Road Ahead: Challenges and Opportunities​

While the potential of edge-based generative AI is immense, challenges remain:

  1. Standardization: The lack of industry standards for edge AI could lead to fragmentation.
  2. Ethical Considerations: Ensuring responsible AI practices on distributed edge devices.
  3. Integration Complexity: Seamlessly integrating edge AI with existing cloud and on-premises infrastructure.
However, these challenges also present opportunities for forward-thinking organizations to lead and shape the future of AI at the edge.

Conclusion: Seizing the Edge AI Opportunity​

The convergence of generative AI and edge computing represents a pivotal moment in the evolution of artificial intelligence. By bringing the power of LLMs to resource-constrained devices, we're unlocking new possibilities that were once thought impossible.

As C-suite executives, the time to act is now. Those who successfully navigate this technological shift will not only drive innovation within their organizations but also play a crucial role in shaping the future of computing.

The edge AI revolution is here. Are you ready to lead from the front?
 
  • Like
  • Fire
  • Love
Reactions: 46 users