BRN Discussion Ongoing

  • Haha
  • Like
Reactions: 5 users

Evermont

Stealth Mode
Brainchip redacted?

All the indications are that we are still working with Mercedes, but this Mercedes EQXX page no longer mentions Akida:

https://www.mercedes-benz.com/en/innovation/concept-cars/vision-eqxx-the-new-benchmark-of-effiency

In addition, the Motortrend article originally published in January 2022 is now dated March 2024:

https://www.motortrend.com/events/m...c_tw_social_MT_220103_sf252559662#sf252559662

Mercedes-Benz Vision EQXX Concept: Did You See This Coming, Elon?​

The establishment strikes back, as the stunning EV shows what a legacy automaker is capable of.
Angus MacKenzie, Writer, Mar 06, 2024

One sentence from that original 2022 article contains a reference to Akida hardware AND software. At the time, I took "software" to be a reference to speech models, but, in light of the more recent speculation that MB could be using Akida 2/TeNNs simulation software, I see this as adding weight to that hypothesis.

Mercedes engineers worked with California-based artificial-intelligence developer BrainChip to create systems based on the company's Akida hardware and software.

That led to the share price spike and LdN's "Nothing to see here!", so maybe there was an agreement to revert to stealth mode ...

If a company, all alone on the internet, says "We are using Akida" and no one hears it ...

I stumbled across this as I was compiling a chronology of MB's AI involvement:

2020 NVIDIA Orin:




June 23, 2020 – Mercedes-Benz, and NVIDIA, the global leader in accelerated computing, plan to enter into a cooperation to create a revolutionary in-vehicle computing system and AI computing infrastructure. Starting in 2024, this will be rolled out across the fleet of next-generation Mercedes-Benz vehicles, enabling them with upgradable automated driving functions.
...
Automated driving functions in future Mercedes-Benz cars will be powered by NVIDIA’s next generation DRIVE platform. The computer system-on-chip (SoC), called Orin, is based on the recently announced NVIDIA Ampere supercomputing architecture.


2022 Akida

Akida's association with MB appears 18 months after NVIDIA.

2023 ChatGPT


June 16, 2023 – Mercedes-Benz is further expanding the use of artificial intelligence and integrating it into the voice control of its vehicles as the next step. By adding ChatGPT, voice control via the MBUX Voice Assistant's Hey Mercedes will become even more intuitive. An optional beta programme will start June 16, 2023 in the U.S. for over 900,000 vehicles equipped with the MBUX infotainment system.

Mercedes-Benz MBUX Voice Assistant has already set industry standards and is known for its intuitive operation and a large command portfolio. Driver and passengers can receive sports and weather updates, have questions answered about their surroundings or even control their smart homes.

ChatGPT complements the existing intuitive voice control via Hey Mercedes.

The words used in relation to ChatGPT suggest it is augmenting "Hey Mercedes!", not replacing it.
Still in the original release @Diogenese

 
  • Like
  • Love
  • Fire
Reactions: 12 users

Diogenese

Top 20
Still in the original release @Diogenese

Thanks,

I checked a different page.

Afterthought: Electronic bookmarks aren't a patch on the good old Post-it note, although they do tend to clutter up the screen.
 
Last edited:
  • Like
Reactions: 5 users

Tothemoon24

Top 20
[three image attachments]
 
  • Like
  • Fire
  • Love
Reactions: 47 users

genyl

Member
Because he thinks it is exciting? The guy simply likes other posts that interest him. It's got nothing to do with brainfart. God damn, you need some sense of reality. And if you don't believe me, time will tell you that this has got nothing to do with BrainChip.
Now why would Mr Lewis like this??
 
  • Wow
  • Like
  • Sad
Reactions: 4 users

Slade

Top 20
I don’t like putting people on ignore but some make the list with a single post.
 
  • Like
  • Haha
  • Fire
Reactions: 28 users

Labsy

Regular
I don’t like putting people on ignore but some make the list with a single post.
Ditto...I squished that blow fly
 
  • Like
  • Haha
  • Fire
Reactions: 13 users

Quercuskid

Regular
  • Haha
  • Like
Reactions: 14 users

Gazzafish

Regular
Great read and really confirms the future in my mind. Right product. Right time 👍


Generative AI and Edge Computing: Unleashing LLMs at the Edge​

In the rapidly evolving landscape of artificial intelligence, two transformative technologies are converging to reshape the future of computing: generative AI and edge computing. As C-suite executives, understanding this intersection is crucial for staying ahead in an increasingly AI-driven world.

The Power of Generative AI Meets the Agility of Edge Computing​

Generative AI, particularly Large Language Models (LLMs), has demonstrated unprecedented capabilities in natural language processing, content creation, and problem-solving. However, these models have traditionally required substantial computational resources, limiting their deployment to cloud-based infrastructures.

Enter edge computing, a paradigm that brings computation and data storage closer to the point of need. By combining generative AI with edge computing, we're on the cusp of a revolution that could democratize access to advanced AI capabilities.

The Challenge: Bringing LLMs to Resource-Constrained Devices​

Deploying LLMs on edge devices presents significant challenges:

  1. Computational Constraints: Edge devices often have limited processing power and memory.
  2. Energy Efficiency: Many edge devices operate on battery power, requiring energy-efficient AI solutions.
  3. Model Size: LLMs can be several gigabytes in size, far exceeding the storage capacity of many edge devices (a rough calculation follows this list).
  4. Real-time Performance: Edge applications often require low-latency responses, challenging for complex AI models.
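
To put the model-size and memory constraints in perspective, here is a rough back-of-envelope calculation in Python for a hypothetical 7-billion-parameter model. The parameter count and precisions are illustrative assumptions, not figures for any particular product:

```python
# Rough weights-only memory footprint of a hypothetical 7B-parameter LLM
# (activations and KV cache excluded). PARAMS is an assumed value for illustration.
PARAMS = 7e9

bytes_per_param = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

for precision, nbytes in bytes_per_param.items():
    gib = PARAMS * nbytes / 2**30
    print(f"{precision}: ~{gib:.1f} GiB")

# Prints roughly: fp32 ~26.1 GiB, fp16 ~13.0 GiB, int8 ~6.5 GiB, int4 ~3.3 GiB.
# Even at 4-bit precision the weights alone strain phone-class storage and RAM,
# which is why the compression techniques described below matter.
```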

Innovative Solutions: Making the Impossible Possible​

Despite these challenges, innovative approaches are emerging to bring the power of LLMs to the edge:

1. Model Compression Techniques​

  • Quantization: Reducing the precision of model parameters without significant loss in accuracy (a minimal sketch follows this list).
  • Pruning: Removing unnecessary connections in neural networks to reduce model size.
  • Knowledge Distillation: Creating smaller, faster models that mimic the behavior of larger ones.
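
As an illustration of the quantization point above, here is a minimal, self-contained sketch of symmetric 8-bit weight quantization using NumPy. It is not any vendor's actual toolchain (frameworks such as PyTorch and TensorFlow Lite ship production implementations); the function names and toy weight matrix are made up for the example:

```python
import numpy as np

def quantize_int8(weights):
    """Symmetric per-tensor quantization: float32 weights -> int8 values plus one scale."""
    scale = float(np.max(np.abs(weights))) / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximate float32 copy of the original weights."""
    return q.astype(np.float32) * scale

w = np.random.randn(4, 4).astype(np.float32)   # toy weight matrix (placeholder)
q, scale = quantize_int8(w)
print("int8 bytes:", q.nbytes, "vs float32 bytes:", w.nbytes)          # 16 vs 64
print("max rounding error:", np.max(np.abs(w - dequantize(q, scale))))
```

The int8 copy needs a quarter of the float32 memory at the cost of a small, bounded rounding error per weight; pruning and distillation shrink the model further along different axes.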

2. Specialized Hardware​

  • AI Accelerators: Custom chips designed for efficient AI computations on edge devices.
  • Neuromorphic Computing: Brain-inspired architectures that promise higher energy efficiency for AI tasks.

3. Distributed AI Architectures​

  • Federated Learning: Enabling edge devices to collaboratively learn a shared model while keeping data local (sketched in the code after this list).
  • Split Inference: Dividing model layers between edge devices and the cloud to balance computation.
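
To make the federated-learning item concrete, below is a minimal sketch of federated averaging (FedAvg) in NumPy. The linear-regression model, the synthetic per-device data and the round count are placeholder assumptions for illustration; real deployments use dedicated frameworks such as TensorFlow Federated or Flower:

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One device's local step: plain linear regression trained by gradient descent."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
global_w = np.zeros(3)
# Four simulated edge devices, each holding its own private (X, y) data.
devices = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(4)]

for _ in range(10):                     # communication rounds
    # Each device trains locally, starting from the current global model,
    local_models = [local_update(global_w, X, y) for X, y in devices]
    # and only the weights travel back to be averaged; raw data never leaves a device.
    global_w = np.mean(local_models, axis=0)

print("global model after 10 rounds:", global_w)
```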

4. Adaptive AI Models​

The Business Impact: Why C-Suite Executives Should Care​

The convergence of generative AI and edge computing isn't just a technological marvel—it's a game-changer for businesses across industries:

  1. Enhanced Privacy and Security: Processing sensitive data locally reduces the risk of data breaches and complies with data regulations.
  2. Reduced Latency: Real-time AI responses enable new use cases in robotics, autonomous vehicles, and IoT.
  3. Cost Efficiency: Decreasing reliance on cloud infrastructure can significantly reduce operational costs.
  4. Improved Reliability: Edge AI continues to function even with unreliable network connections.
  5. New Market Opportunities: Enabling AI on resource-constrained devices opens up new product categories and markets.

Industry Applications: The Future is Now​

The impact of edge-based generative AI is already being felt across various sectors:

  1. Healthcare: AI-powered diagnostic tools running on handheld devices, bringing advanced healthcare to remote areas.
  2. Manufacturing: Real-time quality control and predictive maintenance powered by on-device AI.
  3. Retail: Personalized shopping experiences delivered through AI-enabled point-of-sale systems.
  4. Automotive: Advanced driver assistance systems (ADAS) with on-board natural language processing.
  5. Smart Cities: Intelligent traffic management and public safety systems operating at the edge.

Strategies for C-Suite Executives​

To capitalize on this technological convergence, consider the following strategies:

  1. Invest in R&D: Allocate resources to explore edge AI solutions tailored to your industry.
  2. Foster Partnerships: Collaborate with tech leaders and startups specializing in edge AI and hardware acceleration.
  3. Rethink Data Strategy: Develop a comprehensive edge data strategy that balances centralized and decentralized approaches.
  4. Upskill Your Workforce: Invest in training programs to build internal capabilities in edge AI development and deployment.
  5. Pilot Projects: Initiate small-scale edge AI projects to gain practical insights and demonstrate value.
  6. Prioritize Security: Implement robust security measures for edge devices running AI models.
  7. Stay Informed: Keep abreast of advancements in model compression and edge AI hardware.

The Road Ahead: Challenges and Opportunities​

While the potential of edge-based generative AI is immense, challenges remain:

  1. Standardization: The lack of industry standards for edge AI could lead to fragmentation.
  2. Ethical Considerations: Ensuring responsible AI practices on distributed edge devices.
  3. Integration Complexity: Seamlessly integrating edge AI with existing cloud and on-premises infrastructure.
However, these challenges also present opportunities for forward-thinking organizations to lead and shape the future of AI at the edge.

Conclusion: Seizing the Edge AI Opportunity​

The convergence of generative AI and edge computing represents a pivotal moment in the evolution of artificial intelligence. By bringing the power of LLMs to resource-constrained devices, we're unlocking new possibilities that were once thought impossible.

As C-suite executives, the time to act is now. Those who successfully navigate this technological shift will not only drive innovation within their organizations but also play a crucial role in shaping the future of computing.

The edge AI revolution is here. Are you ready to lead from the front?
 
  • Like
  • Fire
  • Love
Reactions: 46 users

[image attachment]
 
  • Like
  • Fire
  • Wow
Reactions: 14 users
I remember Sean talking about C-suite executives

As C-suite executives, the time to act is now. Those who successfully navigate this technological shift will not only drive innovation within their organizations but also play a crucial role in shaping the future of computing.
 
  • Like
  • Love
Reactions: 14 users
I don’t like putting people on ignore but some make the list with a single post.
Actually, I think Tony Lewis liked it for the simple reason that he can use Microsoft Eureka to help evaluate the very small and tiny SOTA (state-of-the-art) language models that he and the BrainChip team have developed and are developing.

So it is not directly an involvement with BrainChip, as Genyl so intelligently stated, but a tool to help with our company's progress.
 
  • Like
  • Wow
  • Fire
Reactions: 15 users

The Pope

Regular
Because he thinks it is exciting? The guy simply likes other posts that interest him. It's got nothing to do with brainfart. God damn, you need some sense of reality. And if you don't believe me, time will tell you that this has got nothing to do with BrainChip.
Looks like a special someone got out of bed on the wrong side this morning. Chip up, buttercup. Hope you have a good day.
 
Last edited:
  • Like
  • Haha
Reactions: 14 users
Thanks,

I checked a different page.
The Motortrend article still has it as well; it's just been moved to its own section near the end of the article.
 
  • Like
Reactions: 5 users

Xray1

Regular
Thanks manny,

Like most of us, I keep wondering "Are we nearly there yet?"
IMO..... I was just wondering if Akida 2 (ESP) and TENNs could be or are still in R & D mode till sometime in the 1st Quarter of 2025 :) :)
 
  • Like
Reactions: 1 user
[GIF]

But

[GIF]

To hold this stock
 
  • Like
  • Haha
Reactions: 10 users

manny100

Regular
IMO..... I was just wondering if Akida 2 (ESP) and TENNs could be or are still in R & D mode till sometime in the 1st Quarter of 2025 :) :)
Gen 2 should turn into Gen 3 sometime later next year. Looking forward to that. See the AGM replay for details: Tony Lewis speaks from around the 20-minute mark and has a few slides. He addresses Gen 2, TENNs and the future.
Tony describes TENNs as unique and the Swiss Army knife of neural networks. He says TENNs can be integrated into daily life and the uses are almost limitless.
R&D will see continuous improvements to TENNs. We have already seen the first improvement/expansion of TENNs in the pending patent aTENNuate.
See the Future slide in Tony's presentation for a guide to the plans.
Link to AGM below.
 
  • Like
  • Love
Reactions: 24 users
IMO..... I was just wondering if Akida 2 (ESP) and TENNs could be or are still in R & D mode till sometime in the 1st Quarter of 2025 :) :)
TENNs is a "part" of AKIDA 2.0 IP and also available independent of it.

Both are available commercially.
And both are still being developed.

Technology doesn't stand still, unless it's left to..
 
  • Like
Reactions: 7 users
Gen 2 should turn into Gen 3 sometime later next year. Looking forward to that. See the AGM replay for details: Tony Lewis speaks from around the 20-minute mark and has a few slides. He addresses Gen 2, TENNs and the future.
Tony describes TENNs as unique and the Swiss Army knife of neural networks. He says TENNs can be integrated into daily life and the uses are almost limitless.
R&D will see continuous improvements to TENNs. We have already seen the first improvement/expansion of TENNs in the pending patent aTENNuate.
See the Future slide in Tony's presentation for a guide to the plans.
Link to AGM below.

Except I think Gen 3, or AKIDA 3.0 IP, will be distinct from 2.0 and not just an evolution of it, supplanting it..
 
  • Like
Reactions: 2 users

Diogenese

Top 20
Except I think Gen 3, or AKIDA 3.0 IP, will be distinct from 2.0 and not just an evolution of it, supplanting it..
Trouble is, we keep trumping our own aces.
 
  • Like
  • Fire
  • Haha
Reactions: 16 users