BRN Discussion Ongoing

Bravo

If ARM was an arm, BRN would be its biceps💪!
For some reason that picture reminded me of this, the Era and the cigarette, I guess..

Ahh, I'm really looking forward to holidaying in Ipanema one day and feeling the blissful cutting sting of being ignored by beautiful women...




Well you’re wrong there because I go to Ipanema all the time and I would never ignore you!


1702990589312.gif
 
  • Haha
  • Like
  • Love
Reactions: 13 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Evening all
As always, thanks for most of your contributions. Excited as ever.
Just wondered what we thought the chances were of Mercedes lifting the lid on the Concept CLA Class with Akida under the boot at CES.
I think there’s a decent chance @Colorado23. Mercedes have confirmed there will be press updates and sessions covering new digital technology and collaborations with industry leading partners on the 9th of Jan, so I have my fingers, toes and eyes crossed.🤞
 
  • Like
  • Fire
  • Love
Reactions: 26 users

Diogenese

Top 20
I think there’s a decent chance @Colorado23. Mercedes have confirmed there will be press updates and sessions covering new digital technology and collaborations with industry leading partners on the 9th of Jan, so I have my fingers, toes and eyes crossed.🤞
If you had your eyes crossed, it's not surprising your ties are crossed.
 
  • Haha
  • Like
  • Love
Reactions: 7 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
If you had your eyes crossed, it's not surprising your ties are crossed.
He-he-he! Yes, I uncrossed my eyes for a split second and discovered I had typed “ties” instead of “toes”, so I edited my post. Note to self: no typing when eyes are crossed.
 
Last edited:
  • Haha
  • Like
  • Love
Reactions: 9 users
Not mentioning BRN but good blog post by Everest Group talking up neuromorphic.

Everest Group provides strategic research insights on IT, business process, and engineering services and is a leading global BPO research firm.



“IT in a Box” Edge Model: The Next Frontier of Edge Computing | Blog

DECEMBER 15, 2023
Edge computing has great potential beyond local data centers. By integrating the Internet of Things (IoT), Artificial Intelligence/Machine Learning (AI/ML), and neuromorphic chips with edge computing, a revolutionary shift toward a comprehensive distributed cloud model, “IT in a Box,” could be on the horizon. Learn about the 3E design principles of this advanced edge model and its many benefits in this blog, and feel free to reach out to us to explore this topic further.

Edge locations are often associated with local data centers and primarily involve deploying idle servers closer to end users, facilitating data localization, and minimizing latency. However, a critical question arises: Does the existing edge solution offer differentiation from a conventional data center? The answer, unfortunately, is a resounding no.

Conventional edge model = data center

Today, edge locations are commonly perceived as sheer extensions of availability zones, employed to reduce latency through data localization. Despite major players’ efforts to integrate edge with advanced technologies, questions persist about the processing capabilities and scalability of these edge locations and more.

  • Hyperscalers integrate IoT, AI/ML, and next-generation security and network capabilities with edge, yet questions linger about the potential of these edge locations
  • Telecom providers leverage 5G to enhance edge networks and deploy radios and computing capabilities but again face deployment and scalability challenges
  • Technology vendors also struggle with enhancing processing, managing complex edge devices, storing data at edge locations, and developing industry-specific use cases. Still, the question remains, is that all an edge location could do?

Deficiencies in the conventional edge model

While efforts have consistently been made to enhance the intelligent edge, the current edge model falls short in establishing distinct features that could elevate it beyond its current limitations. The prevailing challenges associated with the edge include:

  1. The proliferation of edge locations around the globe has inadvertently led to increased real estate demands, hardware costs, energy consumption, and carbon emissions
  2. Due to limited edge storage and processing capabilities, there is a constant need to shuttle data back and forth between the edge and centralized cloud data centers. This poses a significant hurdle in use cases requiring real-time decision-making
  3. The distributed nature of the edge environment also adds complexity to management and orchestration

Reimagining the edge model beyond a local data center

Edge computing’s promise extends far beyond a “mere data center in your neighborhood.” The current issues require an AI and IoT integrated edge with substantial data processing, large storage capacity, efficient network connectivity, and tight security. This type of solution should replicate at scale and thwart modern cybersecurity threats, all while delivering superior speed information to the end user.

Enter the game-changer in next-generation computation: neuromorphic chips. These chips process information in a human brain-like manner, offering a revolutionary leap in edge computing capabilities. Imagine compressing edge real estate without compromising processing power – that’s where the neuromorphic chip can be the key element for the intelligent edge.

In the not-so-distant future, the fusion of IoT, AI/ML, and neuromorphic chips with edge could signal a paradigm shift, consequently forming a comprehensive distributed cloud model or “IT in a Box.”

The 3E Design Principle of “IT in a Box”

The 3E design principle underpinning “IT in a Box” is based on three core principles that form the foundation for its design and implementation: balancing efficiency, economy, and empowerment. This creates a powerful and adaptive edge computing model that effortlessly weaves together the threads of sustainability, scalability, accuracy, and security.

Let’s look at each of these principles in more detail.

  • Efficiency – This principle of “IT in a Box” takes center stage, redefining processing, storage, and information delivery at the edge. Imagine a symphony of sensors, intricately integrated in the edge environment, tirelessly collecting and sending data. These sensors gather information that is sophisticatedly analyzed right at the edge location. The result? Swift, precise, and accurate insights without the need for a laborious journey to centralized cloud hubs
  • Economy – This principle emphasizes the importance of cost-effectiveness and sustainability working together. At the heart of this lies the strategic integration of advanced technologies with neuromorphic chips and efficient platforms. “IT in a Box” aims to create a world where the edge requires less physical footprint, reducing real estate requirements. This cost-efficient proposition also aligns with the broader goal of sustainable expansion. It’s about making high-performance computing accessible not just to giants, but to a broader spectrum of industries and applications
  • Empowerment – This principle promises intelligent autonomy and tailor-made solutions. It is not only about processing, storing, and delivering data but also about empowering edge locations with accelerated decision-making abilities that reflect the unique needs of diverse businesses. Hence, this principle uncovers a vast landscape of industry-specific and micro-vertical use cases from healthcare and manufacturing to retail and finance. Picture a smart factory where edge devices autonomously optimize production processes based on real-time data analysis, or consider a healthcare system where patient monitoring happens seamlessly at the edge. “IT in a Box” becomes a strategic partner, enabling businesses to swiftly respond to changing scenarios

Benefits of the “IT in a Box” Edge Model

The benefits of “IT in a Box” are wide-ranging, contributing significantly to the operational efficiency, strategic value, and overall success of enterprises. Among the advantages are:

  • It not only ushers in a new era of accessibility but also facilitates the rapid and cost-effective deployment of smaller edge locations, transcending the boundaries of metropolises and extending to tier X cities
  • The power of “IT in a Box” lies in its ability to process and store vast volumes of data at the edge. This equates to unprecedented speeds in delivering crucial information and, more importantly, provides a welcome relief for central cloud data centers burdened by heavy loads
  • The deployment of highly autonomous edge devices is a reality for “IT in a Box.” Devices are equipped with the capability for large-scale analysis, intelligent decision-making, and real-time reporting – all taking place immediately at the edge
  • With “IT in a Box,” most of the data no longer needs to travel to centralized infrastructure, boosting privacy and security as it stays close to the source
  • “IT in a Box” isn’t just about efficacy but also sustainability. It paves the way for a greener tech future with mindful energy use and low carbon emissions

The future of “IT in a Box” revolutionizing industries

In the not-so-distant future, “IT in a Box” holds immense potential for micro-vertical applications that can revolutionize various industries, such as:

  • Autonomous vehicles – Imagine a driverless car enabled by the above elements. It would process data in proximity, resulting in improved sensor fusion, adaptability, and learning, making driverless cars more efficient, safe, and responsive
  • Virtual healthcare – These benefits facilitate effective remote monitoring of vital signs and health parameters with immediate analysis of data, resulting in quick health anomaly diagnosis
  • Smart cities – Video feeds from surveillance cameras can be processed locally, identifying potential security threats in real time and promptly alerting concerned local authorities
These micro-vertical use cases cut across the 3E design principles of “IT in a Box.” As the convergence of various technologies matures, the potential for innovation and micro-vertical use cases across industries becomes vast. Indeed, the future holds the potential for sensors with embedded neuromorphic chips that can process and analyze information on-the-spot, rather than near the source.
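
To make the "process locally, alert centrally" idea in the smart-cities example concrete, here is a minimal illustrative sketch (my own, not from the Everest post): an edge node runs a stub detector over camera frames and forwards only compact alert metadata upstream, so raw video never leaves the site. The detector, camera reader and alert endpoint are all hypothetical placeholders.

```python
# Minimal illustrative sketch of edge-local video analytics - not from the
# Everest post; every name here is a hypothetical placeholder.
import json
import time
import urllib.request

ALERT_URL = "https://city-ops.example/alerts"  # hypothetical alert endpoint
THREAT_THRESHOLD = 0.9                         # assumed confidence cut-off

def read_frame(camera_id: str) -> bytes:
    """Placeholder: grab one frame from a local camera feed."""
    return b""  # stand-in for real image bytes

def run_local_detector(frame: bytes):
    """Placeholder: on-device inference (e.g. behind a vendor SDK call).
    Returns (label, confidence)."""
    return "possible_intrusion", 0.42  # dummy output

def send_alert(payload: dict) -> None:
    """Ship only compact alert metadata; the raw frame stays on the edge node."""
    req = urllib.request.Request(
        ALERT_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=2)

def edge_loop(camera_id: str) -> None:
    while True:
        label, confidence = run_local_detector(read_frame(camera_id))
        if confidence >= THREAT_THRESHOLD:
            send_alert({"camera": camera_id, "label": label,
                        "confidence": confidence, "ts": time.time()})
        time.sleep(0.1)  # pace the loop; a real node would be frame-driven
```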

Please feel free to reach out to Raya.Mukherjee@everestgrp.com or Titus.M@everestgrp.com to share any questions and your thoughts about the potential of this evolution in edge computing.
 
  • Like
  • Fire
  • Love
Reactions: 32 users

Easytiger

Regular
IMG_3408.jpeg
 
  • Like
  • Love
  • Fire
Reactions: 40 users

charles2

Regular
Not mentioning BRN but good blog post by Everest Group talking up neuromorphic.

And guess who appears to have the 'secret sauce' for things neuromorphic?

Enter the game-changer in next-generation computation: neuromorphic chips. These chips process information in a human brain-like manner, offering a revolutionary leap in edge computing capabilities. Imagine compressing edge real estate without compromising processing power – that’s where the neuromorphic chip can be the key element for the intelligent edge.

In the not-so-distant future, the fusion of IoT, AI/ML, and neuromorphic chips with edge could signal a paradigm shift, consequently forming a comprehensive distributed cloud model or “IT in a Box.”

(I was tempted to substitute 'would' for 'could'.)
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 37 users
Up nearly 50 million shorts since December 1st, and at this rate it won't be long before we are back above 100 million. What happened to less short interest when you go back into the ASX 300?


1703015430577.gif
 
  • Like
  • Sad
  • Thinking
Reactions: 8 users

IloveLamp

Top 20
Not sure of the date on this article, but it suggests DNN both embedded and in the cloud?


CERENCE INTRODUCES NEW FEATURES IN CERENCE DRIVE, THE WORLD'S LEADING TECHNOLOGY AND SOLUTIONS PORTFOLIO FOR AUTOMAKERS AND CONNECTED CARS

New capabilities such as enhanced voice recognition and synthetic speech serve as the foundation for a safer, more enjoyable journey for everyone
BURLINGTON, Mass. – Cerence Inc., AI for a world in motion, today introduced new innovations in Cerence Drive, its technology and solutions portfolio for automakers and IoT providers to build high-quality, intelligent voice assistant experiences and speech-enabled applications. Cerence Drive today powers AI-based, voice-enabled assistants in approximately 300 million cars from nearly every major automaker in the world, including Audi, BMW, Daimler, Ford, Geely, GM, SAIC, Toyota, and many more.
The Cerence Drive portfolio offers a distinct, hybrid approach with both on-board and cloud-based technologies that include voice recognition, natural language understanding (NLU), text-to-speech (TTS), speech signal enhancement (SSE), and more. These technologies can be deployed and tightly integrated with the wide variety of systems, sensors and interfaces found in today’s connected cars. The latest version of Cerence Drive includes a variety of new features to elevate the in-car experience:
> Enhanced, active voice recognition and assistant activation that goes beyond the standard push-to-talk buttons and wake-up words. The voice assistant is always listening for a relevant utterance, question or command, much like a personal assistant would, creating a more natural experience. In addition, Cerence’s voice recognition can run throughout the car, both embedded and in the cloud, distributing the technical load and delivering a faster user experience for drivers.
> New, deep neural net (DNN)-based NLU engine built on one central technology stack with 23 languages available both embedded and in the cloud. This streamlined approach creates new standards for scalability and flexibility between embedded and cloud applications and domains for simpler integration, faster innovation, and a more seamless in-car experience, regardless of connectivity.
> TTS and synthetic voice advancements that deliver new customizations, including a non-gender-specific voice for the voice assistant, and emotional output, which enables automakers to adjust an assistant’s speaking style based on the information delivered or tailored to a specific situation. In addition, the introduction of deep learning delivers a more natural and human-like voice with an affordable computational footprint.
> Improved, more intelligent speech signal enhancement that includes multi-zone processing with quick and simple speaker identification; passenger interference cancelation that blocks out background noise as well as voices from others in the car; and a deep neural net-based approach for greater noise suppression and better communication.
“Improving the experience for drivers and creating curated technology that feels unique and harmonious with our partners’ brands have been true motivators since we started our new journey as Cerence, and that extends to our latest innovations in Cerence Drive,” said Sanjay Dhawan, CEO, Cerence. “Cerence Drive, our flagship offering, is the driving force behind our promise of a truly moving in-car experience for our customers and their drivers, and our new innovations announced today are core to making that mission a reality. ”
Cerence Drive’s newest features are available now for automakers worldwide. To learn more about Cerence Drive, visit www.cerence.com/solutions.
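
For anyone wondering what the "hybrid" embedded-plus-cloud arrangement described above actually does at runtime, here is a minimal, purely illustrative sketch. It is not Cerence's API; all names, thresholds and return values are invented. The idea is simply that the on-board engine answers first (so the car still responds offline) and the cloud result is preferred only when connectivity exists and the embedded confidence is low.

```python
# Purely illustrative sketch of an embedded/cloud NLU arbitration pattern.
# This is NOT Cerence's API; all names and values are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class NluResult:
    intent: str
    confidence: float

EMBEDDED_CONFIDENCE_OK = 0.8  # assumed tuning threshold

def embedded_nlu(utterance: str) -> NluResult:
    """Stand-in for the on-board (in-vehicle) NLU engine."""
    return NluResult(intent="navigate_home", confidence=0.65)  # dummy output

def cloud_nlu(utterance: str) -> Optional[NluResult]:
    """Stand-in for the cloud NLU service; returns None if unreachable."""
    return NluResult(intent="navigate_to_charger", confidence=0.93)  # dummy output

def interpret(utterance: str, connected: bool) -> NluResult:
    """Embedded result first, cloud only as a higher-confidence fallback."""
    local = embedded_nlu(utterance)
    if local.confidence >= EMBEDDED_CONFIDENCE_OK or not connected:
        return local
    remote = cloud_nlu(utterance)
    return remote if remote and remote.confidence > local.confidence else local

if __name__ == "__main__":
    print(interpret("take me to the nearest charging station", connected=True))
```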

Also a 2022 pdf spiel on their overall solutions package.

HERE

Guess we have to remember we also have a patent granted on 2018 on neuromorphic application via PVDM.


US-10157629-B2 - Low Power Neuromorphic Voice Activation System and Method


Abstract
The present invention provides a system and method for controlling a device by recognizing voice commands through a spiking neural network. The system comprises a spiking neural adaptive processor receiving an input stream that is being forwarded from a microphone, a decimation filter and then an artificial cochlea. The spiking neural adaptive processor further comprises a first spiking neural network and a second spiking neural network. The first spiking neural network checks for voice activities in output spikes received from artificial cochlea. If any voice activity is detected, it activates the second spiking neural network and passes the output spike of the artificial cochlea to the second spiking neural network that is further configured to recognize spike patterns indicative of specific voice commands. If the first spiking neural network does not detect any voice activity, it halts the second spiking neural network.
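
In plain terms, the abstract describes a two-stage gate: a cheap, always-on spiking network watches the cochlea's spike stream for voice activity and only wakes the heavier command-recognition network when speech is present. A rough control-flow sketch of that gating is below; the filter, cochlea and both spiking networks are stubbed out, and every name is mine, not the patent's.

```python
# Rough control-flow sketch of the two-stage gating described in the abstract.
# The signal chain and both spiking networks are crude stubs; names are
# illustrative only, not from the patent.

def decimate(audio_chunk):
    """Placeholder decimation filter: downsample the raw microphone stream."""
    return audio_chunk[::4]

def artificial_cochlea(samples):
    """Placeholder: convert samples into a spike train (0/1 events)."""
    return [1 if s > 0 else 0 for s in samples]

def vad_snn(spikes):
    """First spiking network: cheap, always-on voice-activity check."""
    return sum(spikes) > len(spikes) // 4  # dummy activity criterion

def command_snn(spikes):
    """Second spiking network: only runs when speech is present."""
    return "turn_on_lights"  # dummy recognised command

def process(audio_chunk):
    spikes = artificial_cochlea(decimate(audio_chunk))
    if not vad_snn(spikes):
        return None              # second network stays halted -> low standby power
    return command_snn(spikes)   # heavier network woken only on voice activity

if __name__ == "__main__":
    print(process([0.2, -0.1, 0.4, 0.0] * 50))
```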

Screenshot_20231220_070642_LinkedIn.jpg
 
  • Like
  • Thinking
  • Fire
Reactions: 7 users
I am a bit shifty on the whole "yeah 2024 baby" talk we see.... as I am sure we would see the same comments about 2023 (76% down as we speak) or 2022... I get it we have all these announcements etc and don't attack the company etc etc yada yada.

The reality, or the facts if you will, is that it's all talk up to this point...

I want to see some movement in 2024, as does everyone. But we have now been here, hoping, praying, swearing, dot joining (whatever your game is), and nothing ever eventuated. Oh wait, Akida 2.0 will be amazing... then nothing (I know you will tell me it needs to be tested, but I am sure BRN management told us it was already in the hands of the big boys, only to then say it's not and people are assessing it)...

Anyways. Here's to 2024....

/FL
 
  • Like
  • Fire
  • Love
Reactions: 37 users

IloveLamp

Top 20
  • Like
  • Fire
  • Love
Reactions: 11 users

Kachoo

Regular
I am a bit shifty on the whole "yeah 2024 baby" talk we see.... as I am sure we would see the same comments about 2023 (76% down as we speak) or 2022... I get it we have all these announcements etc and don't attack the company etc etc yada yada.

The reality, or the facts if you will, is that it's all talk up to this point...

I want to see some movement in 2024, as does everyone. But we have now been here, hoping, praying, swearing, dot joining (whatever your game is), and nothing ever eventuated. Oh wait, Akida 2.0 will be amazing... then nothing (I know you will tell me it needs to be tested, but I am sure BRN management told us it was already in the hands of the big boys, only to then say it's not and people are assessing it)...

Anyways. Here's to 2024....

/FL
I want to see 2024 succeed. I thought 2023 was the boom year; boy, was I wrong.

In the end we do need to see revenue dollars start to trickle in and then flow strongly. That will be the bottom line.

These products coming to market won't be huge earners, but they will be earners, and that should help snowball the revenue. If these companies see the ease of integration and the benefits, then others will follow, and we want to see the partners integrate more products in their lines with Akida.

As for the bigger companies, Valeo, Renesas and MegaChips, well, we will see how they produce the dollars.

In the end it is the financials that need to be the indicator of forward movement. Once we see some money, I think more value will be put on the partnerships.

The issue is that the poor revenue they have made this year has killed confidence. Poor comms with holders. Basically the classic "ignore the hard questions".

Slowly the ship is turning. I do see the light, and I'm not going to dwell much on the past.
 
  • Like
  • Fire
  • Love
Reactions: 26 users

Onboard21

Member
Hi All
Some will recall that I have previously posted a list of companies and institutions which have been confirmed as engaged with BrainChip either by press release, academic paper, ASX announcement or direct communication with the Director of Investor Relations or the CEO, Sean Hehir. Today I updated my list and was pleased to see that it has now reached 50 in number, so I thought it was time to post it again here:

1. FORD

2. VALEO

3. RENESAS

4. NASA

5. TATA Consulting Services

6. MEGACHIPS

7. MOSCHIP

8. SOCIONEXT

9. PROPHESEE

10. VVDN

11. TEKSUN

12. Ai LABS

13. NVISO

14. EMOTION 3D

15. ARM

16. EDGE IMPULSE

17. INTEL

18. GLOBAL FOUNDRIES

19. BLUE RIDGE ENVISIONEERING

20. MERCEDES BENZ

21. ANT 61

22. QUANTUM VENTURA

23. INFORMATION SYSTEM LABORATORIES

24. INTELLISENSE SYSTEMS

25. CVEDIA

26. LORSER INDUSTRIES

27. SiFIVE

28. IPRO SILICONE

29. SALESLINK

30. NUMEN

31. VORAGO

32. NANOSE

33. BIOTOME

34. OCULI

35. CIRCLE8 CLEAN TECHNOLOGIES

36. AVID GROUP

37. TATA ELXSI

38. GMAC INTELLIGENCE

39. EDGX

40. EUROPEAN SPACE AGENCY

41. UNIGEN

42. iniVation

43. SAHOMA CONTROLWARE

44. MAGIK EYE

45. University of Virginia

46. University of Oklahoma

47. Arizona State University

48. Carnegie Mellon University

49. Rochester Institute of Technology

50. Drexel University

My opinion only DYOR
Fact Finder
Hi,
What happened to Ipsolon Research?
 
  • Like
Reactions: 2 users

buena suerte :-)

BOB Bank of Brainchip
Last edited:
  • Like
  • Fire
  • Love
Reactions: 21 users
Hi JD,

The possible terms of an agreement are practically limitless. It is how the parties agree to proceed.

Most of the engagements have been termed "partnerships" to develop a product incorporating Akida for the customer's purposes.

Generally the profits of a partnership are distributed on the basis of the partners' contributions.

However, it is possible that the partnership's objective is to design a product, and then for the enterprise to morph into a licence arrangement. BRN's contribution to the design process would include the Akida IP, engineering support, and know-how. Once the design is complete, BRN would not need to be involved in the commercial production and marketing of the product.

The letter of intent would have been designed to cover the initial stages of negotiations. Certainly no IP would have been provided on the basis of a non-binding letter of intent.

Business negotiations are usually commercial-in-confidence whether the parties are private or public.
Hi Diogenese

I will only add that under the Continuous Disclosure Rules one of the exemptions to giving disclosure to the market via the ASX is where it involves “an incomplete proposal or negotiation”.

In the scenario you proffer, an agreement to attempt to develop a product containing AKIDA could easily satisfy this test, at least right up to the point where it is developed and a decision is made to manufacture it.

My opinion only DYOR
FACT FINDER
 
  • Like
  • Fire
Reactions: 25 users

IloveLamp

Top 20


These are general-purpose MCU devices and address diverse high-performance and compute-intensive applications in Industrial Automation, Home Appliances, Smart Home, Consumer, Building/Home Automation, and Medical/Healthcare market segments.
Screenshot_20231220_081805_LinkedIn.jpg
 
  • Like
  • Fire
  • Love
Reactions: 11 users

Esq.111

Fascinatingly Intuitive.
Morning Chippers ,

Would be a good source of funding for us.


Regards,
Esq
 
  • Like
  • Fire
  • Love
Reactions: 14 users


These are general-purpose MCU devices and address diverse high-performance and compute-intensive applications in Industrial Automation, Home Appliances, Smart Home, Consumer, Building/Home Automation, and Medical/Healthcare market segments.
View attachment 52365
Posts like the above make me wonder.

Why does Renesas use something else on an ARM chip for something we supposedly can do, with "one-shot learning", power efficiency and everything else we are the best at? They had to use 30 million images to train their models (if I remember the number correctly).

Aren't we partners with both these companies? So....

- Are we too expensive?
- Too hard to work with?
- Not where we claim to be in the real world?
- Is this the proof in a roundabout way why nothing has kicked off in full flight yet?

Lots of questions... happy to have someone explain this to me without the usual belittling or being called a downramper or whatever else the flavour of the week is.
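
On the "one-shot learning" point, for anyone unsure what that claim usually refers to in practice, here is a generic, purely illustrative prototype-classifier sketch (nothing to do with Akida's or Renesas's actual implementations): a new class is "learned" by storing the embedding of a single example against a frozen feature extractor, rather than by retraining on millions of images.

```python
# Generic sketch of prototype-based one-shot classification - purely
# illustrative; not Akida's or Renesas's implementation.
import numpy as np

class PrototypeClassifier:
    """Learn a new class from a single example by storing its embedding."""

    def __init__(self, embed_fn):
        self.embed_fn = embed_fn   # frozen feature extractor (stubbed below)
        self.prototypes = {}       # label -> embedding vector

    def learn_one_shot(self, label, example):
        """'Training' is a single store: no gradient descent, no big dataset."""
        self.prototypes[label] = self.embed_fn(example)

    def predict(self, example):
        query = self.embed_fn(example)

        def cosine(a, b):
            return float(np.dot(a, b) /
                         (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

        # Nearest stored prototype wins.
        return max(self.prototypes,
                   key=lambda lbl: cosine(query, self.prototypes[lbl]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    embed = lambda x: np.asarray(x, dtype=float)  # stand-in for a real extractor
    clf = PrototypeClassifier(embed)
    clf.learn_one_shot("cat", rng.normal(size=8))
    clf.learn_one_shot("dog", rng.normal(size=8))
    print(clf.predict(rng.normal(size=8)))
```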
 
  • Like
  • Fire
  • Love
Reactions: 27 users
Wording below is from BRN when advertising CES. Does this bit mean we will have a number of things unveiled to the public over the next few weeks, or just to those who can get an appointment at CES24? I read it as the former. One can only hope.

From strengthening security through anti-spoofing, on-chip learning, to industrial control through anomaly detection, and numerous others – which we will unveil over the coming weeks.

SC
 
  • Like
  • Fire
  • Love
Reactions: 12 users