BRN Discussion Ongoing

Diogenese

Top 20
💥 [BREAKING] Today, we proudly unveil the GenX320 Metavision® sensor - the smallest 🔍 and most power-efficient event-based vision sensor in the world!
👉 https://bit.ly/3QgQoRY

Built for the next generation of smart consumer devices, GenX320 delivers new levels of intelligence, autonomy, and privacy to a vast range of fast-growing Edge market segments, including AR/VR headsets, wearables, smart camera and monitoring systems, IoT devices, and many more.

The sensor has been developed with a specific focus on the unique energy, compute, and size requirements of Edge AI vision systems. It enables robust, high-speed vision at ultra-low power, even in challenging operating and lighting conditions.

GenX320 key benefits include:
✅ Ultra-fast event timestamping (1 µsec) with flexible data formatting
✅ Smart power management for ultra-low power consumption and wake-on-events (as low as 36µW)
✅ Seamless integration with standard SoCs, reducing external processing
✅ Low-latency connectivity through MIPI or CPI data interfaces
✅ AI-ready with on-chip histogram output for AI accelerators
✅ Sensor-level privacy due to inherently sparse event data and static scene removal
✅ Compatibility with Prophesee Metavision Intelligence software suite
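
For readers who want to see what the event stream from a sensor like this looks like in code, here is a minimal sketch using the Python bindings of the Metavision Intelligence suite mentioned above. The metavision_core module layout and the use of an empty input path for a live camera follow published SDK samples, but details may differ between SDK versions, so treat this as an outline rather than a recipe.

```python
# Minimal sketch: iterating over event slices from a Metavision-compatible
# sensor or recording. Assumes the metavision_core package from the
# Metavision Intelligence suite; module names may vary across SDK releases.
from metavision_core.event_io import EventsIterator

# An empty input_path opens the first available live camera in SDK samples;
# a path to a .raw recording works the same way.
mv_iterator = EventsIterator(input_path="", delta_t=10_000)  # 10 ms slices

for events in mv_iterator:
    # Each slice is a structured NumPy array with fields x, y, p (polarity)
    # and t (timestamp at 1 µs resolution), one row per event.
    if events.size == 0:
        continue  # sparse output: a static scene produces no events at all
    print(f"{events.size} events, t = {events['t'][0]}..{events['t'][-1]} µs")
```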

🚀 Learn more about how the GenX320 successfully overcomes current vision sensing limitations in edge applications 👉 https://bit.ly/3QgQoRY


Event-Based Metavision® Sensor GENX320 | PROPHESEE


The link leads to Prophesee's early adopters:

Zinn Labs,
ultraleap,
Xperi.

Zinn patent application for eye tracking glasses:


WO2023081297A1 EYE TRACKING SYSTEM FOR DETERMINING USER ACTIVITY


Embodiments relate to an eye tracking system. A headset of the system includes an eye tracking sensor that captures eye tracking data indicating positions and movements of a user's eye. A controller (e.g., in the headset) of the tracking system analyzes eye tracking data from the sensors to determine eye tracking feature values of the eye during a time period. The controller determines an activity of the user during the time period based on the eye tracking feature values. The controller updates an activity history of the user with the determined activity.

A method comprising: analyzing eye tracking data to determine eye tracking feature values of an eye of a user of a headset during a time period, wherein the eye tracking data is determined from an eye tracking system on the headset; determining an activity of the user during the time period based on the determined eye tracking feature values; and updating an activity history of the user with the determined activity, wherein the feature values include movements of the eye, and determining the activity comprises identifying movements of the eye that correspond to the activity.

In some embodiments, a machine learned model of the activity module 310 is a recurrent neural network (e.g., using a long short-term memory neural network or gated recurrent units) that considers the time-based component of the eye tracking feature values.
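
As an illustration of the kind of model the application describes (not code from the patent itself), below is a small PyTorch sketch of a recurrent classifier that maps a window of eye-tracking feature values to an activity label. The feature set, window length, and activity classes are hypothetical.

```python
# Sketch of a recurrent activity classifier over eye-tracking feature values
# (e.g. gaze angle, saccade velocity, blink rate). Shapes and class counts
# are made up for illustration; they are not taken from the patent.
import torch
import torch.nn as nn

class ActivityClassifier(nn.Module):
    def __init__(self, n_features=6, hidden=64, n_activities=5):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_activities)  # e.g. reading, conversing, ...

    def forward(self, x):            # x: (batch, time_steps, n_features)
        _, (h_n, _) = self.lstm(x)   # final hidden state summarises the window
        return self.head(h_n[-1])    # logits over activity classes

# One 5-second window sampled at 20 Hz: 100 time steps of 6 features.
window = torch.randn(1, 100, 6)
logits = ActivityClassifier()(window)
activity = logits.argmax(dim=-1)     # predicted activity label for the window
```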
 
  • Like
  • Fire
  • Love
Reactions: 50 users

MrRomper

Regular
  • Like
  • Fire
  • Thinking
Reactions: 38 users

charles2

Regular
  • Like
  • Thinking
Reactions: 7 users

IloveLamp

Top 20
🤔


 
  • Like
  • Fire
Reactions: 28 users

Tuliptrader

Regular
  • Like
  • Haha
  • Love
Reactions: 31 users

MrNick

Regular
 
  • Fire
  • Like
  • Love
Reactions: 3 users
The big seller walls are back up to hold share price down and in place for as long as possible…resume manipulation programming.
 
  • Like
  • Sad
Reactions: 10 users

Vladsblood

Regular
The big seller walls are back up to hold share price down and in place for as long as possible…resume manipulation programming.
Gotcha Fastback. Outright morally criminal systemic activity, sanctified by the ever-complying ASX throughout BrainChip's advances.
Especially since around the MB announcement. Vlad.
 
  • Like
Reactions: 8 users
Gotcha Fastback. Outright morally criminal systemic activity, sanctified by the ever-complying ASX throughout BrainChip's advances.
Especially since around the MB announcement. Vlad.
Yes, it sure is criminal by the big insto players @Vladsblood

Instos know they are going to make a mint out of BrainChip in the medium to long term at these buy prices… they will hold here as long as they can.
 
  • Like
Reactions: 6 users

7für7

Regular
What about the partnership between Qualcomm and Prophesee? It's not only BrainChip working with Prophesee. I'm still waiting for a statement from BrainChip's side! DYOR
 
  • Like
  • Haha
Reactions: 6 users
Wow! What a flurry of activity!
Could it be a result of the second-generation release? No, wait, that would take time to sync with the development cycles of these companies; it must be built on those "useless" AKIDA 1000 and 1500 chip designs :ROFLMAO:.

UNLESS

the ecosystem partners have managed to really shorten the implementation cycle, like they have been suggesting. 🚀
Either way, great news for the 'latched-on barnacle' (LOB) BRN holders.
Exciting next two quarters in this LOB's opinion.
 
Last edited:
  • Like
  • Haha
Reactions: 13 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Modern neuromorphic processor architectures...PLURAL...Hmmm?????



This Tiny Sensor Could Be in Your Next Headset​

Prophesee
[Image: Prophesee Event-Based Metavision GenX320 bare die]

Neuromorphic computing company develops event-based vision sensor for edge AI apps.
Spencer Chin | Oct 16, 2023


As edge-based artificial intelligence (AI) applications become more common, there will be a greater need for sensors that can meet the power and environmental needs of edge hardware. Prophesee SA, which supplies advanced neuromorphic vision systems, has introduced an event-based vision sensor for integration into ultra-low-power edge AI vision devices. The GenX320 Metavision sensor, which uses a tiny 3x4mm die, leverages the company’s technology platform into growing intelligent edge market segments, including AR/VR headsets, security and monitoring/detection systems, touchless displays, eye tracking features, and always-on intelligent IoT devices.

According to Luca Verre, CEO and co-founder of Prophesee, the concept of event-based vision has been researched for years, but developing a viable commercial implementation in a sensor-like device has only happened relatively recently. “Prophesee has used a combination of expertise and innovative developments around neuromorphic computing, VLSI design, AI algorithm development, and CMOS image sensing,” said Verre in an e-mail interview with Design News. “Together, those skills and advancements, along with critical partnerships with companies like Sony, Intel, Bosch, Xiaomi, Qualcomm, 🤔and others 🤔have enabled us to optimize a design for the performance, power, size, and cost requirements of various markets.”

Prophesee’s vision sensor is a 320x320, 6.3μm pixel BSI stacked event-based vision sensor that offers a tiny 1/5-in. optical format. Verre said, “The explicit goal was to improve integrability and usability in embedded at-the-edge vision systems, which in addition to size and power improvements, means the design must address the challenge of event-based vision’s unconventional data format, nonconstant data rates, and non-standard interfaces to make it more usable for a wider range of applications. We have done that with multiple integrated event data pre-processing, filtering, and formatting functions to minimize external processing overhead.”
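
To make the idea of event pre-processing concrete, here is a simple host-side stand-in for one common filtering step: a refractory filter that passes at most one event per pixel within a short time window, suppressing bursty or redundant events. On the GenX320 this kind of filtering is done on-chip; the NumPy version and its window length below are purely illustrative.

```python
import numpy as np

def refractory_filter(events, width=320, height=320, refractory_us=1_000):
    """Keep at most one event per pixel per refractory window (illustrative only)."""
    last_t = np.full((height, width), -np.inf)   # last accepted timestamp per pixel
    keep = np.zeros(events.size, dtype=bool)
    for i, (x, y, t) in enumerate(zip(events['x'], events['y'], events['t'])):
        if t - last_t[y, x] > refractory_us:
            keep[i] = True                        # first event in this pixel's window
            last_t[y, x] = t
    return events[keep]                           # same structured array, bursts removed
```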

Verre added, “In addition, MIPI or CPI data output interfaces offer low-latency connectivity to embedded processing platforms, including low-power microcontrollers and modern neuromorphic processor architectures.”

Low-Power Operation

According to Verre, the GenX320 sensor has been optimized for low-power operation, featuring a hierarchy of power modes and application-specific modes of operation. On-chip power management further improves sensor flexibility and integrability. To meet aggressive size and cost requirements, the chip is fabricated using a CMOS stacked process with pixel-level Cu-Cu bonding interconnects achieving a 6.3μm pixel-pitch.
The sensor performs low-latency, µs-resolution timestamping of events with flexible data formatting. On-chip intelligent power management modes reduce power consumption to as low as 36 µW and enable smart wake-on-events. Deep sleep and standby modes are also featured.
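
Purely to illustrate the wake-on-events idea at the application level (the real mechanism runs inside the sensor's power-management block), a host could keep its heavy vision pipeline idle until the event rate crosses a threshold. The threshold and slice size below are invented for the example.

```python
WAKE_THRESHOLD = 5_000  # events per 10 ms slice; invented figure for illustration

def run_when_active(event_slices, pipeline):
    """Skip processing while the scene is static; wake the pipeline on activity."""
    for events in event_slices:        # e.g. slices from an EventsIterator
        if events.size < WAKE_THRESHOLD:
            continue                   # static scene: stay on the low-power path
        pipeline(events)               # enough activity: run the full vision stack
```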

According to Prophesee, the sensor is designed to be easily integrated with standard SoCs with multiple combined event data pre-processing, filtering, and formatting functions to minimize external processing overhead. MIPI or CPI data output interfaces offer low-latency connectivity to embedded processing platforms, including low-power microcontrollers and modern neuromorphic processor architectures.

Prophesee’s Verre expects the sensor to find applications in AR/VR headsets. “We are solving an important issue in our ability to efficiently (i.e. low power/low heat) support foveated rendering in eye tracking for a more realistic, immersive experience. Meta has discussed publicly the use of event-based vision technology, and we are actively involved with our partner Zinn Labs in this area. XPERI has already developed a driver monitor system (DMS) proof of concept based on our previous generation sensor for gaze monitoring and we are working with them on a next-gen solution using GenX320 for both automotive and other potential uses, including micro expression monitoring. The market for gesture and motion detection is very large, and our partner Ultraleap has demonstrated a working prototype of a touch-free display using our solution.”

The sensor incorporates an on-chip histogram output compatible with multiple AI accelerators. The sensor is also natively compatible with Prophesee Metavision Intelligence, an open-source event-based vision software suite that is used by a community of over 10,000 users.
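
For intuition about what the on-chip histogram output represents, the sketch below accumulates a slice of events into a dense 320x320, two-channel frame (one channel per polarity) that a conventional CNN accelerator can consume like an image. On the GenX320 this accumulation happens in hardware; the NumPy version is only a conceptual model.

```python
import numpy as np

def events_to_histogram(events, width=320, height=320):
    """Accumulate an event slice (fields x, y, p) into per-polarity pixel counts."""
    hist = np.zeros((2, height, width), dtype=np.uint16)
    np.add.at(hist, (events['p'], events['y'], events['x']), 1)
    return hist  # shape (2, 320, 320): a dense, CNN-ready representation
```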

Prophesee will support the GenX320 with a complete range of development tools for easy exploration and optimization, including a comprehensive Evaluation Kit housing a chip-on-board (COB) GenX320 module, or a compact optical flex module. In addition, Prophesee will offer a range of adapter kits that enable seamless connectivity to a large range of embedded platforms, such as an STM32 MCU, speeding time-to-market.

Spencer Chin is a Senior Editor for Design News covering the electronics beat. He has many years of experience covering developments in components, semiconductors, subsystems, power, and other facets of electronics from both a business/supply-chain and technology perspective. He can be reached at Spencer.Chin@informa.com.

 
  • Like
  • Love
  • Fire
Reactions: 36 users

Murphy

Life is not a dress rehearsal!
Berlin, the Disruptive's Substack article is one of the best overviews of the dilemma faced by cloud servers globally. It covers what BRN solves and why, provides a vocabulary or glossary of terms a lay person needs to understand the story, explains where server farms/data centres are headed and what the edge is, compares Arm and BRN, argues why BRN could possibly be as big as Arm, describes what differentiates BRN and gives a technical description of what Akida represents, explains why now is the time for BRN to begin making inroads into the AI scene, and more.

It is compelling and definitely a great article for your parents to read. I will say that it is probably the best overview of BRN that I have seen anywhere outside of this forum. So if you haven't had a look at it, do yourself a favour. And thanks @Berlinforever
What a great read for the layman or computer genius.
Every holder of BRN should read this.


If you don't have dreams, you can't have dreams come true!
 
  • Like
  • Fire
  • Love
Reactions: 18 users
Has anyone seen this:


BrainChip's leadership shares details on the company's development of the Akida chip, future prospects, and more in this informative podcast.​

Artificial Intelligence
- Edge AI
- Edge Processors
In a recent episode of the BrainChip podcast, a select group of the company's top executives gathered to discuss the future trajectory of artificial intelligence (AI) and the pivotal role BrainChip is set to play in this dynamic landscape. We highlight some of the key discussion points below but encourage you to listen to the whole episode here.

Leadership with Vision​

Rob Telson, serving as the Vice President of Sales and Marketing, is the driving force behind BrainChip's global outreach. With a keen understanding of market dynamics and a vision for BrainChip's expansive global presence, Telson has been instrumental in positioning the company as a leader in the AI industry.
Peter van der Made, the CEO and Co-founder of BrainChip, brings to the table a wealth of experience and a visionary approach. His emphasis on the importance of the company's Advanced Research Center in Perth showcases his commitment to pioneering the next wave of AI innovations. Van der Made's insights into the potential of Akida, BrainChip's flagship technology, highlight his forward-thinking approach to AI's future.
Anil Mankar, the Vice President of Product Development and also a Co-founder, offers a deep dive into the technical intricacies of BrainChip's operations. His insights into the production process, from chip manufacturing to rigorous testing, provide a glimpse into the meticulous steps BrainChip takes to ensure top-tier functionality and performance.
Lastly, Ken Scarince, the Chief Financial Officer of BrainChip, sheds light on the company's financial endeavours. His discussions on BrainChip's strategic financial initiatives, including capital-raising agreements and efforts to bolster its presence in the U.S. capital markets, underscore the company's ambitions for growth and market dominance.

BrainChip's Evolution: A Glimpse into the Company's Global Strategy

To kick off the episode, the team outlines the company’s history, vision and future plans. BrainChip's commitment to being a global leader in the AI industry is evident in its expansive operational presence. With hubs in California, Perth, France, and India, the company has strategically positioned itself in key tech-centric locations. This global footprint not only facilitates diverse collaborations but also ensures that BrainChip remains at the pulse of AI advancements worldwide.

From Research to Production: A Strategic Pivot​

Historically, BrainChip has been synonymous with cutting-edge research in neuromorphic computing. Their dedication to pushing the boundaries of AI has positioned them as pioneers in the field. However, recognizing the vast commercial potential of their innovations and the industry's shifting dynamics, BrainChip is undergoing a transformation.
Telson emphasised this transition in the podcast, noting the company's pivot from being primarily research-driven to adopting a production-centric approach. For engineers, this shift signifies BrainChip's intent to translate their groundbreaking research into tangible, market-ready solutions.

Engaging the Tech Community​

Understanding the importance of effective communication in the tech world, especially among engineers and developers, Telson highlighted BrainChip's efforts to foster engagement. The company's podcast series, for instance, is more than just a marketing tool. It's a platform for knowledge sharing, offering insights into BrainChip's offerings, their vision for AI's future, and the technical intricacies that make their solutions stand out. Stay up to date with BrainChip activities by following their Wevolver profile.

The Advanced Research Center: A Beacon of Innovation​

Located in Perth, Australia, the Advanced Research Center is BrainChip's crown jewel. It's not just a research facility; it's a testament to the company's dedication to pushing the boundaries of what's possible in AI. While many in the industry focus on refining existing deep learning models, BrainChip's centre is already looking beyond, exploring the next frontier of AI innovations.

Akida: The Future of AI Technology​

Van der Made's enthusiasm was palpable when discussing Akida, BrainChip's flagship technology. Akida is not just another chip in the market; it embodies BrainChip's vision for the future of AI. Two of its standout features are its low energy consumption and its on-chip learning capabilities.
For engineers and tech enthusiasts, these features are significant. The low energy consumption means that Akida is not only efficient but also environmentally conscious, addressing a growing concern in today's tech-driven world. On the other hand, on-chip learning capabilities represent a leap in AI technology, allowing for faster, more efficient processing without the need for constant back-and-forth with centralised data centres.
In van der Made's view, Akida is set to redefine the AI industry. Its unique features position it not just as an innovative product but as a transformative solution that could shape the way we think about and implement AI in various applications.

Financial Initiatives and Market Presence​

To wrap up the podcast, Ken Scarince, BrainChip's Chief Financial Officer, touched upon the company's recent financial endeavours. He highlighted the company's agreement with LDA, aimed at raising capital, and BrainChip's strategic move to enhance its presence in the U.S. capital markets. Additionally, the company is bolstering its investor relations strategy, aiming to foster better communication with stakeholders and educate the market about BrainChip's groundbreaking offerings.

Conclusion​

As BrainChip continues to make strides in the AI domain, the company remains committed to keeping its audience informed and engaged. With a series of events lined up to showcase Akida's capabilities and an unwavering focus on innovation, BrainChip is undoubtedly poised to redefine the boundaries of AI. Stay up to date with BrainChip's new content here.
 
  • Like
  • Fire
  • Love
Reactions: 27 users
What about the partnership between Qualcomm and Prophesee? It's not only BrainChip working with Prophesee. I'm still waiting for a statement from BrainChip's side! DYOR
It's interesting that when you do a Google search on "AKIDA", one of the first results that comes up is:


However, there is no mention of AKIDA on their page?
 
  • Like
  • Thinking
  • Love
Reactions: 11 users

7für7

Regular
Modern neuromorphic processor architectures...PLURAL...Hmmm?????



[Quoted: "This Tiny Sensor Could Be in Your Next Headset", the Design News article posted in full above.]

They mentioned Sony, Bosch and Intel, but not BrainChip… why not BrainChip, if it were such groundbreaking technology for them? I would not bet on this horse! Still waiting and holding without hyping every article and announcement about AI. DYOR
 
  • Like
  • Haha
Reactions: 7 users

7für7

Regular
It's interesting that when you do a Google search on "AKIDA", one of the first results that comes up is:


However, there is no mention of AKIDA on their page?
Maybe Google have Akida inside 😂
 
  • Haha
Reactions: 1 users
Has anyone seen this:


[Quoted: the Wevolver write-up of the BrainChip podcast posted in full above, "BrainChip's leadership shares details on the company's development of the Akida chip, future prospects, and more..."]

Hi HG,

This is episode 10 of the podcast series, which is from quite some time ago, so it's a bit dated.

Q: are we related?

SG

😂
 
  • Like
  • Haha
Reactions: 11 users