BRN Discussion Ongoing

MrRomper

Regular
Zinn Labs on GenX320
https://www.linkedin.com/posts/kevi...um?utm_source=share&utm_medium=member_desktop
1697486478819.png
 
Reactions: 22 users

MrRomper

Regular
Reactions: 38 users

charles2

Regular
Reactions: 7 users

IloveLamp

Top 20
Reactions: 28 users

Tuliptrader

Regular
Reactions: 31 users

MrNick

Regular
 
Reactions: 3 users
The big seller walls are back up to hold share price down and in place for as long as possible…resume manipulation programming.
 
Reactions: 10 users

Vladsblood

Regular
The big seller walls are back up to hold share price down and in place for as long as possible…resume manipulation programming.
Gotcha Fastback, outright morally criminal systemic activity sanctified by the ever-complying ASX throughout BrainChip's advances.
Especially since around the MB announcement. Vlad.
 
Reactions: 8 users
Gotcha Fastback, outright morally criminal systemic activity sanctified by the ever-complying ASX throughout BrainChip's advances.
Especially since around the MB announcement. Vlad.
Yes it sure is criminal by big insto players @Vladsblood

Instos know they are going to make a mint out of Brainchip in the medium to long term at these buy prices… they will hold here as long as they can.
 
Reactions: 6 users

7für7

Top 20
What about the partnership between Qualcomm and Prophesee? It's not only BrainChip working with Prophesee. I'm still waiting for a statement from BrainChip's side! dyor
 
Reactions: 6 users
Wow! What a flurry of activity!
Could it be as a result of the second generation release? No wait, that would take time to sync with the development cycle of these companies, so it must be built on those useless AKIDA 1000 and 1500 chip designs :ROFLMAO:.

UNLESS

the ecosystem partners have managed to really shorten the implementation cycle like they have been suggesting.🚀
Either way, great news for the 'latched-on barnacle' (LOB) BRN holders.
Exciting next 2 quarters in this LOB's opinion.
 
Reactions: 13 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Modern neuromorphic processor architectures...PLURAL...Hmmm?????



This Tiny Sensor Could Be in Your Next Headset​

Prophesee
PROPHESEE Event-Based Metavision GenX320 Bare Die 2.jpg

Neuromorphic computing company develops event-based vision sensor for edge AI apps.
Spencer Chin | Oct 16, 2023


As edge-based artificial intelligence (AI) applications become more common, there will be a greater need for sensors that can meet the power and environmental needs of edge hardware. Prophesee SA, which supplies advanced neuromorphic vision systems, has introduced an event-based vision sensor for integration into ultra-low-power edge AI vision devices. The GenX320 Metavision sensor, which uses a tiny 3x4mm die, leverages the company’s technology platform into growing intelligent edge market segments, including AR/VR headsets, security and monitoring/detection systems, touchless displays, eye tracking features, and always-on intelligent IoT devices.

According to Luca Verre, CEO and co-founder of Prophesee, the concept of event-based vision has been researched for years, but developing a viable commercial implementation in a sensor-like device has only happened relatively recently. “Prophesee has used a combination of expertise and innovative developments around neuromorphic computing, VLSI design, AI algorithm development, and CMOS image sensing,” said Verre in an e-mail interview with Design News. “Together, those skills and advancements, along with critical partnerships with companies like Sony, Intel, Bosch, Xiaomi, Qualcomm, 🤔and others 🤔have enabled us to optimize a design for the performance, power, size, and cost requirements of various markets.”

Prophesee's vision sensor is a 320x320, 6.3μm pixel BSI stacked event-based vision sensor that offers a tiny 1/5-in. optical format. Verre said, “The explicit goal was to improve integrability and usability in embedded at-the-edge vision systems, which in addition to size and power improvements, means the design must address the challenge of event-based vision’s unconventional data format, nonconstant data rates, and non-standard interfaces to make it more usable for a wider range of applications. We have done that with multiple integrated event data pre-processing, filtering, and formatting functions to minimize external processing overhead.”

Verre added, “In addition, MIPI or CPI data output interfaces offer low-latency connectivity to embedded processing platforms, including low-power microcontrollers and modern neuromorphic processor architectures.”

Low-Power Operation

According to Verre, the GenX320 sensor has been optimized for low-power operation, featuring a hierarchy of power modes and application-specific modes of operation. On-chip power management further improves sensor flexibility and integrability. To meet aggressive size and cost requirements, the chip is fabricated using a CMOS stacked process with pixel-level Cu-Cu bonding interconnects achieving a 6.3μm pixel-pitch.
The sensor performs low-latency, µs-resolution timestamping of events with flexible data formatting. On-chip intelligent power management modes reduce power consumption to as low as 36 µW and enable smart wake-on-events. Deep sleep and standby modes are also featured.

According to Prophesee, the sensor is designed to be easily integrated with standard SoCs with multiple combined event data pre-processing, filtering, and formatting functions to minimize external processing overhead. MIPI or CPI data output interfaces offer low-latency connectivity to embedded processing platforms, including low-power microcontrollers and modern neuromorphic processor architectures.
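As a rough illustration of what such event filtering can look like in software, here is a generic background-activity filter written from scratch in NumPy; it is not Prophesee's on-sensor implementation or the Metavision SDK, and the 2 ms window, field names and sample data are illustrative assumptions.

```python
import numpy as np

def background_activity_filter(events, width=320, height=320, dt_us=2000):
    """Keep an event only if some pixel in its 3x3 neighbourhood fired within the
    previous dt_us microseconds; isolated events are treated as sensor noise.
    A generic software stand-in for on-sensor event filtering, not Prophesee's filter."""
    last_ts = np.full((height + 2, width + 2), -np.inf)  # padded last-spike-time map
    keep = np.zeros(len(events), dtype=bool)
    for i, e in enumerate(events):
        x, y, t = int(e["x"]), int(e["y"]), int(e["t"])
        ys, xs = y + 1, x + 1                            # coordinates in the padded map
        recent = last_ts[ys - 1:ys + 2, xs - 1:xs + 2].max()
        keep[i] = (t - recent) <= dt_us                  # did a neighbour fire recently?
        last_ts[ys, xs] = t
    return events[keep]

# Usage on a tiny hand-made event array (fields: x, y, t in µs, polarity):
ev = np.array([(10, 10, 100, 1), (11, 10, 900, 1), (200, 5, 1000, 0)],
              dtype=[("x", np.uint16), ("y", np.uint16), ("t", np.uint64), ("p", np.uint8)])
print(len(background_activity_filter(ev)))  # prints 1: only the second event has a recent neighbour
```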

Prophesee’s Verre expects the sensor to find applications in AR/VR headsets. “We are solving an important issue in our ability to efficiently (i.e. low power/low heat) support foveated rendering in eye tracking for a more realistic, immersive experience. Meta has discussed publicly the use of event-based vision technology, and we are actively involved with our partner Zinn Labs in this area. XPERI has already developed a driver monitor system (DMS) proof of concept based on our previous generation sensor for gaze monitoring and we are working with them on a next-gen solution using GenX320 for both automotive and other potential uses, including micro expression monitoring. The market for gesture and motion detection is very large, and our partner Ultraleap has demonstrated a working prototype of a touch-free display using our solution.”

The sensor incorporates an on-chip histogram output compatible with multiple AI accelerators. The sensor is also natively compatible with Prophesee Metavision Intelligence, an open-source event-based vision software suite that is used by a community of over 10,000 users.
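To make the "histogram output" idea more concrete, here is a minimal NumPy sketch, written from scratch rather than taken from the Metavision SDK, of how a stream of (x, y, timestamp, polarity) events could be binned into fixed-interval 320x320 count images that a conventional AI accelerator could consume. The record layout, 10 ms bin width and synthetic data are all illustrative assumptions.

```python
import numpy as np

# Illustrative event record layout (x, y, timestamp in µs, polarity); this mirrors a
# generic event-camera format, not Prophesee's actual on-chip data structure.
EVENT_DTYPE = np.dtype([("x", np.uint16), ("y", np.uint16),
                        ("t", np.uint64), ("p", np.uint8)])

def events_to_histograms(events, width=320, height=320, bin_us=10_000):
    """Accumulate events into per-polarity count images ("histograms"), one frame per
    bin_us microseconds. Returns an array of shape (n_frames, 2, height, width)."""
    if len(events) == 0:
        return np.zeros((0, 2, height, width), dtype=np.uint16)
    t0 = int(events["t"].min())
    frame_idx = (events["t"].astype(np.int64) - t0) // bin_us
    n_frames = int(frame_idx.max()) + 1
    hist = np.zeros((n_frames, 2, height, width), dtype=np.uint16)
    # Each event increments one pixel counter in its time bin and polarity channel.
    np.add.at(hist, (frame_idx, events["p"], events["y"], events["x"]), 1)
    return hist

# Usage with synthetic events standing in for real sensor output:
rng = np.random.default_rng(0)
ev = np.zeros(5000, dtype=EVENT_DTYPE)
ev["x"] = rng.integers(0, 320, ev.size)
ev["y"] = rng.integers(0, 320, ev.size)
ev["t"] = np.sort(rng.integers(0, 50_000, ev.size))  # 50 ms of activity
ev["p"] = rng.integers(0, 2, ev.size)
print(events_to_histograms(ev).shape)  # (5, 2, 320, 320)
```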

Prophesee will support the GenX320 with a complete range of development tools for easy exploration and optimization, including a comprehensive Evaluation Kit housing a chip-on-board (COB) GenX320 module, or a compact optical flex module. In addition, Prophesee will offer a range of adapter kits that enable seamless connectivity to a large range of embedded platforms, such as an STM32 MCU, speeding time-to-market.

Spencer Chin is a Senior Editor for Design News covering the electronics beat. He has many years of experience covering developments in components, semiconductors, subsystems, power, and other facets of electronics from both a business/supply-chain and technology perspective. He can be reached at Spencer.Chin@informa.com.

 
Reactions: 36 users

Murphy

Life is not a dress rehearsal!
Berlin, the Disruptive's Substack article is one of the best overviews of the dilemma faced by cloud servers globally, what BRN solves and why, a glossary of terms a lay person needs to follow the story, where server farms/data centres are headed, what the edge is, a comparison of Arm and BRN, why BRN could possibly be as big as Arm, what differentiates BRN, a technical description of what Akida represents, why now is the time for BRN to begin making inroads into the AI scene, and more.

It is compelling and definitely a great article for your parents to read. I will say that it is probably the best overview of BRN that I have seen anywhere outside of this forum, so if you haven't had a look at it, do yourself a favour. And thanks @Berlinforever
What a great read for the layman or computer genius.
Every holder of BRN should read this.


If you don't have dreams, you can't have dreams come true!
 
Reactions: 18 users
Has anyone seen this:


BrainChip's leadership shares details on the company's development of the Akida chip, future prospects, and more in this informative podcast.​

Artificial Intelligence
- Edge AI
- Edge Processors
In a recent episode of the BrainChip podcast, a select group of the company's top executives gathered to discuss the future trajectory of artificial intelligence (AI) and the pivotal role BrainChip is set to play in this dynamic landscape. We highlight some of the key discussion points below but encourage you to listen to the whole episode here.

Leadership with Vision​

Rob Telson, serving as the Vice President of Sales and Marketing, is the driving force behind BrainChip's global outreach. With a keen understanding of market dynamics and a vision for BrainChip's expansive global presence, Telson has been instrumental in positioning the company as a leader in the AI industry.
Peter van der Made, the CEO and Co-founder of BrainChip, brings to the table a wealth of experience and a visionary approach. His emphasis on the importance of the company's Advanced Research Center in Perth showcases his commitment to pioneering the next wave of AI innovations. Van der Made's insights into the potential of Akida, BrainChip's flagship technology, highlight his forward-thinking approach to AI's future.
Anil Mankar, the Vice President of Product Development and also a Co-founder, offers a deep dive into the technical intricacies of BrainChip's operations. His insights into the production process, from chip manufacturing to rigorous testing, provide a glimpse into the meticulous steps BrainChip takes to ensure top-tier functionality and performance.
Lastly, Ken Scarince, the Chief Financial Officer of BrainChip, sheds light on the company's financial endeavours. His discussions on BrainChip's strategic financial initiatives, including capital raising agreements and efforts to bolster its presence in the U.S. capital markets, underscore the company's ambitions for growth and market dominance.

BrainChip's Evolution: A Glimpse into the Company's Global Strategy

To kick off the episode, the team outlines the company’s history, vision and future plans. BrainChip's commitment to being a global leader in the AI industry is evident in its expansive operational presence. With hubs in California, Perth, France, and India, the company has strategically positioned itself in key tech-centric locations. This global footprint not only facilitates diverse collaborations but also ensures that BrainChip remains at the pulse of AI advancements worldwide.

From Research to Production: A Strategic Pivot​

Historically, BrainChip has been synonymous with cutting-edge research in neuromorphic computing. Their dedication to pushing the boundaries of AI has positioned them as pioneers in the field. However, recognizing the vast commercial potential of their innovations and the industry's shifting dynamics, BrainChip is undergoing a transformation.
Telson emphasised this transition in the podcast, noting the company's pivot from being primarily research-driven to adopting a production-centric approach. For engineers, this shift signifies BrainChip's intent to translate their groundbreaking research into tangible, market-ready solutions.

Engaging the Tech Community​

Understanding the importance of effective communication in the tech world, especially among engineers and developers, Telson highlighted BrainChip's efforts to foster engagement. The company's podcast series, for instance, is more than just a marketing tool. It's a platform for knowledge sharing, offering insights into BrainChip's offerings, their vision for AI's future, and the technical intricacies that make their solutions stand out. Stay up to date with BrainChip activities by following their Wevolver profile.

The Advanced Research Center: A Beacon of Innovation​

Located in Perth, Australia, the Advanced Research Center is BrainChip's crown jewel. It's not just a research facility; it's a testament to the company's dedication to pushing the boundaries of what's possible in AI. While many in the industry focus on refining existing deep learning models, BrainChip's centre is already looking beyond, exploring the next frontier of AI innovations.

Akida: The Future of AI Technology​

Van der Made's enthusiasm was palpable when discussing Akida, BrainChip's flagship technology. Akida is not just another chip in the market; it embodies BrainChip's vision for the future of AI. Two of its standout features are its low energy consumption and its on-chip learning capabilities.
For engineers and tech enthusiasts, these features are significant. The low energy consumption means that Akida is not only efficient but also environmentally conscious, addressing a growing concern in today's tech-driven world. On the other hand, on-chip learning capabilities represent a leap in AI technology, allowing for faster, more efficient processing without the need for constant back-and-forth with centralised data centres.
In van der Made's view, Akida is set to redefine the AI industry. Its unique features position it not just as an innovative product but as a transformative solution that could shape the way we think about and implement AI in various applications.
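For readers who want a concrete picture of what learning at the edge (as opposed to retraining in a data centre) can mean, here is a toy sketch of an incremental nearest-class-mean classifier that updates itself from a handful of locally observed feature vectors. It is deliberately simplistic and is not BrainChip's actual on-chip learning algorithm; every name and number in it is an illustrative assumption.

```python
import numpy as np

class EdgeFewShotClassifier:
    """Toy on-device learner: keep a running mean feature vector per class and
    classify new samples by the nearest class mean. All updates happen locally,
    with no round trip to a centralised data centre."""

    def __init__(self):
        self.sums = {}    # class label -> summed feature vector
        self.counts = {}  # class label -> number of examples seen

    def learn(self, features, label):
        # One-pass local update from a single labelled example.
        if label not in self.sums:
            self.sums[label] = np.zeros_like(features, dtype=np.float64)
            self.counts[label] = 0
        self.sums[label] += features
        self.counts[label] += 1

    def predict(self, features):
        if not self.sums:
            return None
        means = {c: s / self.counts[c] for c, s in self.sums.items()}
        return min(means, key=lambda c: np.linalg.norm(features - means[c]))

# Usage with made-up 8-dimensional feature vectors for two "keywords":
rng = np.random.default_rng(1)
clf = EdgeFewShotClassifier()
for _ in range(3):                                  # a few shots per class
    clf.learn(rng.normal(0.0, 0.1, 8), "keyword_A")
    clf.learn(rng.normal(1.0, 0.1, 8), "keyword_B")
print(clf.predict(rng.normal(1.0, 0.1, 8)))         # expected: keyword_B
```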

Financial Initiatives and Market Presence​

To wrap up the podcast, Ken Scarince, BrainChip's Chief Financial Officer, touched upon the company's recent financial endeavours. He highlighted the company's agreement with LDA, aimed at raising capital, and BrainChip's strategic move to enhance its presence in the U.S. capital markets. Additionally, the company is bolstering its investor relations strategy, aiming to foster better communication with stakeholders and educate the market about BrainChip's groundbreaking offerings.

Conclusion​

As BrainChip continues to make strides in the AI domain, the company remains committed to keeping its audience informed and engaged. With a series of events lined up to showcase Akida's capabilities and an unwavering focus on innovation, BrainChip is undoubtedly poised to redefine the boundaries of AI. Stay up to date with BrainChip's new content here.
 
Reactions: 27 users
What about the partnership between Qualcomm and Prophesee? It's not only BrainChip working with Prophesee. I'm still waiting for a statement from BrainChip's side! dyor
It's interesting that when you do a Google search on "AKIDA", one of the first results that comes up is:


However, there is no mention of AKIDA on their page?
 
Reactions: 11 users

7für7

Top 20
Modern neuromorphic processor architectures...PLURAL...Hmmm?????

They mentioned Sony, Bosch, Intel but not BrainChip… why not BrainChip, if it is supposed to be groundbreaking technology for them? I would not bet on this horse! Still waiting and holding, without hyping every article and announcement about AI. dyor
 
Reactions: 7 users

7für7

Top 20
It's interesting that when you do a Google search on "AKIDA", one of the first results that comes up is:


However, there is no mention of AKIDA on their page?
Maybe Google have Akida inside 😂
 
Reactions: 1 users
Has anyone seen this:


BrainChip's leadership shares details on the company's development of the Akida chip, future prospects, and more in this informative podcast.​

Hi HG,

This is episode 10 of the podcast series, which was quite some time ago, so it's a bit dated.

Q: are we related?

SG

😂
 
Reactions: 11 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
October 16, 2023

Neuromorphic computing next key trend to watch for after generative AI​

GlobalData predicts that the primary functions of neuromorphic computing will be in more human-like tasks than generative AI.
By Isaac Hanson

Neuromorphic-Computing.jpg
Neuromorphic Computers are modeled on the human brain. Credit: Shutterstock.


Neuromorphic computing may be the next key development following on from generative AI, it was suggested on a recent GlobalData webinar.

The webinar, “What are the game-changing innovations in AI after Generative AI?”, explored the disruptive AI innovations of the future, their potential impacts and the companies that are at the forefront of the new innovations.

Generative AI​

Generative AI is one of a number of recent AI breakthroughs, with tools like ChatGPT making these technologies available to a mass audience. Such platforms allow users to enter prompts in plain language and can answer questions, write computer code and even whole articles in response.

ChatGPT reached 100 million users in two months, faster than TikTok, Netflix or Spotify, and boasts an average use time of 8-10 minutes, similar to that of Facebook or YouTube. The appeal of generative AI goes far beyond general public use cases, though.

NASA is utilising the technology to build spaceship parts, and NVIDIA is pioneering its use in drug research and development. It is, however, limited in scalability, as current AI tools take large amounts of computing power to operate. This means that the servers processing requests are often off-site due to the space requirements, leading to latency in responses. It also leads to extremely high energy usage, which is both expensive and environmentally harmful.

So, What’s Next?​

One of the solutions to these problems that has been gaining traction recently is a move towards neuromorphic computing. This is a way of building computers modelled on the human brain. Though perhaps philosophically reductive, the brain can be abstracted on a technical basis into a collection of computation units (neurons) connected by fast-access local memory (synapses).

Building a computer system in a similar way can increase the density of computing power, meaning much lower energy costs and a significantly smaller storage space. The human brain draws around 12W of power on a continuous basis, several times lower than even a laptop computer’s 60W. It is also able to respond to stimuli in real-time, growing and rearranging connections between neurons as it is exposed to new information.
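To make the neuron-and-synapse abstraction above a little more tangible, here is a toy leaky integrate-and-fire neuron in NumPy. The time constant, threshold and input statistics are arbitrary illustrative values rather than parameters of any real neuromorphic chip; the point is simply that computation is driven by sparse spike events instead of dense arithmetic.

```python
import numpy as np

def lif_neuron(input_spikes, weights, tau=20.0, threshold=1.0, dt=1.0):
    """Toy leaky integrate-and-fire neuron: the membrane potential decays with time
    constant tau, each incoming spike adds its synaptic weight, and the neuron emits
    a spike (and resets) when the potential crosses the threshold."""
    n_steps, _ = input_spikes.shape
    v = 0.0
    out = np.zeros(n_steps, dtype=np.uint8)
    decay = np.exp(-dt / tau)
    for step in range(n_steps):
        v = v * decay + float(input_spikes[step] @ weights)  # leak, then integrate events
        if v >= threshold:
            out[step] = 1
            v = 0.0                                          # reset after firing
    return out

# Usage: 100 time steps, 10 input synapses, ~5% of inputs active per step.
rng = np.random.default_rng(42)
spikes_in = (rng.random((100, 10)) < 0.05).astype(np.float32)
w = rng.uniform(0.1, 0.5, 10)
spikes_out = lif_neuron(spikes_in, w)
print(int(spikes_in.sum()), "input spikes ->", int(spikes_out.sum()), "output spikes")
```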

Whilst the technology is currently far from the 100 billion neurons found in the human brain, Intel has developed a neuromorphic computing board with 100 million neuron-equivalent nodes. All of this power fits inside a chassis the size of five standard servers.

What can neuromorphic computing do?​

This kind of computing will not necessarily be useful for the same applications as generative AI, at least in the beginning. GlobalData predicts that the primary functions of neuromorphic computing will be in more human-like tasks: improving the connectivity between prosthetics and human brains, improving autonomous vehicles’ driving and improving customer service.

In the long term, this technology will likely see greater integration with generative AI and neural networks, as has already begun with IBM’s TrueNorth chip. TrueNorth features 1 million digital neurons connected by 256 million digital synapses, with the capacity for neural network integration in order to allow AI models to learn more rapidly and power efficiently.

Job postings for roles in neuromorphic computing have accelerated since mid-June of 2021, and 2022 saw an increase in senior postings in the field compared to previous years. Intel and IBM are unsurprisingly two of the largest hirers (first and third respectively), alongside Ericsson (second) and HP (fourth).

Whether the technology focuses on generative AI integration or charts its own course, there is no doubt that neuromorphic computing will be a key factor in shaping the future of technology.



 
Reactions: 29 users