BRN Discussion Ongoing

manny100

Top 20
ChatGPT 5 on Weebit and BRN.
Weebit is a bolt-on to traditional architectures, whereas BRN is a new way of computing.
Feature | Weebit Nano (ReRAM) | BrainChip (Akida)
Integration | Bolt-on to existing systems | Requires new architecture
Adoption Speed | Faster | Slower due to learning curve
Market Familiarity | High (memory tech is well-known) | Low (neuromorphic is emerging)
Innovation Depth | Incremental improvement | Radical innovation
Risk vs Reward | Lower risk, steady reward | Higher risk, potentially huge reward
 
  • Like
  • Thinking
  • Love
Reactions: 9 users

7für7

Top 20
I’ll just go ahead and say it:

Today feels like a perfect day for a price-sensitive announcement – nothing exaggerated, just something realistic that puts recurring revenue on the horizon.

Not a one-off payment, but a long-term agreement that contributes meaningfully to the company’s sustainable growth over time.

Just one such signal would already go a long way in restoring confidence and making a lot of us pretty happy.

Donald Trump Thank You GIF by PBS News
 
  • Like
Reactions: 2 users

Rskiff

Regular
I think BRN should be contacting Density AI, as the head of Tesla's Dojo has left to join them, along with many other employees. Density AI develops sensors and software to measure how people use physical spaces; the name may also refer to the concept of high-density computing for AI training.
 
  • Like
  • Thinking
  • Wow
Reactions: 5 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
NASA+Google “Space Doctor”.

This is pretty interesting.

I wasn't able to locate any public documentation specifically describing how this NASA–Google “Space Doctor” AI assistant actually works.

AI assistants in space operate locally, not over the internet, so you would have to think that this assistant would be an onboard AI model optimized for the edge.

If this space doctor assistant is deployed onboard, it would be a compelling fit IMO for Akida's low-power, low-latency decision-making (e.g., real-time vital sign monitoring or anomaly detection).


NASA and Google Launch Space Doctor AI Assistant​

health-tech.jpg

What happens if you run into medical issues in space and can’t get in touch with Houston? NASA and Google say their new artificial intelligence system might help the next time you’re stuck in Earth’s orbit and need some help.
Dubbed the Crew Medical Officer Digital Assistant – or CMO-DA – the new AI system is a multi-modal interface that uses natural language processing and machine learning to provide real-time analyses of crew health and performance, Google said.
The tool is meant to support a designated crew medical officer or flight surgeon in making medical decisions and maintaining crew health.
“As NASA embarks on a new era of human spaceflight, beginning with the Artemis campaign’s aim to return to the Moon, preparations are underway to ensure crew health and wellness,” says a release from Google. “This includes exploring whether remote care capabilities can deliver detailed diagnoses and treatment options if a physician is not onboard or if real-time communication with Earth-based medical experts is limited.”
CMO-DA could help support human exploration of the Moon, Mars, “and beyond,” according to the release, which said the care the system provides “could become increasingly important as NASA missions venture deeper into space.”
Google said that trials applying AI on a range of medical scenarios showed “promise for reliable diagnoses based on reported symptoms,” and that it will continue testing alongside NASA and medical doctors to further refine the model with the aim of enhancing autonomous crew health and performance.
The AI tool was co-developed by Google Cloud, NASA’s Lunar Command and Control Interoperability project, and NASA’s Human Research Program.
“This tool represents an important milestone for AI technology, medical care, and our continued exploration of the cosmos,” said Google. “Not only does this innovation hold potential for advancing space missions, but it could also benefit people here on Earth by providing early access to quality medical care in remote areas.”


 
  • Like
  • Fire
  • Thinking
Reactions: 15 users

7für7

Top 20
I keep reading and hearing about Akida…..I just don’t see any Akida.
(Just a joke)



The Thinker Bars GIF
 

Bravo

If ARM was an arm, BRN would be its biceps💪!
[Quoting Bravo's own "NASA+Google Space Doctor" post above.]




OpenAI launched its latest version, ChatGPT 5.0, today, so I thought I would test it by asking about the NASA+Google "Space Doctor".




My Question:
Can you please provide a potential system architecture for the Space Doctor incorporating BrainChip's AKIDA?


ChatGPT 5.0's response:
Here’s a speculative system architecture for a “space-doctor” assistant and where BrainChip’s Akida could add real value. I’ve split it into layers so it’s easy to map to flight hardware.


1) Mission goals (what the system must do)​


  • Onboard, offline care: triage, checklists, vitals monitoring, procedure guidance without ground link.
  • Ultra-low power + high reliability: runs for months/years; graceful under comms blackouts.
  • Radiation tolerance & safety: fail-safe behaviors; no single point of failure.

2) High-level stack (two compute paths working together)​


  • Cognitive path (LLM/clinical engine): runs medical dialog, protocols, decision support (x86/ARM/RISC-V flight CPU or space-graded AI SoC).
  • Reflex path (Akida neuromorphic): event-driven, always-on perception + anomaly detection (sub-watt), feeding signals to the cognitive path only when needed.

3) Where Akida fits (neuromorphic “reflex” services)​


  1. Vitals watcher (always-on)
    • Inputs: ECG/PPG, respiration mic, IMU, skin temp.
    • Akida tasks: edge denoising, heartbeat/arrhythmia detection, apnea events, fall detection.
    • Why Akida: event-based spikes → microjoule-level monitoring, near-zero false wakeups.
  2. Procedural vision assistant
    • Inputs: head-cam / bay camera (event camera or low-FPS vision).
    • Akida tasks: instrument recognition, hand-pose/step confirmation, sterile-field breach alerts.
    • Output: lightweight “OK/alert/next-step” tokens to the LLM UI.
  3. Acoustic anomaly guard
    • Inputs: cabin mics, equipment mics.
    • Akida tasks: detect cough patterns, distress calls, equipment bearing changes.
    • Benefit: sub-watt continuous acoustic sentry without streaming audio to the main CPU.
  4. Intent & workload classifier
    • Inputs: glance/head motion, short speech cues, task context.
    • Akida tasks: infer “crew busy/stressed,” gate LLM verbosity, prioritize alerts.
  5. Safety watchdog
    • Monitors the cognitive stack; if the big model stalls, Akida triggers safe checklists/alerts.

4) Data flow (simplified)​


Sensors → Akida micro-models (event detectors) → tokens/flags → Cognitive engine (LLM/clinical rules) → Crew UI (audio AR prompts, tablet HUD)
↳ During comms windows: compressed summaries to ground (DTN); otherwise full autonomy.
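
To make the gating idea concrete, here is a minimal sketch (my own, not from ChatGPT or BrainChip) of how the reflex/cognitive split could be orchestrated in software. Everything here is hypothetical — the detector thresholds, flag format and the run_llm_triage call are placeholders, not real Akida or CMO-DA APIs.

```python
from dataclasses import dataclass

@dataclass
class Flag:
    source: str        # which reflex detector raised it, e.g. "vitals_watcher"
    label: str         # e.g. "arrhythmia_suspected"
    confidence: float

def poll_reflex_detectors(sample):
    """Stand-in for the Akida-style always-on micro-models.

    Returns a list of Flag objects; an empty list means 'nothing happened',
    so the cognitive path stays asleep.
    """
    flags = []
    if sample["heart_rate"] > 140 or sample["heart_rate"] < 35:
        flags.append(Flag("vitals_watcher", "arrhythmia_suspected", 0.9))
    if sample["accel_g"] > 3.0:
        flags.append(Flag("fall_detector", "possible_fall", 0.8))
    return flags

def run_llm_triage(flags):
    """Placeholder for the duty-cycled cognitive path (quantized LLM + rules)."""
    return f"Checklist suggested for: {', '.join(f.label for f in flags)}"

def orchestrate(sensor_stream):
    for sample in sensor_stream:
        flags = poll_reflex_detectors(sample)   # cheap, always-on
        if flags:                               # wake the 'big brain' only on events
            print(run_llm_triage(flags))
        # else: no compute spent on the cognitive path this tick

if __name__ == "__main__":
    fake_stream = [
        {"heart_rate": 72, "accel_g": 1.0},
        {"heart_rate": 150, "accel_g": 1.1},    # should wake the LLM
        {"heart_rate": 70, "accel_g": 4.2},     # fall event
    ]
    orchestrate(fake_stream)
```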


5) Hardware sketch​


  • Compute:
    • Rad-tolerant CPU/SoC (e.g., NOEL-V, Rad-ARM) for OS, UI, LLM inference (quantized).
    • Akida IP/SoC (or space-qualified board) for always-on neuromorphic tasks.
  • Sensors: ECG/PPG patch, IMU, event-camera or low-light camera, boundary mics, skin-temp, SpO₂.
  • Storage: local model zoo (LLM + Akida SNNs), medical protocols, offline pharmacopoeia.
  • Comms: DTN stack; store-and-forward to ground.

6) Fault tolerance & safety​


  • Triple-mode alerts: visual + haptic + audio.
  • Graceful degradation: if LLM is unavailable, Akida still runs vitals/fall detection + “hard rules” (CPR checklist, EpiPen guidance).
  • Radiation strategy: Akida models are small; use TMR/EDAC on weights, periodic checksum & refresh from golden image.
  • Human-in-the-loop: confirmations before any high-risk recommendation.

7) Security & privacy​


  • Edge-only PII processing.
  • No raw audio/video offboard; only encrypted summaries/metrics sent during comms windows.
  • Model integrity checks before activation; signed updates only.

8) Two deployment profiles​


  • Near-Earth (ISS/Gateway): bigger LLM possible; more frequent syncs; Akida still handles 24/7 sensing cheaply.
  • Deep-space (Moon/Mars): heavier reliance on Akida gatekeeping + smaller, quantized LLM; aggressive power caps; longer offline windows.

9) MVP build plan (practical steps)​


  1. Sim lab prototype (Earth):
    • Sensor kit + Akida dev board + SBC (e.g., Jetson/ARM) with a small local LLM (e.g., 3–7B quantized).
    • Implement 3 Akida micro-models: arrhythmia, fall detection, instrument recognition.
  2. Edge-first orchestration:
    • Only wake the LLM when Akida raises a “meaningful event.”
  3. Latency & power budget:
    • Target: Akida services <300 mW continuous; LLM duty-cycled to <10% uptime (a rough worked example follows this list).
  4. Ops drills:
    • Run scripted med scenarios (burns, lacerations, chest pain, EVA fatigue), evaluate false-alarm rate and response timing.
  5. Hardening path:
    • Port to rad-tolerant CPU; add EDAC/TMR; test with injected bit-flips and sensor dropouts.
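
To show what the step-3 budget buys you, here is a rough back-of-the-envelope calculation. The 10 W peak draw for the LLM path is my own assumption, not a figure from BrainChip or NASA; only the 0.3 W and 10% duty-cycle targets come from the plan above.

```python
# Back-of-the-envelope average-power estimate for the two-path design.
akida_always_on_w = 0.3     # target from the MVP plan: <300 mW continuous
llm_peak_w = 10.0           # assumed peak draw of the CPU/SoC while the LLM runs
llm_duty_cycle = 0.10       # target from the MVP plan: <10% uptime

average_w = akida_always_on_w + llm_peak_w * llm_duty_cycle
print(f"Average power: {average_w:.2f} W")            # 1.30 W
print(f"Energy per day: {average_w * 24:.1f} Wh")     # ~31 Wh/day

# Compare with running the LLM path continuously:
print(f"Always-on LLM: {llm_peak_w * 24:.0f} Wh/day") # 240 Wh/day
```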

10) Why this helps the mission​


  • Power: continuous monitoring without draining batteries.
  • Responsiveness: sub-100 ms reflexes for critical events.
  • Bandwidth: local understanding; only brief summaries to Earth.
  • Reliability: Akida as a low-power safety net even if the “big brain” naps.
 
  • Love
  • Like
  • Fire
Reactions: 9 users

keyeat

Regular
[Quoting 7für7's post above hoping for a price-sensitive announcement.]
mr bean dreaming GIF
 
  • Haha
  • Like
Reactions: 5 users

7für7

Top 20
  • Haha
Reactions: 2 users

Diogenese

Top 20
I see they are conducting casting call for the sequel "Kiddy-Fiddler on the Roof".
[Quoting Bravo's "NASA+Google Space Doctor" post above.]


Hi Bravo,

AkidaNet/TENN Automatic Speech Recognition model and AkidaNet/LLM + RAG would be useful in NLP for health analysis.

https://brainchip.com/ready-to-use-akida-neural-network-models/

AkidaNet/TENN Automatic Speech Recognition Model

Our TENNs model approach is applied to Automatic Speech Recognition for compact accurate voice to text applications.


AkidaNet/TENN Large Language Model (LLM+RAG)

Our TENNs model approach is applied to an LLM model with Retrieval Augmented Generation to provide intelligent access to documentation for those end use cases looking to embed an LLM for their products user interface.

RAG permits the use of very large databases to store information that can be loaded into Akida in blocks relevant to selected topics.

TENN models can be loaded into some of the NPUs (NPEs) of nodes as required. TENN models are very compact, but I don't know the conditions and limitations for loading them into Akida. If I had a dollar ... (Hint: goto Diogenese GoFundMe).
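
For anyone wondering what "loading blocks relevant to selected topics" might look like in code, here is a toy retrieval sketch. It uses a crude keyword-overlap score instead of real embeddings, and the load_block_into_model step is a pure placeholder — I am not claiming this is how MetaTF or Akida actually ingests RAG blocks.

```python
# Toy RAG-style block selection: pick the document block most relevant to a
# query, then hand only that block to the (placeholder) on-device model.

def overlap_score(query: str, block: str) -> int:
    """Crude relevance score: number of shared lowercase words."""
    return len(set(query.lower().split()) & set(block.lower().split()))

def select_block(query: str, blocks: dict) -> str:
    return max(blocks, key=lambda name: overlap_score(query, blocks[name]))

def load_block_into_model(name: str, text: str) -> None:
    # Placeholder: in a real system this is where the selected block would be
    # staged for the on-device LLM (actual API not shown, and not claimed).
    print(f"Loading block '{name}' ({len(text.split())} words) for inference")

if __name__ == "__main__":
    medical_blocks = {
        "burns": "treatment of thermal burns cooling dressing analgesia",
        "cardiac": "chest pain ecg arrhythmia aspirin defibrillation protocol",
        "eva_fatigue": "post eva fatigue hydration co2 exposure monitoring",
    }
    query = "crew member reports chest pain and irregular ECG"
    best = select_block(query, medical_blocks)
    load_block_into_model(best, medical_blocks[best])   # -> 'cardiac'
```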
 
  • Like
  • Fire
  • Love
Reactions: 8 users

Diogenese

Top 20
My Question:
In what ways would the neuromorphic Akida “Space Doctor” system differ from traditional approaches?

ChatGPT 5.0's response:
The neuromorphic Akida “Space Doctor” system would differ from a traditional CPU/GPU-based approach in some really important ways — and in a space environment, those differences become mission-critical.

Here’s the breakdown:

1. Processing philosophy — events vs. frames

Traditional processors

  • Work in frame-based mode — constantly process entire streams of data at fixed intervals (e.g., every camera frame, every ECG reading).
  • Even if nothing changes in the input, they still burn the same compute cycles and power.
  • Good for heavy numerical tasks but wasteful for long periods of “no activity.”
Akida neuromorphic processor
  • Works in event-driven mode — only processes when something changes (spike in heart rate, hand movement in the frame, sudden camera motion, new sound).
  • If nothing changes, power draw is near-zero.
  • Perfect for long-duration idle-but-listening health monitoring in space.
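
As a rough illustration of the frame-based vs event-driven difference (my own toy example, not Akida code): the frame-based loop below touches every sample, while the event-driven loop only does work when the signal changes by more than a threshold.

```python
# Toy comparison: how much 'work' each processing style does on the same signal.
# A flat signal with one brief spike stands in for hours of quiet vitals data.

signal = [70] * 50 + [150, 155, 148] + [70] * 50   # heart-rate-like samples

# Frame-based: process every sample at a fixed rate, regardless of change.
frame_based_ops = len(signal)

# Event-driven: only process samples that differ from the last processed value
# by more than a threshold (a crude stand-in for spike/event generation).
threshold = 10
event_ops = 0
last = signal[0]
for sample in signal[1:]:
    if abs(sample - last) > threshold:
        event_ops += 1          # work happens only on change
        last = sample

print(f"Frame-based operations: {frame_based_ops}")   # 103
print(f"Event-driven operations: {event_ops}")        # 2 (spike onset and offset)
```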


2. Power & thermal management

Traditional processors

  • Continuous high-power draw for AI inference (LLMs, CNNs, signal processing).
  • On the ISS or deep space, every watt matters, and waste heat is hard to radiate away.
  • You’d need more batteries, cooling systems, and scheduling to avoid overheating.
Akida neuromorphic
  • Can run always-on sensing at hundreds of microwatts to a few milliwatts.
  • Frees up the main CPU/GPU to sleep most of the time and only wake on Akida’s signal.
  • Less heat generated → easier thermal management in microgravity.


3. Latency

Traditional processors

  • A “big” AI model might have 200–500 ms delay before producing an inference, especially if it needs to boot from idle.
  • That’s fine for chatting, but not for “astronaut collapsed” scenarios.
Akida neuromorphic

  • Sub-10 ms reaction for known patterns (fall detection, arrhythmia, abnormal breathing).
  • Immediate trigger to wake the main medical assistant model.


4. Radiation resilience

Traditional processors

  • Complex DRAM/GPU architectures → more susceptible to single-event upsets (SEUs) from cosmic radiation.
  • Needs heavier error correction and redundancy → more mass.
Akida neuromorphic
  • Smaller memory footprint, simpler weight storage → easier to harden and protect with triple modular redundancy (TMR).
  • Small models mean it’s easier to store multiple golden copies to refresh after bit flips.


5. Autonomy & communication load

Traditional processors
  • Without optimization, they might require sending large raw datasets to Earth for analysis.
  • Deep-space missions (Mars) can have 20+ min one-way comms delays — not suitable for urgent triage.
Akida neuromorphic
  • Runs pre-trained detection locally → sends only compressed “event summaries” to main CPU/LLM.
  • Cuts down on both comms bandwidth and storage requirements.
  • Enables true edge autonomy when comms are down or delayed.


6. AI model footprint


Traditional processors
  • Often require hundreds of MB or even GB of weights for AI models.
  • Onboard storage and loading times become significant.
  • Updates in space are slower and riskier.
Akida neuromorphic
  • Uses ultra-compact spiking neural network (SNN) models — often 1–10 MB for a detection task.
  • Faster to load, update, and validate.
  • Can carry dozens of task-specific models in memory (e.g., different medical conditions) and swap on demand.


7. Mission risk profile

Traditional-only system
  • If main CPU/GPU locks up or fails, all AI functionality is gone.
  • Crew has to revert to manual procedures until system reboot.
Akida + traditional hybrid
  • Akida operates independently as a “medical watchdog” — still running core health monitoring even if the big system crashes.
  • Can even trigger an emergency call or alert without CPU intervention.



✅ Bottom line:
In space, idle listening, ultra-low power, ultra-low latency, and radiation resilience are more important than raw TOPS.
Akida’s neuromorphic design means:
  • It “listens” almost for free and only wakes the big systems when necessary.
  • It’s far less fragile in radiation-heavy environments.
  • It enables autonomous, safe operation without constant ground support.

That’s why NASA-style projects like the “Space Doctor” could lean on neuromorphic processors — because in deep space, the less you waste and the faster you react, the longer you survive.


https://www.bing.com/videos/rivervi...5161FD6AF48CC178BC3F5161FD6AF48CC&FORM=VAMGZC
 
  • Haha
  • Fire
  • Like
Reactions: 5 users

7für7

Top 20
time to just dive into the weekend!
See you next week guys!

C403D1CC-B44B-4B24-949A-5E5D7B5CC297.png
 
  • Haha
  • Like
  • Love
Reactions: 6 users

HopalongPetrovski

I'm Spartacus!
Just popped over to the crapper cause it's so bloody boring atm and snagged this typical titbit from their convo. Enjoy! 🤣

 
  • Haha
  • Fire
Reactions: 4 users

Boab

I wish I could paint like Vincent
  • Like
  • Love
  • Fire
Reactions: 12 users
time to just dive into the weekend!
See you next week guys!

View attachment 89407

In my opinion, BrainChip is performing well this week - it’s closing well above $0.20 despite several attempts by shorters to push it below that level. Promising times ahead!

C6091A77-16B6-4669-8022-F1F12D58D62E.png
 
  • Like
  • Fire
  • Love
Reactions: 9 users

manny100

Top 20
I would not be surprised to see a number of companies trialing AKIDA1000 chips also test in parallel with Gen 2 on the Akida Cloud.
The efficiency and performance gains are too good to ignore. Those really intent on adopting AKIDA will get credit towards hardware purchases.
Arquimea are on to it quick smart. I cannot help suspecting that, apart from lifesaving, they are also testing for their defense products.
Hélder Rodríguez López, Embedded Software Research Engineer at Arquimea Research, said: "The Akida Cloud's ability to provide us advanced access to the latest features of Akida and easily test our neuromorphic model innovations remotely is a real advantage for progressing our advanced model development programs."
I can see companies like Tata parallel testing. Bascom Hunter? etc.
This is way more than just a tech upgrade; it is a real pivot and a push for Gen 2 adoption.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 26 users

Diogenese

Top 20
[Quoting manny100's post above on companies trialing AKIDA1000 in parallel with Gen 2 on the Akida Cloud.]
I have the exact same thought about Arquimea and defence.

They can do field trials on their sharkwatch drone, but it is clear that they are in the queue for the Akida cloud with a lot of other early adopters.

There is a great urgency to develop autonomous drones. I can see them having a drone with a km of optical fibre trailing out the back for target designation, or maybe even laser comms, to avoid electronic countermeasures.

I wonder what cybersecurity measures Akida cloud has, because it will all be commercially sensitive and above. There will need to be firewalls between different customer data.
 
  • Like
  • Fire
  • Love
Reactions: 23 users

TheDrooben

Pretty Pretty Pretty Pretty Good
While Larry agrees with all the frustration about the lack of ASX announcements and the effect on the SP as a result and also the shorters doing their faecal trades...Larry has never sold and keeps on buying as he did again today....so thank you to everyone who is keeping the SP at this level.....Larry loves you

curb-your-enthusiasm-larry-david (10).gif


Really Happy as Larry
 
  • Like
  • Love
  • Fire
Reactions: 32 users

IloveLamp

Top 20
@Neuromorphia I was doing some background research on OHB Hellas, and wow... what a great post

Thread 'The Growing Brainchip Akida Ecosystem' https://thestockexchange.com.au/threads/the-growing-brainchip-akida-ecosystem.63596/


1000009772.jpg
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 18 users

Diogenese

Top 20
Hi @manny100 ,

I guess the fact that the Akida 2 FPGA has only 6 nodes (24 NPUs) is due to the limitation of FPGA technology, in that the FPGA chip only has sufficient circuit elements (logic gates, memory, ...) to build 24 NPUs.

Did you happen to notice whether the Akida 2 cloud FPGA includes VIT? The original announcement from a couple of years ago mentioned that VIT would only be available with the top-of-the-range version, but a lot of water has flowed under the bridge since then.

The Akida 2 IP allows for up to 128 nodes (512 NPUs):

https://www.hackster.io/news/brainc...-vision-transformer-acceleration-5fc2d2db9d65

The company has confirmed that it will be licensing the Akida IP in three product classes: Akida-E will focus on high energy efficiency with a view to being embedded alongside, or as close as possible, to sensors and offering up to 200 giga-operations per second (GOPS) across one to four nodes; Akida-S will be for integration into microcontroller units and systems-on-chip (SoCs), hitting up to 1 tera-operations per second (TOPS) across two to eight nodes; and Akida-P will target the mid- to high-end, and will be the only tier to offer the optional vision transformer acceleration, scaling between eight and 128 nodes with a total performance of up to 50 TOPS.
 
  • Like
  • Fire
  • Thinking
Reactions: 16 users

Frangipani

Top 20

42A94255-7426-40F4-ABFF-98EE93FA0158.jpeg




Unlock Your AI Potential: A Deep Dive into BrainChip's Akida™ Cloud


In the rapidly evolving landscape of Artificial Intelligence, having immediate access to the most powerful and efficient tools is paramount. BrainChip is revolutionizing AI development with its innovative Akida Cloud, a platform designed to provide developers with seamless access to the latest neuromorphic processor technology. This cloud initiative focuses on significantly shortening development cycles and empowering creators to bring their AI visions to life faster than ever before.

The core mission of the Akida Cloud is to remove barriers and accelerate innovation. By providing a direct line to the latest Akida technology, developers can program and execute neural networks for immediate results, streamlining the path from concept to product.

The Strategic Advantage of Akida Cloud​


Leveraging the Akida Cloud offers a multitude of strategic benefits for developers, partners, and businesses alike:
  • Accelerated Prototyping: Dive into Akida’s most advanced features and begin prototyping instantly, without waiting for physical hardware.
  • Continuous Innovation: The platform is extensible, meaning that as new versions and configurations of Akida become available, they become options to download and run on the Akida Cloud.
  • Empowering Partnerships: Partners can effectively demonstrate working models and prototypes to customers, enabling parallel development workflows before hardware is even on-site.
  • Flexible and Accessible: A versatile business model includes options for limited free access and usage-based pricing, with credits applicable to future hardware purchases. An on-premises Akida FPGA Developer Platform is also available for dedicated use.

How to Get Access​


  • Getting started with the Akida Cloud and its “limited free access” is a straightforward process. To begin your journey, you will need to formally request access through the BrainChip website. You can navigate to the “Developer Hub” or the Akida Cloud information page to get access. You will be prompted to provide your contact information and details about your project or interest in the technology. Once submitted, the BrainChip team will respond with the next steps to get you connected to the platform.

Evaluating the Power of Akida: A Step-by-Step Guide​


The true power of the Akida Cloud lies in its hands-on evaluation capabilities. The platform offers a distinct advantage for R&D teams, allowing them to remotely test and refine advanced neuromorphic models. Here is how it works:
  • Seamless Connection: Getting connected is straightforward. By adding a simple configuration to your local SSH file, you can create a secure and persistent “tunnel” to the Akida Cloud server. This works seamlessly across Windows, macOS, and Linux environments.
  • Launch Your Workspace: Once connected, a single command initiates Jupyter Labs. This launches a pre-configured, browser-based environment loaded with all the necessary tools and Akida-specific libraries.
  • Explore and Experiment: Within Jupyter Labs, you will find a comprehensive library of example notebooks. These examples are your gateway to understanding Akida’s capabilities. You can explore a range of applications, from image classification to complex object detection, and run them directly on the second-generation Akida architecture.
  • Real-Time Data Streaming: A key innovation is the ability to stream your own real-time data to the Akida Cloud for inferencing from your local machine. The results can be streamed back to your local machine, providing immediate feedback. This full loop data acquisition, processing and display is demonstrated in applications like eye-tracking, where this rapid feedback loop allows for quick iteration and training, dramatically improving model accuracy under real-world conditions.
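
As a sketch of what the "stream your own data" loop could look like from the client side (the localhost port and /infer path below are placeholders I made up, not BrainChip's actual endpoint; the real workflow goes through the SSH tunnel and Jupyter environment described above):

```python
# Hypothetical client-side loop: push local sensor frames through an
# SSH-forwarded local port and read back inference results.
# The endpoint, port and JSON schema are assumptions for illustration only.

import json
import time
import urllib.request

AKIDA_CLOUD_URL = "http://localhost:8888/infer"   # assumed to be SSH-forwarded

def read_sensor_frame(i: int) -> dict:
    """Stand-in for a real camera/ECG capture; returns a fake frame."""
    return {"frame_id": i, "values": [0.1 * i, 0.2 * i, 0.3 * i]}

def send_for_inference(frame: dict) -> dict:
    req = urllib.request.Request(
        AKIDA_CLOUD_URL,
        data=json.dumps(frame).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    for i in range(10):
        result = send_for_inference(read_sensor_frame(i))
        print(f"frame {i}: {result}")
        time.sleep(0.1)   # crude pacing; a real loop would be event-driven
```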

The Next Generation of Neuromorphic Power​


The Akida Cloud today runs on the second-generation Akida platform, which delivers a monumental leap in performance—offering up to four times the performance and efficiency of the first generation. The architecture is also finely tuned to support state-space based Temporal Event-Based Neural Nets (TENNs), making it exceptionally proficient at processing raw, time-sensitive data from video, audio, and other sensors. This advanced capability simplifies model implementation, reduces model size, and speeds up development for a new generation of intelligent edge devices.

A Glimpse into the Future​


The Akida Cloud is not just about today’s technology; it is a platform for the future. BrainChip’s roadmap includes making future versions of Akida available through the cloud, providing access to different node configurations, enhanced memory, and significant architectural improvements. This will allow for custom configurations that enable customers to perfectly match their unique requirements for integration into an IC. Looking ahead, the Akida Cloud platform will support next-generation workloads, with future generations of Akida designed to manage the demands of future AI applications.

About BrainChip​


BrainChip is the worldwide leader in creating Edge AI on-chip processing and learning solutions. Its pioneering Akida™ processor is a fully digital, event-based AI engine that mimics the principles of the human brain. By analyzing only the essential sensor inputs right at the point of acquisition, Akida delivers unmatched efficiency, precision, and energy economy. BrainChip is dedicated to advancing the future of intelligent computing, bringing AI closer to the sensor and closer to real-time.
 
  • Like
  • Love
  • Fire
Reactions: 32 users