BRN Discussion Ongoing

Bravo

If ARM was an arm, BRN would be its biceps💪!
In this graph of the valley of death, BrainChip is in the final stage of coming out of the valley.

This is, however, only the first of the five valleys of death for a tech start-up 😅

I personally believe that if we clear this first valley, it will be because of expanding IP adoption and royalties. I predict this will start occurring mid-2024 and increase exponentially from 2025 onwards.

The reason I believe this is BrainChip's established ecosystem of partners, its technological position in the edge market, and the timing of its relevance in the current and evolving market.

In other words… BrainChip has a lot of friends, and it has a revolutionary technology that solves an ever-pressing problem in moving AI forward.

Because of this, I believe that if BrainChip can survive this first valley, the market will move it straight past the other four typical valleys explained in this article.

It is of course not financial advice and I will forever be an optimist ☺️

Happy Easter everyone and Gods favour on all of us.

Please, for the love of goshkins, could everyone who has questions surrounding the “valley of death” read this post.


And, what’s more, if you don’t believe Dodgy-knee’s post, you can listen to the actual podcast and decide for yourself!



Screenshot 2024-04-01 at 11.57.40 pm.png
 
  • Like
  • Love
  • Fire
Reactions: 28 users

Frangipani

Top 20
New interview with our CMO Nandan Nayampally:


IMG_2177_head.jpg

INTERVIEWS · TECH
April 1, 2024 · 6 min read

BRAINCHIP, MAKING AI UBIQUITOUS​

BrainChip is the worldwide leader in on-chip edge AI processing and learning technology that enables faster, more efficient, secure, and customizable intelligent devices untethered from the cloud. The company’s first-to-market neuromorphic processor, Akida™, mimics the human brain, the most efficient inference and learning engine known, to analyze only essential sensor inputs at the point of acquisition, executing only the necessary operations and therefore processing data with unparalleled efficiency and precision. This supports a distributed intelligence approach that keeps machine learning local to the chip, independent of the cloud, dramatically reducing latency while simultaneously improving privacy and data security.

The Akida neural processor is designed to provide a complete ultra-low power Edge AI network processor for vision, audio, smart transducers, vital signs and, broadly, any sensor application.

BrainChip’s scalable solutions can be used standalone or integrated into systems on chip to execute today’s models and future networks directly in hardware, empowering the market to create much more intelligent, cost-effective devices and services universally deployable across real-world applications in connected cars, healthcare, consumer electronics, industrial IoT, smart agriculture and more, including use on a space mission and under the most stringent conditions.

BrainChip is the foundation for cost-effective, fanless, portable, real-time Edge AI systems that can offload the cloud, reducing the rapid growth in the carbon footprint of datacenters. In addition, Akida’s unique capability to learn locally on device also reduces retraining of models in the cloud, whose skyrocketing cost is a barrier to the growth of AIoT.


Interview with Nandan Nayampally, CMO at BrainChip.

Easy Engineering: What are the main areas of activity of the company?

Nandan Nayampally:
BrainChip is focused on AI at the Edge. The vision of the company is to make AI ubiquitous. Therefore, the mission for the company is to enable every device to have on-board AI acceleration, the key to which is extremely energy-efficient yet performant neural network processing. The company has been inspired by the human brain – the most efficient inference and learning engine – to build neuromorphic AI acceleration solutions. It delivers this as IP that can be integrated into customers’ Systems on Chip (SoCs). To achieve this, BrainChip has built a very configurable, event-based neural processor unit that is extremely energy-efficient and has a small footprint. It is complemented by BrainChip’s model compilation tools in MetaTF™ and its silicon reference platforms, which customers can use to develop initial prototypes and then take to market.

BrainChip continues to invest heavily in next-generation neuromorphic architecture to stay ahead of the current AI landscape, to democratize GenAI, and to pave the path to Artificial General Intelligence (AGI).

E.E: What’s the news about new products/services?

N.N:
Built in collaboration with VVDN Technologies, the Akida Edge Box is designed to meet the demanding needs of retail and security, Smart City, automotive, transportation and industrial applications. The device combines a powerful quad-core CPU platform with Akida AI accelerators to provide a huge boost in AI performance. The compact, light Edge box is cost-effective and versatile, with built-in Ethernet and Wi-Fi connectivity, HD display support, extensible storage and USB interfaces. BrainChip and VVDN are finalizing the set of AI applications that will run out of the box. With the ability to personalize and learn on device, the box can be customized per application and per user without need of cloud support, enhancing privacy and security.

From an IP perspective, the 2nd generation of the Akida IP adds some big differentiators including a mechanism that can radically improve the performance and efficiency of processing multi-dimensional streaming data (video, audio, sensor) by orders of magnitude without compromising on accuracy. It also accelerates the most common use-case in AI – vision – in hardware much more effectively.

E.E: What are the ranges of products/services?

N.N:
BrainChip offers a range of products and services centered around its Akida neural processor technology. This includes:

Akida IP: BrainChip’s core offering – the neuromorphic computing technology that powers edge AI applications. It has substantial benefits for multi-dimensional streaming data, accelerating structured state-space models and vision.

MetaTF: A machine learning toolchain that integrates with TensorFlow and PyTorch, designed to ease the transition to neuromorphic computing for developers working in the convolutional neural network space (a rough sketch of this flow follows the list below).

AKD1000, AKD1500 Ref SoCs: Reference systems-on-chip that showcase the capabilities of the Akida technology and enable prototyping and small-volume production.

Akida Enablement Platforms/Dev Kits: Tools and platforms designed to support the development, training, and testing of neural networks on the Akida event domain neural processor.
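
For readers curious what that MetaTF flow looks like in practice, here is a minimal sketch based on BrainChip’s published cnn2snn examples – the exact function names and arguments may differ between MetaTF releases:

# Minimal sketch of the MetaTF flow: train a Keras CNN, quantize it to the
# low bit-widths Akida executes natively, convert it to an Akida model, and
# run inference. Based on published cnn2snn examples; APIs vary by release.
import numpy as np
from tensorflow import keras
from cnn2snn import quantize, convert

# 1. Start from an ordinary Keras CNN trained in TensorFlow.
model = keras.Sequential([
    keras.layers.Conv2D(16, 3, activation="relu", input_shape=(32, 32, 3)),
    keras.layers.Flatten(),
    keras.layers.Dense(10),
])

# 2. Quantize weights and activations (here to 4 bits each).
model_quantized = quantize(model, weight_quantization=4, activ_quantization=4)

# 3. Convert the quantized network into an Akida (event-domain) model that
#    runs on the NPU, or in the software simulator when no hardware is present.
model_akida = convert(model_quantized)

# 4. Inference on uint8 images, as on-device sensor data would arrive.
images = np.random.randint(0, 255, (1, 32, 32, 3), dtype=np.uint8)
predictions = model_akida.predict(images)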

IMG_2231_text-768x1024.jpg


E.E: What is the state of the market where you are currently active?

N.N:
We see three different groups of customers in the edge AI industry. The early adopters have already integrated AI acceleration into their edge application and are seeing the benefits of improved efficiency and the ability to run more complex models and use cases.

The second group are currently running AI models on the edge, but they are doing it without dedicated hardware acceleration. They are running on their MCU/MPU. It works but is not as efficient as it could be.

The last group we’re seeing have not yet integrated AI into their edge application. They are trying to understand the use cases, the unique value proposition that AI can unlock for them, and how to manage their data and train models.

We are actively engaged with customers at all three stages and understand the unique challenges and opportunities at each stage.

E.E: What can you tell us about market trends?

N.N:
As was evidenced at CES 2024, we’re seeing growth of AI everywhere. For this to be scalable and successful, the growth is happening not just in the data center, but increasingly at the Edge Network and growing to most IoT end devices. We’re at a point where the growth in energy-efficient compute capacity can now run complex use cases like object detection and segmentation on the Edge – not just at the Network Edge, but given technologies like BrainChip’s Akida, even on portable, fanless end-point devices.

By doing more compute at the end point, you can substantially reduce bandwidth congestion to the cloud, improve real-time response and, most importantly, improve privacy by minimizing or eliminating the transmission of sensitive data to the cloud.

Models are becoming larger and more capable, but storage and compute capacity at the Edge are constrained, so we see the need for efficient performance, massive compression and innovative solutions taking precedence in hardware and software.


Generative AI has a great deal of momentum and it will only become monetizable if there is more done on the Edge. Even on smartphones, there are already thousands of generative AI applications.

There is a clear need to do more with less, which is fundamental to making AI economically viable. The costs include memory, compute, thermal management, bandwidth, and battery capacity, to name a few. Customers, therefore, are demanding more power-efficient, storage-efficient and cost-effective solutions. They want to unlock use cases like object detection in the wild, where, in addition to limited or no connectivity, the use case might require running on battery for months. Traditional MPU/MCU-based solutions won’t allow this. BrainChip’s neuromorphic architecture is well positioned for these ultra-low-power scenarios.

E.E: What are the most innovative products/services marketed?

N.N:
We are seeing great progress in intuitive Human Machine Interfaces (HMI), where voice-based and vision-based communication with devices is on the rise – in consumer devices, smart home, automotive, remote healthcare and more. For example, automotive driver monitoring for emotion, focus and fatigue could help save lives and reduce losses. Remote ECG and predictive vital-signs monitoring can also improve not just mortality but quality of life. AI-driven fitness training is beginning to help individuals stay healthy.

There are lots more.

E.E: What estimations do you have for the beginning of 2024?

N.N:
We expect AI to truly go mainstream in 2024, but it’s still the tip of the iceberg.

The big transition you will see is the more mainstream adoption of Edge AI – without it, pure cloud-based solutions, especially with Generative AI, would be cost-prohibitive. We therefore see a move towards Small Language Models (SLMs) that draw from Large Language Models (LLMs) to fit better into Edge devices while still providing the accuracy and response time that is expected.
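
One common way an SLM can “draw from” an LLM is knowledge distillation: the small model is trained to match the large model’s softened output distribution. A generic sketch of the idea (illustrative PyTorch, not BrainChip-specific):

# Generic knowledge-distillation loss: a small "student" language model learns
# to match a large "teacher" model's softened next-token distribution.
# Illustrative only – not a BrainChip API; real SLM pipelines differ.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence between softened teacher and student distributions,
    # scaled by T^2 so gradients keep the same magnitude as the hard loss.
    t = temperature
    soft_teacher = F.softmax(teacher_logits / t, dim=-1)
    log_student = F.log_softmax(student_logits / t, dim=-1)
    return F.kl_div(log_student, soft_teacher, reduction="batchmean") * (t * t)

# In training, this is typically mixed with ordinary cross-entropy:
# loss = ce_loss + alpha * distillation_loss(student_out, teacher_out.detach())
student_logits = torch.randn(8, 32000)   # batch of 8, vocabulary of 32k
teacher_logits = torch.randn(8, 32000)
print(distillation_loss(student_logits, teacher_logits))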

In short, the AI innovation is moving to the Edge, and in 2024, you will see this coming together clearly.
 
  • Like
  • Love
  • Fire
Reactions: 95 users

IloveLamp

Top 20
1000014703.jpg
 
  • Like
  • Fire
  • Love
Reactions: 37 users

IloveLamp

Top 20
[quotes Frangipani’s interview post above in full]
Great interview, possibly the best one. I like this bit

E.E: What estimations do you have for the beginning of 2024?

N.N: We expect AI to truly go mainstream in 2024, but it’s still the tip of the iceberg.
 
  • Like
  • Love
  • Fire
Reactions: 26 users

AARONASX

Holding onto what I've got
Great interview, possibly the best one. I like this bit

E.E: What estimations do you have for the beginning of 2024?

N.N: We expect AI to truly go mainstream in 2024, but it’s still the tip of the iceberg.

Hi ILL,

Just below that: “We therefore see a move towards Small Language Models (SLMs) that draw from Large Language Models (LLMs)…”

Is this possibly part of the reason why we're drawing on available funds with LDA?

From announcement:
1712005744656.png
 
  • Like
  • Fire
  • Thinking
Reactions: 31 users

stuart888

Regular
Lev Selector: Sharp guy, does a sweet AI Weekly Update.



1712006333723.png
 
  • Like
Reactions: 3 users

IloveLamp

Top 20
  • Like
  • Wow
  • Thinking
Reactions: 17 users

stuart888

Regular
Advanced RAG! “Just RAG it.” Thumbs up for everything neuromorphic-based, like BrainChip.

All sorts of cutting-edge technology is explained here, like small-to-big retrieval.
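
For the curious, the idea behind small-to-big retrieval is simple: match the query against small chunks for precision, then hand the LLM each chunk’s larger parent passage for context. A toy sketch (all names illustrative, not from any particular RAG library):

# Toy sketch of "small-to-big" retrieval: score small child chunks against the
# query, but return their larger parent passages as the LLM's context.

def similarity(a, b):
    # Crude lexical-overlap stand-in for a real embedding similarity.
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / (len(ta | tb) or 1)

def build_index(documents, big_size=1000, small_size=200):
    # Index small chunks, each remembering the larger parent it came from.
    index = []
    for doc in documents:
        for i in range(0, len(doc), big_size):
            parent = doc[i:i + big_size]
            for j in range(0, len(parent), small_size):
                index.append((parent[j:j + small_size], parent))
    return index

def retrieve(index, query, top_k=3):
    # Rank the small chunks, then return their (deduplicated) parents.
    ranked = sorted(index, key=lambda pair: -similarity(pair[0], query))
    parents = []
    for small, parent in ranked:
        if parent not in parents:
            parents.append(parent)
        if len(parents) == top_k:
            break
    return parents  # these larger passages become the LLM's context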



1712008701269.png
 
  • Like
  • Wow
  • Thinking
Reactions: 5 users

stuart888

Regular
Lots of discussion on quantizing to 1/2/4 bits lately. The video starts right at the quantization focus.

Via Lev Selector. He covers so much, all quickly, with lots of references to more detail on every subject. Just look at what he covers in 30 minutes.
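
As a rough illustration of what low-bit quantization means, here is a minimal symmetric uniform-quantization sketch (illustrative only; production schemes add per-channel scales, calibration, clipping and more):

# Quantize a float weight tensor to n bits on a symmetric integer grid.
import numpy as np

def quantize_weights(w, bits):
    levels = 2 ** (bits - 1) - 1        # e.g. 7 levels each side for 4-bit
    scale = np.max(np.abs(w)) / levels  # map the largest weight to +/- levels
    q = np.clip(np.round(w / scale), -levels, levels).astype(np.int8)
    return q, scale                     # dequantize with q * scale

w = np.random.randn(4, 4).astype(np.float32)
for bits in (4, 2):                     # 1-bit is sign-only, handled separately
    q, s = quantize_weights(w, bits)
    print(f"{bits}-bit mean abs error: {np.mean(np.abs(w - q * s)):.4f}")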

1712009600871.png

 
  • Like
  • Thinking
Reactions: 6 users

stuart888

Regular
If you write code, watch David Ondrej. He shows you exactly how to do everything LLM – from very lengthy prompt-driven text-to-code to everything you need to make money or improve your business.

He understands the Edge Neuromorphic focus of the industry and Brainchip.


 
  • Like
Reactions: 13 users

stuart888

Regular
Great overview – bubbly and knowledgeable. Some videos are pure code writing, hands-on.

LLMs make coding in English available to all who want to move ahead of those who don’t learn and upgrade.



Brainchip is on the correct path.
 
  • Like
Reactions: 4 users

stuart888

Regular
Simon Thorpe!

I cannot understand how this video has only 4K views. I have watched it at least 3 times – N-of-M and all the spiky goodness.

The Mobile World Congress was all Edge AI inference – no cloud, AI at the edge with privacy.



1712014409975.png
 
  • Like
  • Fire
  • Love
Reactions: 32 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Like
  • Fire
  • Love
Reactions: 68 users

Diogenese

Top 20
[quotes stuart888’s Simon Thorpe post above]

Hi Stuart,

This is a superb discussion of N/M coding. While ST’s group managed to produce a software simulation, PvdM’s inspiration was to realize that it could be implemented in silicon.
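
For anyone new to the idea, here is a toy illustration of N-of-M (rank-order) coding in the spirit of Thorpe’s work – stimulus intensity is conveyed purely by which N of M channels fire first, not by firing rates (an illustrative sketch, not BrainChip code):

# Toy N-of-M (rank-order) code: encode analog intensities as the identities
# of the first N channels to spike (stronger input -> earlier spike).
import numpy as np

def encode_n_of_m(intensities, n):
    # Return the indices of the N earliest-firing channels out of M.
    latencies = 1.0 / (intensities + 1e-9)  # stronger input fires sooner
    return np.argsort(latencies)[:n]        # the code is just N identities

def decode_n_of_m(active, m, n):
    # Crude reconstruction: rank the N winners, zero everything else.
    est = np.zeros(m)
    est[active] = np.linspace(1.0, 1.0 / n, n)  # earlier spike -> larger value
    return est

stimulus = np.random.rand(16)            # M = 16 input channels
code = encode_n_of_m(stimulus, n=4)      # only 4 spike identities transmitted
print("channels firing first:", code)
print("reconstruction:", decode_n_of_m(code, m=16, n=4).round(2))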

It's amazing to think that the basic research which led to N/M was done in the 1920s. I guess the only detector sensitive and responsive enough for those measurements would have been the cathode ray tube. (Prof Google tells me that the cathode ray tube was invented by Karl Ferdinand Braun in 1897.)

The video is from 2021.

Do you have a direct link?
 
  • Like
  • Love
  • Fire
Reactions: 18 users

Diogenese

Top 20
  • Haha
Reactions: 2 users

Esq.111

Fascinatingly Intuitive.
Afternoon Chippers,

Just stumbled across this company; they are to be listed shortly, apparently.

Started by boffins at Princeton University.



Henk-Jan Boele
CEO



BlinkLab Limited (ASX:BB1)​



About​

BlinkLab Limited (ASX:BB1) stands poised to change the landscape of diagnosing and treating neurodevelopmental conditions like ASD and ADHD in children.
BlinkLab's innovative approach leverages the power of smartphones, AI and Machine Learning to deliver screening tests specifically designed for children as young as 18 months old. This marks a significant advancement, considering traditional diagnoses typically occur around five years of age, often missing the crucial early window for effective intervention.
BlinkLab, a company started by neuroscientists at Princeton University, has over the past several years fully developed a smartphone-based diagnostic platform and is seeking to finalise an FDA Class II medical device registration study in partnership with leading US university hospitals. This approval would pave the way for broad use of the Company's technology in diagnosing and treating conditions like ASD and ADHD.
Why invest in BlinkLab?
Early Detection:
The Company's technology detects potential neurological conditions in children as young as 18 months, significantly earlier than traditional methods (around five years old). This critical window allows for timely intervention, maximising treatment effectiveness and improving long-term outcomes.
Proven Technology: Extensive multi-national clinical trials have already validated the app's efficacy and safety.
Addressing a Critical Need: ASD and ADHD diagnosis and treatment represent a significant cost burden on healthcare systems globally. Early detection through BlinkLab has the potential to improve efficiency, reduce costs, and ultimately benefit countless families.
Strong Partnerships: Future and ongoing research collaborations with prestigious university hospitals underscore the scientific merit and clinical potential of the technology.

BlinkLab is led by an experienced management team and directors with a proven track record in building companies and deep knowledge of digital healthcare, computer vision, AI and Machine Learning. The Company's Scientific Advisory Board consists of leading experts in the fields of ASD and brain development, allowing it to bridge the most advanced technological innovations with groundbreaking scientific research.
Together, this powerful combination of leadership and scientific expertise enables BlinkLab to:
  • develop clinically relevant solutions based on real-world needs;
  • navigate the regulatory landscape efficiently and effectively; and
  • deliver meaningful results for children, families, and the healthcare system

BlinkLab is a world-first, AI-driven digital healthcare venture that uses a smartphone and facial recognition to detect autism and ADHD.

It’s listing on the ASX next Thursday (April 4) under the code BB1.

BlinkLab’s development is being led by chairman Brian Leedman, a founder of ResApp Health – the cough-analysis app sold to Pfizer two years ago in a $179 million deal.

Now Brian’s confident that BlinkLab – first developed at Princeton University – could mirror that success, while also fast-tracking detection of Autism and ADHD and improving patient outcomes.

“We’re the only company in the world using computer vision and machine learning to assess brain responses to audio and visual stimulation – delivered via a smartphone – to detect autism and ADHD,” Mr Leedman said.

“It’s easy for a doctor or specialist to use as a diagnostic tool. We can diagnose children from 18 months old, which is years younger than through traditional methods that typically result in children being diagnosed from the age of five,” he said.

“By the time a child reaches five years old, neuronal development in the brain has significantly advanced as have the behavioural patterns that led to the concerns of the parents. If the child is autistic, then they were autistic from birth.

“As in all medicines, the earlier you can diagnose anyone with anything, the better the outcomes for the patient.”

There are lengthy diagnosis delays both here in Australia and abroad, so the demand for such a tool is extensive.

The global Autism Spectrum Disorder market is expected to reach $700 billion by 2028​

BlinkLab uses AI and machine learning algorithms to predict autism and ADHD, and recent clinical trials demonstrated sensitivity of 85% and specificity of 84%, suggesting much higher accuracy compared with existing FDA-approved products.

“Based on clinical trial results to date, we have a very accurate test for autism and we now have to replicate that in a US FDA (Food and Drug Administration) medical device registration study,” Brian Leedman said.

“The NDIS in Australia has come under scrutiny for the tremendous cost to the Australian taxpayer over the fact that ASD (Autism Spectrum Disorder) diagnosis and treatment in children is the single largest expenditure – so the timing for this technology development and ASX listing is spot on.”

Industry data suggest more than a third of those claiming through the NDIS report a primary condition of autism. The NDIS paid out $6.73 billion to support those Australians last year alone. What’s also concerning is that this figure was 28 per cent higher than for the previous year, 2022. Autism rates in Australia are among the highest in the world, affecting 1 in 25 children.

BlinkLab’s FDA trial will recruit up to 500 people and, to fund it, BB1 is raising $7 million through the IPO process at 20 cents a share. It plans to complete the study by the middle of next year, with the aim of gaining FDA approval by 2026. The neurometric testing technology is also being developed and studied for its efficacy in screening for schizophrenia and forms of dementia.

Backed by Princeton University

BlinkLab has an exclusive global commercial licence agreement with Princeton University, where founders Dr Henk-Jan Boele, Peter Boele and Bas Koekkoek first created early iterations of the idea. It has come a long way since then, with $4.4 million spent on Software as a Medical Device development. There have been 6,000 individual diagnostic tests already carried out throughout the world, including eight studies as far afield as Morocco and Ecuador.

The company has forged numerous partnerships with credible institutions including Princeton University, Penn Medicine, Erasmus MC, Baylor College of Medicine, and many others.

No safety risk, BlinkLab just has to prove efficacy

Mr Leedman said another advantage in developing medtech such as BlinkLab was that there could be no safety risks or delays.

“The worst that can happen is the phone is dropped on your foot – literally,” he said.

“That means we don’t face the long and slow process of testing for safety, we just have to prove efficacy.”

Aside from his success with ResApp Health, it’s worth mentioning that Mr Leedman has also chaired Neurotech International (ASX:NTI), which is developing the very-low-THC oral cannabis drug candidate NTI164 for children with autism. During his tenure, the share price rose over 1,000%.

“This is a natural progression for me,” Brian Leedman said.

“Everything I’ve done is highly relevant to getting BlinkLab to this point and strengthens my belief that we have something truly special here.”


Regards ,
Esq.
 
  • Like
  • Fire
  • Wow
Reactions: 22 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Haha
  • Like
  • Wow
Reactions: 14 users

Damo4

Regular
  • Like
  • Fire
Reactions: 14 users
View attachment 60135




View attachment 60134


Now that it is written here in black and white for all to clearly see, Sony is looking at developing SNN, offering positions to build neuromorphic products. This tells me they have been involved for a good long period of time, evaluating this technology before concluding that SNN is of value to them in the market.
BrainChip, as far as we all know, is the only company that offers a complete SNN series of products for testing and integration to potential customers. This tells me that Sony may very well be one of the companies still under NDA with BRN.
Go BrainChip!
 
  • Like
  • Fire
  • Love
Reactions: 27 users

stuart888

Regular
MedLM. I kind of figured Google would win in other industries, too.

They make my life a lot better, at no cost. I am a big fan.

 
  • Fire
  • Like
Reactions: 4 users
Top Bottom