BRN Discussion Ongoing

Townyj

Ermahgerd
Some big buys going through now - Insto Analysts said BRN is a BUY! Only blue sky from here. DYOR

Over 300,000 shares wanted in this one order.

1:19:59 PM 0.560 366,481 $205,229.36

@Rocket577 was that you finally..?? Sheesh settle it down.
 
Yes @MadMayHam, and I thought it was very interesting that SiFive specify that they want their X280 Intelligence Series to be tightly integrated with either Akida-S or Akida-P neural processors.






Yes the logic of this did not escape me either:

“Now tell me, Mr. Five, might I call you Si, how can you say you have a preference to tightly integrate AKIDA S and/or P with X280 when AKIDA E is cheaper and more power efficient?

Yes sir, you may call me Si. That is a really good question, Freddie.
Well, the answer is very simple: we don’t just make these statements on a whim.
We have been officially partnered with Brainchip for some time, but prior to that our engineers did extensive testing of Brainchip’s AKIDA technology IP family, firstly to determine whether X280 and AKIDA were compatible and had something in combination to offer our customers.
Our engineers having answered yes, we then asked our marketing people what our customers were seeking that was lacking from our present product line, not just the Intelligence Series, and they came back with quite a long list actually.
We then went back to our engineers with this list; they applied themselves to the task and, after extensive further testing and interaction with Brainchip, advised that AKIDA S and AKIDA P were the perfect fit for X280.

Thanks Si, it sounds like you really do your technology due diligence before jumping into bed with your technology partners.”

Or something like this. But maybe Si and his mates are from the shoot-from-the-hip school of management and make technology adoption decisions based on a magazine article and a coin toss.

My opinion only DYOR
FF

AKIDA BALLISTA
 

Steve10

Regular
Job ads for Sales Executives in the USA & Europe for Edge Impulse, targeting the health monitoring market.

Proven experience with identifying and onboarding enterprise customers in the health monitoring arena.


 
D

Deleted member 118

Guest

White Horse

Regular
https://www.linkedin.com/posts/leddartech_perception-adas-autonomousdriving-ugcPost-7038509033632198656-JMqm?utm_source=share&utm_medium=member_desktop


Rob Telson likes this

LeddarTech - Automotive Software: Low-Level Sensor Fusion & Perception
😀 👍Are you looking to enhance your knowledge of ADAS and Perception systems? If so, then understanding the concepts of Ego-Lane and CIPV is crucial.

Identifying these vehicles, especially when distant, is critical for implementing safety ADAS features such as automatic emergency braking and forward collision warning.

To gain a better insight into the concepts of Ego-Lane and CIPV, check out our informative video that explains how these concepts look from a perception system's point-of-view. It is an excellent opportunity to learn more about these crucial aspects of ADAS and Autonomous Driving.
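The CIPV (Closest In-Path Vehicle) idea is straightforward to sketch in code. Below is a toy illustration (my own, not LeddarTech's pipeline); the detection format and the 1.8 m lane half-width are assumptions:

```python
# Toy CIPV selection: among detected vehicles, keep those whose lateral
# offset falls inside the ego lane, then take the nearest one ahead.
# Detections are (longitudinal_m, lateral_m) in ego-vehicle coordinates.
LANE_HALF_WIDTH_M = 1.8  # assumed lane half-width

def select_cipv(detections):
    """Return the closest in-path vehicle, or None if the ego lane is clear."""
    in_path = [d for d in detections
               if d[0] > 0 and abs(d[1]) <= LANE_HALF_WIDTH_M]
    return min(in_path, key=lambda d: d[0], default=None)

vehicles = [(42.0, -3.5), (18.0, 0.4), (55.0, 0.9), (9.0, 5.2)]
print(select_cipv(vehicles))  # (18.0, 0.4): nearest vehicle inside the ego lane
```

The vehicles at 42 m and 9 m are outside the lane bounds, so the 18 m vehicle wins over the 55 m one.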

If you're interested in exploring more about the world of Perception and ADAS, check out this interactive tool - https://hubs.li/Q01F72js0

#perception #ADAS #autonomousdriving





 

Steve10

Regular
I think this product will benefit from Akida.

Avoid obstacles.
GPS navigation.
All-in-one.​

biped is the future of mobility for blind and visually impaired people. A smart harness, worn on shoulders, that uses self-driving technology from Honda Research Institute to guide people with short sounds. Ideal complement to a white cane or to a guide dog.

270 million visually impaired people worldwide walk with a risk of hitting obstacles, such as electric scooters, bicycles, or tree branches.

White canes alone cannot detect all obstacles, but replacing them is not the way to go. What if you could walk with your mind free, knowing where potential obstacles are? And get GPS instructions too?

It uses wide-angle cameras and AI to generate short sounds to warn you about the position of important obstacles, such as branches, holes, vehicles or pedestrians. It also provides GPS instructions (coming soon). Sounds are played in Bluetooth headphones.

biped is a harness worn on the shoulders, equipped with ultra-wide angle cameras on the left of your chest, a battery behind your neck, and a small computer on the right of your chest. It works just like a self-driving car, for pedestrians. We partnered with Honda Research Institute to bring the best of car research, to biped.

 

MDhere

Regular
Is the AGM on the 23rd of May being held at the boardroom again like last year, or somewhere else? I'd like to know before I book my flights and accommodation.
 

buena suerte :-)

BOB Bank of Brainchip
Certainly looking a lot healthier 🙏

719 buyers for 12,393,513 units

377 sellers for 5,624,129 units
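For what it's worth, the implied ratios from those depth figures (my own arithmetic, not an official statistic):

```python
# Market-depth snapshot quoted above: units wanted on each side.
buy_units = 12_393_513   # across 719 buy orders
sell_units = 5_624_129   # across 377 sell orders

ratio = buy_units / sell_units          # bid depth vs offer depth
avg_buy = buy_units / 719               # average units per buy order
avg_sell = sell_units / 377             # average units per sell order

print(f"buy/sell depth ratio: {ratio:.2f}x")
print(f"avg buy order:  {avg_buy:,.0f} units")
print(f"avg sell order: {avg_sell:,.0f} units")
```

Roughly 2.2x more units bid than offered at that moment, for whatever a depth snapshot is worth.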
 

White Horse

Regular
Just in case some missed the second article from Forbes yesterday, here it is.

BrainChip Readies 2nd Gen Platform For Power-Efficient Edge AI​


Karl Freund
Contributor
Founder and Principal Analyst, Cambrian-AI Research LLC

Mar 6, 2023

The company’s event-based digital Neuromorphic IP can add efficient AI processing to SoCs.

Edge AI is becoming a thing. Instead of using just an embedded microprocessor in edge applications and sending the data to a cloud for AI processing, many edge companies are considering adding AI at the edge itself, and then communicating conclusions about what the edge processor is “seeing” instead of sending the raw sensory data such as an image. To date, this dynamic has been held back by the cost and power requirements of initial implementations. What customers are looking for is proven AI tech that can run under a watt, and that they can add to a microcontroller for on-board processing.
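The bandwidth argument in that paragraph is easy to quantify. A back-of-envelope sketch, with the frame size, message size, and event rate all assumed for illustration:

```python
# Back-of-envelope: streaming raw camera frames to the cloud vs sending
# only inference results. All sizes here are illustrative assumptions.
frame_bytes = 1280 * 720 * 3          # one raw RGB frame (~2.8 MB)
fps = 30
raw_per_hour = frame_bytes * fps * 3600

event_bytes = 64                      # e.g. a small '{"label": ..., "conf": ...}' message
events_per_hour = 120                 # report only detections, not every frame
edge_per_hour = event_bytes * events_per_hour

print(f"raw stream:  {raw_per_hour / 1e9:.0f} GB/hour")
print(f"edge events: {edge_per_hour / 1e3:.2f} kB/hour")
print(f"reduction:   ~{raw_per_hour // edge_per_hour:,}x")
```

Even with generous assumptions for the edge device, the gap is seven to eight orders of magnitude, which is why "send conclusions, not raw data" is compelling.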

Many startups have entered the field, looking to compete in an area of AI which does not have an 800-pound incumbent to displace (a.k.a., NVIDIA). Many startups have some sort of in-memory or near-memory architecture to reduce data movement, coupled with digital multiply-accumulate (MAC) logic. BrainChip is taking a different approach, applying event-based digital neuromorphic logic with SRAM that the company says is power-efficient, flexible, scalable, and enables on-chip learning. Let’s take a closer look.

The Akida Platform​


Brainchip has a lengthy list of enhancements it has engineered to the second-generation Akida platform. The motivation of these additions has been to enable processing on the modalities customers are increasingly demanding: real-time video, audio, and time-series data such as speech recognition, human action recognition, text translation, and video object detection. For some of these apps, the addition of an optional Vision Transformer (ViT) engine, along with an enhanced neural mesh can deliver up to 50 TOPS (Trillions of Operations Per Second) according to the company.

The company is selling its IP into designs for sensors and SoCs looking to add AI to the edge. While uptake has been slow for BrainChip’s first product, the AKD1000, there have been some high-profile demonstrations of its use by companies like Mercedes in the EQXX concept vehicle and by NASA on a project for autonomy and cognition in small satellites, as well as adoption of its development kits and boards by a number of companies for prototyping purposes.

Now, with the second generation, BrainChip has added support for 8-bit weights and activations, the ViT mentioned above, and hardware support for innovative Temporal Event-Based Neural Nets (TENNs). Akida maintains its ability to process multiple layers at a time, managed by its smart DMA, which handles model and data load and store autonomously. This enables low-power sensors to be attached to an Akida node without the need for a CPU. The diagram below shows how Akida sensors can be coupled with an SoC for multi-sensor inference processing.




The Akida IP can be used to create sensors and be deployed within more advanced SoCs.
BrainChip


The new Akida platform, expected to be available later this year, is designed to process a variety of popular networks, including CNNs, DNNs, Vision Transformers, and SNNs. The event-based design is particularly good at time series data for problems such as audio processing, video object detection, and vital sign monitoring and prediction.

BrainChip has shown some initial benchmarks that demonstrate orders of magnitude fewer operations and smaller model sizes, which can benefit edge AI implementations. In video object detection, a 16nm implementation can handle 30 FPS at 1382x512 resolution in under 75 mW. Keyword detection in 28nm can support over 125 inferences/sec at less than 2 microjoules per inference. BrainChip has applied for patents in TENN model acceleration.
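As a sanity check, those benchmark ceilings imply the following power figures (my arithmetic from the numbers quoted above, not additional BrainChip data):

```python
# Implied power budgets from the quoted benchmark ceilings.

# Keyword detection: >125 inferences/s at <2 microjoules each.
inferences_per_s = 125
energy_per_inference_j = 2e-6
kws_power_w = inferences_per_s * energy_per_inference_j
print(f"keyword spotting: ~{kws_power_w * 1e6:.0f} uW sustained")

# Video detection: 30 FPS within a 75 mW budget.
video_power_w = 0.075
fps = 30
energy_per_frame_j = video_power_w / fps
print(f"video detection: ~{energy_per_frame_j * 1e3:.1f} mJ per frame")
```

Both results land comfortably in the sub-watt regime the article says customers are asking for: roughly 250 µW sustained for keyword spotting and about 2.5 mJ per video frame.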


The Akida processor and software can implement multi-pass processing as well as on-chip learning.
BrainChip
Akida’s runtime software manages operation efficiently, including key features like multi-pass processing, which are handled transparently to the user. Model development and tuning are supported on the TensorFlow framework with MetaTF.

The Akida architecture and associated benefits.
BrainChip
BrainChip envisions three ranges of adoption: a basic MCU with 1-4 nodes for always-on, CPU-less operation; an MCU-plus-deep-learning-accelerator class; and a high-end MCU with up to 64 nodes and an optional ViT processor. In all cases, on-chip learning is possible by leveraging the trained model as a feature extractor and adding new classes in the final layer, while untethered from cloud training resources.
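The "frozen feature extractor plus new final-layer classes" idea can be illustrated with a toy nearest-centroid classifier. This is a generic sketch of the concept, not BrainChip's actual on-chip learning algorithm; the feature function is a stand-in:

```python
# Toy "add a class on-device": a frozen feature extractor produces
# embeddings; learning a new class just stores an averaged embedding,
# and classification is nearest-centroid. No cloud retraining needed.
def extract_features(sample):
    # Stand-in for the frozen trained model: any fixed mapping works here.
    return [sum(sample), max(sample) - min(sample)]

class OneShotHead:
    def __init__(self):
        self.centroids = {}  # label -> averaged feature vector

    def learn(self, label, samples):
        feats = [extract_features(s) for s in samples]
        n = len(feats)
        self.centroids[label] = [sum(col) / n for col in zip(*feats)]

    def classify(self, sample):
        f = extract_features(sample)
        return min(self.centroids,
                   key=lambda lbl: sum((a - b) ** 2
                                       for a, b in zip(f, self.centroids[lbl])))

head = OneShotHead()
head.learn("flat",  [[1, 1, 1], [2, 2, 2]])
head.learn("spiky", [[0, 9, 0], [1, 8, 0]])
print(head.classify([1, 7, 1]))  # "spiky"
```

The point is that only the tiny final-layer state (the centroids) changes on the device; the heavy feature extractor stays fixed, which is what makes untethered learning cheap.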


Akida can provide power efficient AI alternatives across a broad range of solutions that currently implement MCUs and GPUs.
BrainChip

Conclusions​


While BrainChip has been vocal in the past about the partnerships it has forged, such as with MegaChips and Renesas, commercial growth has been slower, possibly a function of the IP model taking longer to ramp. With the inclusion of 8-bit operations, the Vision Transformer, temporal convolutions (TENN models), and the ability to run models like ResNet-50 completely on the Akida Neural processor with minimal CPU dependencies, we believe the company is laying the foundation to turn the corner and land some bigger design wins. A key factor may be the software, which is currently TensorFlow based but will soon support PyTorch as well, an essential addition given the current development landscape.
 
Job ads for Sales Executives in the USA & Europe for Edge Impulse, targeting the health monitoring market.

Proven experience with identifying and onboarding enterprise customers in the health monitoring arena.


So cool that our partners feel the need to increase their workforce to target or maximise the potential from our new enhanced Akida range.

Bit of 'the chicken and the egg' scenario here:



Was Edge Impulse involved in the feedback loop with their customers, having them already lined up for the change?
Or did this second-generation enhancement trigger the mass market requiring the new sales FTEs?

Either way, one of OUR ecosystem partners is preparing to manage the increase in demand, which will help Sean execute his last statement from the text above:
"we are focused on executing more IP licence agreements and generating revenue growth over the coming years."

Our team has support to assist our ubiquitous ambitions.

I am a patient person, but I'd be so pleased to see ONE license land before May 23rd.
Will this last BRN announcement be the final key to unlocking the licensing charge? Was it the final proof that Akida can evolve and extend its reach into the future?
 

White Horse

Regular
From the fishmonger and baker of fine bread.

BrainChip Introduces Second-Generation Akida Platform



Peter Van Der Made

Chief Technology Officer at BrainChip


March 8, 2023

Introduces Vision Transformers and Spatial-Temporal Convolution for radically fast, hyper-efficient, and secure Edge AIoT products, untethered from the cloud

Laguna Hills, Calif. – March 6, 2023 – BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of ultra-low power, fully digital, neuromorphic AI IP, today announced the second generation of its Akida™ platform that drives extremely efficient and intelligent edge devices for the Artificial Intelligence of Things (AIoT) solutions and services market that is expected to be $1T+ by 2030. This hyper-efficient yet powerful neural processing system, architected for embedded Edge AI applications, now adds efficient 8-bit processing to go with advanced capabilities such as time domain convolutions and vision transformer acceleration, for an unprecedented level of performance in sub-watt devices, taking them from perception towards cognition.

The second generation of Akida now includes Temporal Event Based Neural Net (TENN) spatial-temporal convolutions that supercharge the processing of raw time-continuous streaming data, such as video analytics, target tracking, audio classification, analysis of MRI and CT scans for vital signs prediction, and time series analytics used in forecasting and predictive maintenance. These capabilities are critically needed in industrial, automotive, digital health, smart home and smart city applications. The TENNs allow for radically simpler implementations by consuming raw data directly from sensors, drastically reducing model size and the operations performed while maintaining very high accuracy. This can shrink design cycles and dramatically lower the cost of development.
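For readers unfamiliar with time-domain convolutions, here is a generic causal 1-D convolution in plain Python. It illustrates the class of operation TENNs work with (each output depends only on current and past samples, so it can run on a live stream); it is not BrainChip's implementation:

```python
# A causal 1-D convolution: each output depends only on current and past
# samples, so it can run sample-by-sample on a live stream. Here a toy
# smoothing kernel is applied to a raw sensor trace containing one pulse.
def causal_conv1d(signal, kernel):
    k_len = len(kernel)
    out = []
    for t in range(len(signal)):
        acc = 0.0
        for i, w in enumerate(kernel):
            if t - i >= 0:                 # never peek into the future
                acc += w * signal[t - i]
        out.append(acc)
    return out

raw = [0, 0, 1, 4, 9, 4, 1, 0, 0]          # e.g. one pulse in a sensor stream
smoothed = causal_conv1d(raw, [0.5, 0.3, 0.2])
print([round(v, 2) for v in smoothed])
```

Because causality only looks backwards in time, no buffering of future data is needed, which is what makes this style of model a fit for raw streaming sensor input.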

Another addition to the second generation of Akida is Vision Transformers (ViT) acceleration, a leading edge neural network that has been shown to perform extremely well on various computer vision tasks, such as image classification, object detection, and semantic segmentation. This powerful acceleration, combined with Akida’s ability to process multiple layers simultaneously and hardware support for skip connections, allows it to self-manage the execution of complex networks like RESNET-50 completely in the neural processor without CPU intervention and minimizes system load.

The Akida IP platform has a unique ability to learn on the device for continuous improvement and data-less customization that improves security and privacy. This, combined with the efficiency and performance available, enables very differentiated solutions that until now have not been possible. These include secure, small form factor devices like hearables and wearables that take raw audio input, and medical devices for monitoring heart and respiratory rates and other vitals that consume only microwatts of power. This can scale up to HD-resolution vision solutions delivered through high-value, battery-operated or fanless devices, enabling a wide variety of applications from surveillance systems to factory management and augmented reality to scale effectively.

“We see an increasing demand for real-time, on-device, intelligence in AI applications powered by our MCUs and the need to make sensors smarter for industrial and IoT devices,” said Roger Wendelken, Senior Vice President in Renesas’ IoT and Infrastructure Business Unit. “We licensed Akida neural processors because of their unique neuromorphic approach to bring hyper-efficient acceleration for today’s mainstream AI models at the edge. With the addition of advanced temporal convolution and vision transformers, we can see how low-power MCUs can revolutionize vision, perception, and predictive applications in wide variety of markets like industrial and consumer IoT and personalized healthcare, just to name a few.”

“Advancements in AI require parallel advancements in on-device learning capabilities while simultaneously overcoming the challenges of efficiency, scalability, and latency,” said Richard Wawrzyniak, principal analyst at Semico Research. “BrainChip has demonstrated the ability to create a truly intelligent edge with Akida and moves the needle even more in terms of how Edge AI solutions are developed and deployed. The benefits of on-chip AI from a performance and cost perspective are hard to deny.”

“Our customers wanted us to enable expanded predictive intelligence, target tracking, object detection, scene segmentation, and advanced vision capabilities. This new generation of Akida allows designers and developers to do things that were not possible before in a low-power edge device,” said Sean Hehir, BrainChip CEO. “By inferring and learning from raw sensor data, removing the need for digital signal pre-processing, we take a substantial step toward providing a cloudless Edge AI experience.”

Akida’s software and tooling further simplifies the development and deployment of solutions and services with these features:


  • An efficient runtime engine that autonomously manages model accelerations completely transparent to the developer
  • MetaTF™ software that developers can use with their preferred framework, like TensorFlow/Keras, or development platform, like Edge Impulse, to easily develop, tune, and deploy AI solutions.
  • Supports all types of Convolutional Neural Networks (CNN), Deep Learning Networks (DNN), Vision Transformer Networks (ViT) as well as Spiking Neural Networks (SNNs), future-proofing designs as the models get more advanced.



Akida comes with a Models Zoo and a burgeoning ecosystem of software, tools, and model vendors, as well as IP, SoC, foundry and system integrator partners. BrainChip is engaged with early adopters on the second generation IP platform. General availability will follow in Q3 2023.

See what they’re saying:​


“At Prophesee, we are driven by the pursuit of groundbreaking innovation addressing event-based vision solutions. Combining our highly efficient neuromorphic-enabled Metavision sensing approach with Brainchip’s Akida neuromorphic processor holds great potential for developers of high-performance, low-power Edge AI applications. We value our partnership with BrainChip and look forward to getting started with their 2nd generation Akida platform, supporting vision transformers and TENNs,” said Luca Verre, Co-Founder and CEO at Prophesee.

Luca Verre, Co-Founder and CEO, Prophesee

“BrainChip and its unique digital neuromorphic IP have been part of IFS’ Accelerator IP Alliance ecosystem since 2022,” said Suk Lee, Vice President of Design Ecosystem Development at IFS. “We are keen to see how the capabilities in Akida’s latest generation offerings enable more compelling AI use cases at the edge.”

Suk Lee, VP Design Ecosystem Development, Intel Foundry Services

“Edge Impulse is thrilled to collaborate with BrainChip and harness their groundbreaking neuromorphic technology. Akida’s 2nd generation platform adds TENNs and Vision Transformers to a strong neuromorphic foundation. That’s going to accelerate the demand for intelligent solutions. Our growing partnership is a testament to the immense potential of combining Edge Impulse’s advanced machine learning capabilities with BrainChip’s innovative approach to computing. Together, we’re forging a path toward a more intelligent and efficient future,” said Zach Shelby, Co-Founder and CEO at Edge Impulse.

Zach Shelby, Co-Founder and CEO, Edge Impulse

“BrainChip has some exciting upcoming news and developments underway,” said Daniel Mandell, Director at VDC Research. “Their 2nd generation Akida platform provides direct support for the intelligence chip market, which is exploding. IoT market opportunities are driving rapid change in our global technology ecosystem, and BrainChip will help us get there.”

Daniel Mandell, Director, VDC Research

“Integration of AI Accelerators, such as BrainChip’s Akida technology, has application for high-performance RF, including spectrum monitoring, low-latency links, distributed networking, AESA radar, and 5G base stations,” said John Shanton, CEO of Ipsolon Research, a leader in small form factor, low power SDR technology.

John Shanton, CEO, Ipsolon Research

“Through our collaboration with BrainChip, we are enabling the combination of SiFive’s RISC-V processor IP portfolio and BrainChip’s 2nd generation Akida neuromorphic IP to provide a power-efficient, high capability solution for AI processing on the Edge,” said Phil Dworsky, Global Head of Strategic Alliances at SiFive. “Deeply embedded applications can benefit from the combination of compact SiFive Essential™ processors with BrainChip’s Akida-E efficient processors; more complex applications including object detection, robotics, and more can take advantage of SiFive X280 Intelligence™ AI Dataflow Processors tightly integrated with BrainChip’s Akida-S or Akida-P neural processors.”

Phil Dworsky, Global Head of Strategic Alliances, SiFive

“Ai Labs is excited about the introduction of BrainChip’s 2nd generation Akida neuromorphic IP, which will support vision transformers and TENNs. This will enable high-end vision and multi-sensory capability devices to scale rapidly. Together, Ai Labs and BrainChip will support our customers’ needs to address complex problems, improving development and deployment for industries such as manufacturing, oil and gas, power generation, and water treatment, preventing costly failures and reducing machine downtime,” said Bhasker Rao, Founder of Ai Labs.

Bhasker Rao, Founder, Ai Labs


“We see a growing number of predictive industrial (including HVAC, motor control), automotive (including fleet maintenance), building automation, remote digital health equipment and other AIoT applications using complex models with minimal impact to product BOM and needing faster real-time performance at the Edge,” said Nalin Balan, Head of Business Development at Reality AI, a Renesas company. “BrainChip’s ability to efficiently handle streaming high frequency signal data, vision, and other advanced models at the edge can radically improve scale and timely delivery of intelligent services.”

Nalin Balan, Head of Business Development, Reality.ai, a Renesas Company


“BrainChip’s cutting-edge neuromorphic technology is paving the way for the future of artificial intelligence, and Drexel University recognizes its immense potential to revolutionize numerous industries. We have experienced that neuromorphic compute is easy to use and addresses real-world applications today. We are proud to partner with BrainChip in advancing their groundbreaking technology, including TENNs and how they handle time series data, which is the basis for addressing a lot of complex problems and unlocking its full potential for the betterment of society,” said Anup Das, Associate Professor, and Nagarajan Kandasamy, Interim Department Head of Electrical and Computer Engineering, Drexel University.

Anup Das, Associate Professor, Drexel University




About BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY)

BrainChip is the worldwide leader in edge AI on-chip processing and learning. The company’s first-to-market, fully digital, event-based AI processor, Akida™, uses neuromorphic principles to mimic the human brain, analyzing only essential sensor inputs at the point of acquisition, processing data with unparalleled efficiency, precision, and economy of energy. Akida uniquely enables edge learning local to the chip, independent of the cloud, dramatically reducing latency while improving privacy and data security. Akida Neural processor IP, which can be integrated into SoCs on any process technology, has shown substantial benefits on today’s workloads and networks, and offers a platform for developers to create, tune and run their models using standard AI workflows like TensorFlow/Keras. In enabling effective edge compute to be universally deployable across real world applications such as connected cars, consumer electronics, and industrial IoT, BrainChip is proving that on-chip AI, close to the sensor, is the future, for its customers’ products, as well as the planet. Explore the benefits of Essential AI at www.brainchip.com.

Follow BrainChip on Twitter: https://www.twitter.com/BrainChip_inc

Follow BrainChip on LinkedIn: https://www.linkedin.com/company/7792006
 

TECH

Regular
Hi Brainchip supporters :ROFLMAO:

Below is a great article, take a read. I personally like the bit about the NASDAQ....the company has already stated that it is focusing on converting all the years of engineering work, working with and listening to our EAPs, into IP licenses. I'm of the view that a lot will just happen rather quickly, a bit like falling dominoes: a gentle nudge and they'll all fall basically at once.

The NASDAQ must be getting very tempting. I'm not 100% sure, but I don't think we "qualify" just yet, though I'm happy for someone in the know to educate me.

Spoke to my friend today; she flies out of Perth tonight and has confirmed a booking with the team in Germany next week, accompanied by her tech engineer. I have asked for some photos, so here's hoping.


Regards....Tech (y)
 

VictorG

Member
Worth a listen.

 
I asked ChatGPT re NASDAQ listing.

I stopped after reading criteria 1; I'll be happy there.
Second thoughts. Oh look! The first line of criteria 4 is even better; I'll take that.

What's that saying people are on about here so often? Is it 'talk like Raquel Welch and she'll make it happen'?
;)

Oops, late edit: I read criteria 4 as a factor of 10 more than written... as you were.
 

Vladsblood

Regular
Hi Tech, we appear to share the same view of our Board working with stealthy resolve towards a full NASDAQ listing, which I think 🤔 is much closer than some think!
Wouldn't want to be caught with our pants down, "lagging" behind, doing the splits on the NASDAQ, with the whole world left wondering just why we are coming late to the NASDAQ explosion party that Brainchip is creating!
The all-new world technology leader is Brainchip!! $$$$ Vlad
 

Steve10

Regular
STMicroelectronics' ISPU processes data locally in a few microwatts.

The Onlife era of MEMS: integrating AI in sensors for decision-making in the Edge​

In the Onlife era, intelligent processing is integrated into sensors at the edge. This means connected objects can sense, process, and take action in a seamless way, without impacting our user experience, bridging the online and offline worlds.

Artificial Intelligence is distributed in the nodes, enabling more power-efficient connected devices, enhanced data privacy, and faster decision making thanks to fewer data transfers.

The technology behind the Onlife Era​

In 2022, ST brings you a new generation of MEMS sensors, featuring an embedded intelligent sensor processing unit (ISPU). Optimized to analyze motion data with signal processing and Artificial Intelligence techniques, the ISPU processes data in the Edge, without waking up the system and before transferring data to the MCU and gateway/cloud.

MEMS sensor with AI core (ISPU - intelligent sensor processing unit)​

ST brings you a new generation of MEMS sensors featuring an embedded intelligent sensor processing unit (ISPU). The ISPU is an ultra-low-power, highly computationally efficient, high-performance programmable core that can execute signal processing and AI algorithms at the edge. It can be programmed in C code and is provided with an ecosystem of libraries and third-party tools/IDEs, which also enable seamless and automated implementation of complex AI models. The ISPU is a state-of-the-art feature for any personal electronics, from wearable accessories to high-end applications, and for the industrial market, in areas such as anomaly detection, automation, asset tracking and alarms.
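The "process in the sensor, wake the host only when needed" pattern ST describes can be sketched generically. Python stands in for the ISPU's C code here, and the window statistic and threshold are hypothetical tuning choices:

```python
# Sketch of in-sensor anomaly gating: compute a cheap statistic over each
# window of motion samples and raise a wake event only when it crosses a
# threshold, so the host MCU sleeps through normal activity.
VARIANCE_WAKE_THRESHOLD = 2.0   # hypothetical tuning value

def window_variance(samples):
    mean = sum(samples) / len(samples)
    return sum((x - mean) ** 2 for x in samples) / len(samples)

def should_wake_host(window):
    return window_variance(window) > VARIANCE_WAKE_THRESHOLD

idle  = [1.0, 1.1, 0.9, 1.0, 1.0]    # sensor at rest: host stays asleep
shock = [1.0, 5.0, -3.0, 4.0, 0.5]   # sudden motion: host is woken
print(should_wake_host(idle), should_wake_host(shock))  # False True
```

The power saving comes from the host staying in deep sleep for every window that fails the test, which in practice is almost all of them.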




They are using the ISPU processor for personalized fitness training.



They design & manufacture chips in three manufacturing plants in France, two in Italy & one in Singapore.

 

TheFunkMachine

seeds have the potential to become trees.
I would be happy with another 2 IP deals signed by the end of the year. Surely that's not asking too much.
I agree, but I personally want to see some more. I would be quite happy with:
2 x IP deals
A Renesas product release
Hearing more from MegaChips
Some more patent filings/approvals
And continued signs of revenue in our quarterly reports through said IP deals and product releases.

Bonus would be:
- any other new partnerships announced
- hearing from our EAP partners, like NASA, Valeo, Ford etc., on how they are tracking

2025 and beyond, I predict, will be so crazy that every hardship and struggle will be paid back 100x.

I'm waiting patiently. Nothing can shake me from my precious shares. The day will come, and a sweet one it will be! :)
 

Baisyet

Regular
 