BRN Discussion Ongoing

TheFunkMachine

seeds have the potential to become trees.
Nviso seems excited about our partnership. Good to see:)

I also like that this guy is using the “tip of the iceberg” comment so famously used by Rob Telson in every interview ;)
 


TechGirl

Founding Member
Have not been able to put a number on how many customers Nvidia has. If anyone finds a number it would be exciting to know. FF

First Google search I did made me not care about the actual number of Nvidia's customers, because the 4 they mentioned will do me just fine: AMAZON, FACEBOOK, GOOGLE & TESLA.



JK200SX

Regular
For your viewing pleasure.....




Morning chippers, breakfast

This is what’s going to make 5-star ANCAP cars. Watch every company scramble to implement this or be stuck with a lower rating.
 

MDhere

Regular
Some background on Nvidia for those who have never looked. The bottom line is that Nvidia has the pursuit of innovation at its core and is a perfect partner for a company that is heralding an Edge AI technology revolution:

For the fiscal year 2020, Nvidia reported earnings of US$2.796 billion, with an annual revenue of US$10.918 billion, a decline of 6.8% over the previous fiscal cycle. Nvidia's shares traded at over $531 per share, and its market capitalization was valued at over US$328.7 billion in January 2021.
Source: Nvidia – Wikipedia (https://en.m.wikipedia.org/wiki/Nvidia)



My opinion only DYOR
FF

AKIDA BALLISTA
ok my calculator has exploded!!
 

Terroni2105

Founding Member

The emerging field of neuromorphic processing isn’t an easy one to navigate. There are major players in the field that are leveraging their size and ample resources – the highest profile being Intel with its Loihi processors and IBM’s TrueNorth initiative – and a growing list of startups that include the likes of SynSense, Innatera Nanosystems and GrAI Matter Labs.

Included in that latter list is BrainChip, a company that has been developing its Akida chip – Akida is Greek for “spike” – and accompanying IP for more than a decade. We’ve followed BrainChip over the past few years, speaking with them in 2018 and then again two years later, and the company has proven to be adaptable in a rapidly evolving space. The initial plan was to get the commercial SoC into the market by 2019, but BrainChip extended the deadline to add the capability to run convolutional neural networks (CNNs) along with spiking neural networks (SNNs).

In January, the company announced the full commercialization of its AKD1000 platform, which includes its Mini PCIe board that leverages the Akida neural network processor. It’s a key part of BrainChip’s strategy of using the technology as reference models as it pursues partnerships with hardware and chip vendors that will incorporate it in their own designs.

“Looking at our fundamental business model, is it a chip or IP or both?” Jerome Nadel, BrainChip’s chief marketing officer, tells The Next Platform. “It’s an IP license model. We have reference chips, but our go-to-market is definitely to work with ecosystem partners, especially those who would take a license, like a chip vendor or an ASIC designer and tier one OEMs. … If we’re connected with a reference design to sensors for various sensor modalities or to an application software development, when somebody puts together AI enablement, they want to run it on our hardware and there’s already interoperability. You’ll see a lot of these building blocks as we’re trying to penetrate the ecosystem, because ultimately when you look at the categoric growth in edge AI, it’s really going to come from basic devices that leverage intelligent sensors.”

BrainChip is aiming its technology at the edge, where more data is expected to be generated in the coming years. Pointing to IDC and McKinsey research, BrainChip expects the market for edge-based devices needing AI to grow from $44 billion this year to $70 billion by 2025. In addition, at last week’s Dell Technologies World event, CEO Michael Dell reiterated his belief that while 10 percent of data now is generated at the edge, that will shift to 75 percent by 2025. Where data is created, AI will follow. BrainChip has designed Akida for the high-processing, low-power environment and to be able to run AI analytic workloads – particularly inference – on the chip to lessen the data flow to and from the cloud and thus reduce latency in generating results.

Neuromorphic chips are designed to mimic the brain through the use of SNNs. BrainChip broadened the workloads Akida can run by adding the ability to run CNNs as well, which are useful in edge environments for such tasks as embedded vision, embedded audio, automated driving with LiDAR and RADAR remote sensing devices, and industrial IoT. The company is looking at such sectors as autonomous driving, smart health and smart cities as growth areas.




BrainChip already is seeing some success. Its AKD1000 platform is being used in Mercedes-Benz’s Vision EQXX concept car for in-cabin AI, including driver and voice authentication, keyword spotting and contextual understanding.

The vendor sees partnerships as an avenue for increasing its presence in the neuromorphic chip field.

“If we look at a five-year strategic plan, our outer three years probably look different than our inner two,” Nadel says. “In the inner two we’re still going to focus on chip vendors and designers and tier-one OEMs. But the outer three, if you look at categories, it’s really going to come from basic devices, be they in-car or in-cabin, be they in consumer electronics that are looking for this AI enablement. We need to be in the ecosystem. Our IP is de facto and the business model wraps around that.”

The company has announced a number of partnerships, including with nViso, an AI analytics company. The collaboration will target battery-powered applications in robotics and automotive sectors using Akida chips for nViso’s AI technology for social robots and in-cabin monitoring systems. BrainChip also is working with SiFive to integrate the Akida technology with SiFive’s RISC-V processors for edge AI computing workloads and MosChip, running its Akida IP with the vendor’s ASIC platform for smart edge devices. BrainChip also is working with Arm.

To accelerate the strategy, the company this week rolled out its AI Enablement Program to offer vendors working prototypes of BrainChip IP atop Akida hardware to demonstrate the platform’s capabilities for running AI inference and learning on-chip and in a device. The vendor also is offering support for identifying use cases for sensor and model integration.



The program includes three levels – from the Basic and Advanced prototypes to the Functioning Solution – with the number of AKD1000 chips scaling up to 100, custom models for some users, 40 to 160 hours with machine learning experts and two to ten development systems. The prototypes will enable BrainChip to get its commercial products to users at a time when other competitors are still developing their own technologies in the relatively nascent market.

“There’s a step of being clear about the use cases and perhaps a road map of more sensory integration and sensor fusion,” Nadel says. “This is not how we make a living as a business model. The intent is to demonstrate real, tangible working systems out of our technology. The thinking was, we could get these into the hands of people and they could see what we do.”

BrainChip's Akida IP includes support for up to 1,024 nodes that can be configured into two to 256 nodes connected over a mesh network, with each node comprising four neural processing units (NPUs). Each NPU includes configurable SRAM, each can be configured for CNNs if needed, and each is event- or spike-based, exploiting sparsity in data, activations, and weights to reduce the number of operations by at least two-fold. The Akida Neural SoC can be used standalone or integrated as a co-processor for a range of use cases, and provides 1.2 million neurons and 10 billion synapses.
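To make those figures concrete, here is a back-of-envelope sketch in plain Python. The node, NPU, neuron and synapse counts come from the paragraph above; the per-layer operation count is a made-up illustration of the claimed two-fold sparsity saving.

```python
# Back-of-envelope Akida capacity arithmetic, using the figures quoted above.
MAX_NODES = 1024           # maximum nodes supported by the Akida IP
NPUS_PER_NODE = 4          # each mesh node comprises four neural processing units
NEURONS = 1_200_000        # 1.2 million neurons per Akida Neural SoC
SYNAPSES = 10_000_000_000  # 10 billion synapses

for nodes in (2, 64, 256):  # a mesh can be configured with 2 to 256 nodes
    print(f"{nodes:>3}-node mesh -> {nodes * NPUS_PER_NODE} NPUs")

dense_ops = 1_000_000       # hypothetical operations for one dense layer pass
print(f"with event-based sparsity: <= {dense_ops // 2:,} ops (at least 2x fewer)")
```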

The offering also includes the MetaTF machine learning framework for developing neural networks for edge applications and three reference development systems for PCI, PC shuttle and Raspberry Pi systems.
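For anyone curious what that MetaTF flow looks like in practice, below is a minimal sketch of the quantize-then-convert workflow from BrainChip's cnn2snn documentation of this era. The function names, keyword arguments and the model file are assumptions on my part; check the current MetaTF docs before relying on them.

```python
# Hypothetical MetaTF sketch: quantize a trained Keras CNN and convert it to
# an Akida model. Verify names/signatures against your MetaTF version.
from tensorflow import keras
from cnn2snn import quantize, convert  # BrainChip's MetaTF conversion package

keras_model = keras.models.load_model("my_trained_cnn.h5")  # hypothetical file

# Quantize weights and activations to low bit-widths suitable for Akida.
quantized_model = quantize(keras_model,
                           weight_quantization=4,
                           activ_quantization=4)

# Convert the quantized Keras model into an Akida model that can run on
# AKD1000 hardware or in software simulation.
akida_model = convert(quantized_model)
akida_model.summary()
```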

The platform can be used for one-shot on-chip learning, by using the trained model to extract features and adding new classes on top of it, or for multi-pass processing, which leverages parallel processing to reduce the number of NPUs needed.
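The one-shot mechanism described there is, in essence, feature extraction plus nearest-prototype classification: the trained network turns a single example into a feature vector, and that vector becomes the prototype for a brand-new class. A minimal NumPy sketch of the idea, with the on-chip feature extractor stubbed out:

```python
import numpy as np

def extract_features(sample):
    """Stand-in for the trained model's feature extractor (on-chip on Akida)."""
    rng = np.random.default_rng(abs(hash(sample)) % 2**32)
    return rng.standard_normal(64)  # hypothetical 64-dimensional embedding

prototypes = {}  # class name -> feature prototype

def learn_one_shot(name, sample):
    """Add a brand-new class from a single example."""
    prototypes[name] = extract_features(sample)

def classify(sample):
    """Pick the class whose prototype is nearest in feature space."""
    features = extract_features(sample)
    return min(prototypes, key=lambda c: np.linalg.norm(prototypes[c] - features))

learn_one_shot("coffee_mug", "mug_photo_001")
learn_one_shot("stapler", "stapler_photo_001")
print(classify("mug_photo_001"))  # -> coffee_mug
```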

(Figures in the original article illustrate the one-shot and multi-pass flows.)

“The idea of our accelerator being close to the sensor means that you’re not sending sensor data, you’re sending inference data,” Nadel said. “It’s really a systems architectural play that we envision our micro hardware is buddied up with sensors. The sensor captures data, it’s pre-processed. We do the inference off of that and the learning at the center, but especially the inference. Like an in-car Advanced Driver Assistance System, you’re not tasking the server box loaded with GPUs with all of the data computation and inference. You’re getting the inference data, the metadata, and your load is going to be lighter.”
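Nadel's point about shipping inference data rather than sensor data is easy to quantify. A quick sketch with illustrative numbers; the camera resolution, frame rate and metadata size below are my assumptions, not BrainChip figures:

```python
# Raw camera stream vs per-frame inference metadata for one in-cabin camera.
WIDTH, HEIGHT, CHANNELS = 1920, 1080, 3  # assumed 1080p RGB sensor
FPS = 30
META_BYTES = 256                         # assumed labels + confidences + pose

raw_bytes_per_s = WIDTH * HEIGHT * CHANNELS * FPS  # uncompressed video
meta_bytes_per_s = META_BYTES * FPS

print(f"raw video : {raw_bytes_per_s / 1e6:7.1f} MB/s")
print(f"metadata  : {meta_bytes_per_s / 1e3:7.1f} KB/s")
print(f"reduction : {raw_bytes_per_s / meta_bytes_per_s:,.0f}x less data on the wire")
```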

The on-chip data processing is part of BrainChip’s belief that for much of edge AI, the future will not require clouds. Rather than send all the data to the cloud – bringing in the higher latency and costs – the key will be doing it all on the chip itself. Nadel says it’s a “bit of a provocation to the semiconductor industry talking about cloud independence. It’s not anti-cloud, but the idea is that hyperscale down to the edge is probably the wrong approach. You have to go sensor up.”

Going back to the cloud also means having to retrain the model if there is a change in object classification, Anil Mankar, co-founder and chief development officer, tells The Next Platform. Adding more classes means changing the weights in the classification.

“On-chip learning,” Mankar says. “It’s called incremental learning or continuous learning, and that is only possible because … we are working with spikes and we actually copy how our brain learns faces and objects and things like that. People don’t want to do transfer learning – go back to the cloud, get new weights. Now you can classify more objects. Once you have an activity on the device, you don’t need cloud, you don’t need to go backwards. Whatever you learn, you learn” and that doesn’t change when something new is added.
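Mankar's incremental-learning point maps onto the same prototype picture: adding or refining a class touches only that class's stored weights, so nothing already learned needs a cloud round-trip. A small illustrative sketch of that invariant (not the actual on-chip algorithm), reusing the prototype idea from the one-shot example above:

```python
import numpy as np

prototypes = {"driver_a": np.ones(8)}  # an already-learned class

def add_class(name, features):
    """Incremental learning: new classes are appended; old ones are untouched."""
    prototypes[name] = np.asarray(features, dtype=float)

def refine_class(name, features, lr=0.1):
    """Continuous learning: nudge one prototype toward a new observation."""
    f = np.asarray(features, dtype=float)
    prototypes[name] += lr * (f - prototypes[name])

before = prototypes["driver_a"].copy()
add_class("driver_b", np.zeros(8))          # no cloud, no retraining
refine_class("driver_b", np.full(8, 0.5))
assert np.array_equal(before, prototypes["driver_a"])  # "whatever you learn, you learn"
```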

What a fantastic write-up for Brainchip! I love it. Good job, Jerome Nadel.
 
Having just watched this video, can I say Blind Freddie encourages everyone to watch it and has asked me to CONGRATULATE the 1,000 Eyes for calling the partnership with Nvidia BEFORE ALL THE INVESTMENT COMMUNITY WORLDWIDE.

I personally would like to remind everyone here how much we have achieved by moving over here and bidding goodbye to the conflict model.

I argued before that we all carry sufficient personal doubt and insecurity that there is nothing to be gained from having deliberate purveyors of negative feedback for so-called balance in our lives, and particularly not when making business or investment decisions.
Congratulations to everyone here may it long continue.

2025 - 2025 - 2025 - 2025 - 2025

My opinion only DYOR
FF

AKIDA BALLISTA
Apologies to other posters. Blind Freddie just checked the above post and kicked me in the shins for leaving out the most exciting FACT, so here it is.

European and US legislatures are adopting standards and legislation to enshrine those standards for DRIVER ALERTNESS MONITORING INCLUDING FATIGUE & PASSENGER PRESENCE IN VEHICLES to protect CHILDREN & PETS.

These standards will demand always on working systems which are unaffected by lack of cloud connectivity before vehicles including Autonomous vehicles can leave the garage or children and pets can be transported including commercially.

Guess what Nviso, Brainchip and Nvidia have demonstrated in this video.

If Brainchip did not have any other product category covered this one single achievement would blow the lid off the reservoir of its commercial success.

If you read the post of Tim Llewellyn, CEO of Nviso, it is clear he understands.

“Tip of the Iceberg” people.

As @MC is taking his 500th curtain call, I will point out that it was my High Court decision that discovered microsleeps which set in place the need for driver alertness monitoring worldwide.

My opinion only DYOR
FF


AKIDA BALLISTA
 

TECH

Regular
Lawyers, even old retired ones, are all about the fine print. Under Industrial on the new website the following appears:

...automated decisions.
Finally, the need for machine learning to add classification categories without disrupting output is critical.
High performance (inferences/second), remarkable efficiency (microwatt), and one-shot on-chip learning, bring optimized productivity, reduced downtime, and improved security to your factory floor.
BrainChip has demonstrated and deployed capabilities for industrial applications including:
  • Object classification
  • Robotics control for vision, audition, tactile
  • Quality control
  • Machine control and preventative maintenance

BrainChip works with industrial leaders to make their manufacturing environments smarter, more productive, and fundamentally secure.

Now in the fine print above the following appears:

"BrainChip has demonstrated and deployed capabilities for industrial applications including:
  • Object classification
  • Robotics control for vision, audition, tactile
  • Quality control
  • Machine control and preventative maintenance"
As far as I am aware, BrainChip has not yet released details of an EAP that meets the description of a customer needing these capabilities deployed in an industrial setting. Yet here it is stated as FACT, so I expect it has to be true. Why? Well, this new website is not aimed at retail investors; it is aimed at customers.

So suppose I am a customer and I ring up, and after suitably verifying who I am and my genuine need to know, I ask to see how one of these deployed capabilities – object classification, robotics control for vision, audition or tactile, quality control, or machine control and preventative maintenance – is running in the real world. I am unlikely to be impressed if Rob Telson replies, sorry, that is just for advertising, it has not really happened.

So the point of the fine print is that Brainchip must be considerably advanced in these four areas to make such a bold and definite statement. It strikes me that when we finally know who the remaining EAPs expected to convert are, at least one will be engaged in the industrial use of AKIDA technology for these capabilities, because these are the included deployed capabilities – and, based once again on the fine print ("including"), there are other capabilities deployed but not listed.

So the question is: who are these INDUSTRIAL LEADERS that Brainchip is working with?

My opinion only DYOR
FF

AKIDA BALLISTA
 


VictorG

Member
Have not been able to put a number on how many customers Nvidia has. If anyone finds a number it would be exciting to know. FF
Nvidia has lots and lots of customers, and then some!!
 

Slade

Top 20
Lausanne, Switzerland – May 11, 2022 – nViso SA (NVISO), the leading Human Behavioural Analytics AI company, today announced at AI Expo Japan new capabilities and milestones for its Human Behaviour AI SDK in support of neuromorphic computing. These accomplishments build upon the partnership announced in April 2022 by NVISO and BrainChip; NVISO will be showing the world’s first neuromorphic SDK for Human Behavioural AI designed for ultra-low power mass market consumer products at AI Expo 2022 Tokyo from the 11th to the 13th of May.

New features and achievements facilitated by the upgrade to neuromorphic computing include:

- The world’s first commercial grade emotion recognition AI App designed for neuromorphic processing, with peak speeds of more than 250fps achieved using the BrainChip Akida processor. Using NVISO’s advanced edge deployment methodology and the BrainChip MetaTF framework, the neuromorphic-optimized Emotion Recognition AI App currently delivers an accuracy of 90.28% at speeds up to 250fps using 2-bit quantization. By comparison, the same AI App implemented on a Raspberry Pi 4 single-core ARM A53 CPU using 8-bit compression shows similar accuracy but runs at only 12fps – a 20x improvement in performance (see the sanity-check sketch after this list).

- Fully integrated AI Solutions for Smart Living and Smart Mobility use cases requiring ultra-low power consumption leveraging true heterogeneous digital to analog AI processing technologies ranging from MPU, CPU, GPU, and NPU combined with neuromorphic computing technology. These AI Apps are specifically designed to leverage resource constrained low-power and low-memory hardware platforms deployed at the extreme edge, where human observations are made and processed locally without an internet connection both addressing privacy concerns and removing the latency and power consumption costs associated with moving data for processing in the cloud.
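As flagged in the first item, the claimed 20x is consistent with the quoted frame rates, and the 2-bit versus 8-bit trade-off is easy to sanity-check. A small NumPy sketch; the weight tensor is random, and only the bit-widths and frame rates come from the release:

```python
import numpy as np

# Throughput arithmetic from the release: 250fps (Akida, 2-bit) vs 12fps
# (Raspberry Pi 4 single core, 8-bit).
for label, fps in (("Akida, 2-bit", 250), ("Pi 4 core, 8-bit", 12)):
    print(f"{label:17s}: {fps:3d} fps = {1000 / fps:5.1f} ms/frame")
print(f"speedup: {250 / 12:.1f}x")  # ~20.8x, matching the claimed 20x

# Uniform quantization of a weight tensor to n bits (illustrative only).
def quantize(weights, bits):
    levels = 2 ** bits - 1
    lo, hi = weights.min(), weights.max()
    codes = np.round((weights - lo) / (hi - lo) * levels)  # integer codes
    return codes / levels * (hi - lo) + lo                 # dequantized values

w = np.random.default_rng(0).standard_normal(1000)
for bits in (8, 2):
    err = np.abs(quantize(w, bits) - w).mean()
    print(f"{bits}-bit: {2 ** bits} levels, mean abs error {err:.4f}")
```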

“We are committed to work with technology leaders like NVISO to help define the next-generation user experiences and accelerate the future of smart living and mobility,” said Jerome Nadel, CMO at BrainChip. “We are excited to see the AI Apps and Solutions from NVISO so quickly available on the BrainChip Akida platform and to bring these new consumer and automotive experiences to customers around the world.”

NVISO’s solutions deliver on these use cases through a range of AI Apps providing visual observation, perception and semantic reasoning capabilities, the results of which can be used in identifying issues, in decision making processes and in supporting autonomous “human like” interactions. Examples of these AI Apps provide the analysis of core signals of human behaviour such as facial expressions, emotions, identity, head pose, gaze, gestures, activities, and the identification of objects with which users interact. These AI Apps can be optimised for typically resource constrained low power and low-cost processing platforms deployed on the edge. Furthermore, NVISO AI Apps can be easily configured to suit any camera system for optimal performance in terms of distance and camera angle, and thanks to NVISO’s large scale proprietary human behaviour databases NVISO’s AI Apps are robust to the imaging conditions often found in real world deployments. Unlike cloud-based solutions, NVISO’s solutions do not require information to be sent off-device for processing elsewhere so user privacy and safety can be protected.​
 


TECH

Regular
Good morning from the wild west, cold front moving over us...

The world markets are correcting themselves which is a very healthy thing, the S&P is predicted to fall up to another 800 points to settle around the 3,100 mark.

Companies that have good fundamentals will turn around again. For companies like Brainchip, whose fundamentals aren't just good, they are GREAT, this can become a great buying opportunity, so don't despair; even as I write this, we have once again gone against the trend.

Overnight I sent Jerome a congratulatory message with regard to the new website and the immediate impact his talent has had on the direction marketing is taking us... Below is just another comment that reflects drive, professionalism and confidence in our technology.

" We are ramping marketing, and everything at BrainChip. Stay tuned..."
 

RobjHunt

Regular
Anyone else having problems with the Bell Direct trading platform? It seems broken for me.
 

Dhm

Regular
@Fact Finder I have to ask, are you and @Blind Freddie one and the same? Perhaps similar to Sir Les and Dame Edna?



Poor Jacki Weaver!

 

Bravo

If ARM was an arm, BRN would be its biceps💪!
Morning Brain Fam,

Everyone, including Blind Freddie😝, knows about my obsession with Cerence as per multiple previous posts, so continuing in that vein, I thought I'd share this conversation between Christophe Couvreur (Vice-President Core Technologies and Hybrid Platforms at Cerence), Dean Harris (Automotive Business Development, NVIDIA), Alexandra Baleta (Manufacturing and Automotive Industry Director, VMWare) and Sunil Samel (VP Products, Akridata).

Every single one of the 'challenges' they raised back in June 2021 (latency, scalability, security, etc.) can be addressed via the adoption of Akida, something Christophe Couvreur would already be aware of, having worked with Mercedes on the EQXX concept car.

Don't forget Cerence prides itself on having the Fastest, Most Powerful and Intelligent AI Assistant Platform for Global Mobility.

For Cerence it's all about providing the best user experience, which they are unable to offer without AKIDA IMO.




 
Ha ha I went down the same path yesterday with the drone on the new website via google image search!


Just had a look at NVISO press releases. In August last year they announced

NVISO and Tobii to collaborate to accelerate innovation in Interior Monitoring Systems

When I did some research on Tobii, they too had an interesting announcement in February this year.

Tobii is in negotiation to be the eye tracking technology provider for the Sony PlayStation VR2

The front page of their website opens up some incredible opportunities for beneficial AI.
Nailed it mate, slam dunk. PS VR2 is set to launch in the back half of '22… adding to that explosion in sales.
 

Deleted member 118

Guest
I know a few of us are getting old and I came across this and thought I’d share it with a few of you, because there will come a day when you might not be able to drive a car.

 

Slade

Top 20
I know a few of us are getting old and I came across this and thought I’d share it with a few of you, because there will come a day when you might not be able to drive a car.

That thing could follow you around carrying an esky full of beer. A mobile self-driving esky... it's a winner!
 

RobjHunt

Regular
Yep o_O
