BRN Discussion Ongoing

uiux

Regular
I will take a leaf out of your Shareman book:

I asked a question and I am waiting for your answer.

I suspect that you, like Shareman, will go on your merry way.

But I make you this promise: I will no longer allow your foul mouth and aggression to go unchallenged.

FF

I've said my piece.
 
  • Like
Reactions: 3 users
S

Straw

Guest
I have no solid expectation for the next 4C, except for the possibility of some positive lumpiness; if not, I'll take it as a less lumpy quarter. I'd hate to see shorters benefit from anyone's expectations.

As for QE, that is sad, as she seemed like a committed and reasonable person, with a lot of expectation on her to at least project stability/community, which is a big ask for any human being and sorely needed these days (any personal reservations about the concept of monarchy notwithstanding).
 
  • Like
  • Love
Reactions: 9 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Howdy Team,

I was just reading this summary of a report about Automotive Human Machine Interface Opportunities. The report offers an overview of HMI, and it states the following:

"Consumer interaction with in-vehicle infotainment systems started with a mere hard button and rotary dial system interface but now includes advanced features such as touch screen, voice recognition, and even gesture control. This study offers an overview of in-vehicle HMI features, such as driver display and cockpit display screens, mapping, media integration, and smartphone connectivity, among the top 5 OEMs in the premium and mass market sectors in addition to model availability and infotainment pricing."


It also highlights the following OEMs that are working on HMI features and strategies. Now we know for sure that BrainChip and Mercedes are working together, and IMO we won't have to wait too much longer before we can confirm (beyond merely speculating) that many others on the list are doing the same thing. IMO obviously.

  • Audi
  • BMW
  • Chevrolet
  • Ford
  • Honda
  • Lexus
  • Nissan
  • Mercedes-Benz
  • Tesla
  • Toyota

 
  • Like
  • Love
  • Fire
Reactions: 33 users

Lex555

Regular
I've said my piece.
FF just caught you 😆
 
  • Haha
  • Like
Reactions: 7 users

miaeffect

Oat latte lover
Can we take the royal discussion somewhere else? I seriously could not give a fuck about the monarchy and I don't want to read about it in this thread.
I agree with uiux. Don't want to read about the monarchy.

So I just scroll down.
 
  • Like
  • Love
Reactions: 14 users
I agree with uiux. Don't want to read about the monarchy.

So I just scroll down.
And you managed to say that politely, without aggression or foul language.

FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Haha
Reactions: 27 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Howdy Team,

I was just reading this summary of a report about Automotive Human Machine Interface Opportunities. The report offers an overview of HMI, and it states the following:

"Consumer interaction with in-vehicle infotainment systems started with a mere hard button and rotary dial system interface but now includes advanced features such as touch screen, voice recognition, and even gesture control. This study offers an overview of in-vehicle HMI features, such as driver display and cockpit display screens, mapping, media integration, and smartphone connectivity, among the top 5 OEMs in the premium and mass market sectors in addition to model availability and infotainment pricing."

It also highlights the following OEMs that are working on HMI features and strategies. Now we know for sure that BrainChip and Mercedes are working together, and IMO we won't have to wait too much longer before we can confirm (beyond merely speculating) that many others on the list are doing the same thing. IMO obviously.

  • Audi
  • BMW
  • Chevrolet
  • Ford
  • Honda
  • Lexus
  • Nissan
  • Mercedes-Benz
  • Tesla
  • Toyota



Guess who else likes Human Machine Interface features...LG Electronics.

❤️





 
  • Like
  • Love
  • Fire
Reactions: 51 users

robsmark

Regular
Stop bickering everyone, you’re upsetting the SP.
 
  • Haha
  • Like
Reactions: 33 users

Deadpool

hyper-efficient Ai
That reminds me, I haven't taken my medication this morning; I think some on here may have forgotten as well. o_O

GstK
 
  • Haha
  • Like
  • Fire
Reactions: 21 users

Learning

Learning to the Top 🕵‍♂️


It's great to be a shareholder.
 
  • Like
  • Love
  • Fire
Reactions: 35 users

Learning

Learning to the Top 🕵‍♂️
On another happy front 😎,

Just managed to convert 20% of Mrs Learning's super into BRN.
That's right, shorters, it's locked away until her retirement. Oh wait, how long is that? Mrs Learning isn't due to retire for another 30+ years. 😂😂😂

Or unless BRN hits $50 in ten years; then we can all have a very early retirement. 🎉🎉🎉😂😂😂🏖🏖🏖

It's great to be a shareholder.
 
  • Like
  • Love
  • Fire
Reactions: 57 users

Vanman1100

Regular
We’ve had Greek and now foul language.
What a week.
 
  • Haha
  • Like
Reactions: 21 users

Learning

Learning to the Top 🕵‍♂️
This is interesting:

Intel & Mercedes!

"While this may sound futuristic, Intel’s neuromorphic computing research is already fostering interesting use cases, including how to add new voice interaction commands to Mercedes-Benz vehicles; create a robotic hand that delivers medications to patients; or develop chips that recognize hazardous chemicals."


It's great to be a shareholder.
 
  • Like
  • Thinking
  • Love
Reactions: 16 users

uiux

Regular
This is interesting:

Intel & Mercedes!

"While this may sound futuristic, Intel’s neuromorphic computing research is already fostering interesting use cases, including how to add new voice interaction commands to Mercedes-Benz vehicles; create a robotic hand that delivers medications to patients; or develop chips that recognize hazardous chemicals."


It's great to be a shareholder.

It's based on a story from 2020


Mercedes has since switched from Intel to BrainChip


The article is entirely misleading
 
  • Like
  • Fire
  • Love
Reactions: 35 users
Guess who else likes Human Machine Interface features...LG Electronics.

❤️




Just as I ask the 1 percent question, I feel it is appropriate to conclude that, as Mercedes-Benz claimed that AKIDA offered a five to ten times performance improvement over anything else, this must have included anything that LG could suggest or introduce.

That being the case, what logical argument could LG mount for not exploring the use of AKIDA technology in this way?

I am reminded in saying this that @TECH has, going back to HC days, suggested LG as a company to watch.

My opinion only DYOR
FF

AKIDA BALLISTA
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 34 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Interesting


1 am.png
 
  • Like
  • Love
Reactions: 13 users

Concept: Californian tech company BrainChip and French tech startup Prophesee have partnered to launch next-gen platforms for original equipment manufacturers (OEMs) looking to integrate event-based vision systems with high levels of AI performance. The partnership combines Prophesee’s computer vision technology with BrainChip’s neuromorphic processor Akida to deliver a complete high-performance and ultra-low-power solution.

Nature of Disruption: Prophesee’s computer vision technology leverages patented sensor design and AI algorithms. It mimics the eye and brain to reveal what was, until now, invisible to standard frame-based technology. The new computer vision technology has applications in autonomous vehicles, industrial automation, IoT, security and surveillance, and AR or VR. BrainChip’s Akida mimics the human brain to analyze only essential sensor inputs at the point of acquisition. It can process data with improved efficiency and precision. Also, it keeps AI or ML local to the chip, independent of the cloud, to reduce latency. The combination of both technologies can help advance AI enablement and offer manufacturers a ready-to-implement solution. Additionally, it helps OEMs looking to leverage edge-based visual technologies as part of their product offerings.

Outlook: The application of computer vision is increasing in various industries including automotive, healthcare, retail, robotics, agriculture, and manufacturing. It gives AI-enabled gadgets an edge to perform efficiently. BrainChip and Prophesee claim that the combination of their technologies can provide OEMs with a computer vision solution that can be directly implemented in a manufacturer’s end product. It can enable data processing with better efficiency, precision, and economy of energy at the point of acquisition.
Just on that FF.

Been reading one of our latest patent apps from earlier this year.

Appears to be an evolution of earlier ones, but some parts lead me to believe it is heavily slanted at event cameras and our Akida in relation to processing the information.

Interesting read, and whilst most is above my pay grade, there are elements that are understandable.

I snipped a few sections.

First link is to the overarching BRN USPTO listing, including TMs (live & dead). Second link is the patent app.





Event-based Extraction Of Features In A Convolutional Spiking Neural Network​


U.S. patent application number 17/583640 was filed with the patent office on 2022-05-12 for event-based extraction of features in a convolutional spiking neural network. This patent application is currently assigned to BrainChip, Inc. The applicant listed for this patent is BrainChip, Inc. Invention is credited to Kristofor D. CARLSON, Milind JOSHI, Douglas MCLELLAND, Harshil K. PATEL, Anup A. VANARSE.



Application Number: 20220147797 / 17/583640
Document ID: /
Family ID: 1000006135263
Filed Date: 2022-05-12

United States Patent Application 20220147797
Kind Code: A1
MCLELLAND, Douglas; et al.
May 12, 2022


[0100] DVS Camera: DVS stands for dynamic vision sensor. DVS cameras generate events, which event-based processors like embodiments of the present approach can process directly. Most cameras produce frame-based images. The main advantage of DVS cameras is the low latency and the potential for extremely low-power operation.
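As an illustrative sketch of the event stream described in [0100]: a real DVS pixel fires asynchronously whenever its log-intensity changes by more than a threshold; the toy model below approximates that by diffing two frames. The `Event` class, frame values, and threshold are all invented for illustration, not taken from the patent or from any Prophesee/BrainChip API.

```python
from dataclasses import dataclass

@dataclass
class Event:
    x: int          # pixel column
    y: int          # pixel row
    t: float        # timestamp (seconds)
    polarity: int   # +1 brightness increase, -1 decrease

def frame_diff_to_events(prev, curr, t, threshold=0.15):
    """Toy DVS model: emit an event wherever the per-pixel intensity
    change between two frames exceeds the threshold. Pixels that did
    not change produce no output at all, which is where the low-power,
    low-latency advantage comes from."""
    events = []
    for y, (row_prev, row_curr) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_prev, row_curr)):
            delta = c - p
            if abs(delta) >= threshold:
                events.append(Event(x, y, t, 1 if delta > 0 else -1))
    return events

prev = [[0.0, 0.5],
        [0.5, 0.0]]
curr = [[0.0, 0.9],
        [0.1, 0.0]]
evs = frame_diff_to_events(prev, curr, t=0.01)
# only the two changed pixels generate events; the static pixels are silent
```

The point of the sketch: the downstream processor sees a sparse list of events rather than full frames, so work scales with scene activity instead of resolution.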

[0116] A deep neural network (DNN) is defined as an artificial neural network that has multiple hidden layers between its input and output layers, each layer comprising a perceptron. A Convolutional Neural Network (CNN) is a class of DNN that performs convolutions and is primarily used for vision processing. CNNs share synaptic weights between neurons. Shared weight values in a perceptron are referred to as filters (aka kernels). Each layer in a conventional CNN is a perceptron. In the present embodiment, event-based convolution is implemented in a Spiking Neural Network (SNN) using event-based rank-coding rather than rate-coding, which has advantages in speed and considerably lower power consumption. Rank coding differs from rate coding of spike events in that values are encoded in the order of spikes transmitted; in rate coding, the repetition rate of spikes transmitted expresses a real number. CNNs process color images, which are defined as imageWidth × imageHeight × channelNumber. A color image generally has 3 channels (Red, Green, and Blue). CNNs often have many layers, so the output of a convolutional layer is the input to the next convolutional layer. Descriptions are provided as to how convolutions take place in a conventional perceptron-based CNN before discussing the event-based convolution methods implemented in the present invention, and show that convolution in a CNN and event-based convolution in the present invention return the same results.

[0256] One advantage of implementing event-based transposed convolution is to achieve reusability of the spiking neuron circuits, which will reduce the size and cost of the neuromorphic hardware (e.g. a chip). In other words, the neuromorphic hardware is optimized to implement a diverse variety of use cases by effectively reusing the spiking neuron circuits available on the hardware.

[0283] The present method and system also includes data augmentation capability arranged to augment the network training phase by automatically training the network to recognize patterns in images that are similar to existing training images. In this way, feature extraction during feature prediction by the network is enhanced and a more robust network achieved.

[0284] Training data augmentation is a known pre-processing step that is performed to generate new and varying examples of original input data samples. When used in conjunction with convolutional neural networks, data augmentation techniques can significantly improve the performance of the neural network model by exposing robust and unique features.

[0286] However, existing training data augmentation techniques are carried out separately from the neural network, which is cumbersome, expensive and time-consuming.

[0287] According to an embodiment of the present invention, an arrangement is provided whereby the set of training samples is effectively augmented on-the-fly by the network itself by carrying out defined processes on existing samples as they are input to the network during the training phase. Accordingly, with the present system and method, training data augmentation is performed on a neuromorphic chip, which substantially reduces user involvement, and avoids the need for separate preprocessing before commencement of the training phase.

[0315] It will be appreciated that the disclosed data augmentation technique produces a more robust machine learning model by effectively creating different training inputs using existing training data. Performing transformations on the data can create new samples for the model to train on, and varying the data through transformations can cover a larger input domain with limited samples. Advantageously, transforming the data creates new input samples, artificially creating different scenarios for the model to train on. This technique can also be used to create new samples while implementing one/low shot learning in the spiking domain.
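The on-the-fly augmentation idea in [0287]-[0315] can be sketched as a generator that transforms each sample as it streams into training, instead of running a separate preprocessing pass that stores an enlarged dataset. The transforms and helper names below are invented for illustration and are not from the patent.

```python
import random

def horizontal_flip(img):
    # mirror each row left-to-right
    return [list(reversed(row)) for row in img]

def shift_right(img):
    # shift each row one pixel right, zero-padding on the left
    return [[0] + row[:-1] for row in img]

TRANSFORMS = [lambda img: img, horizontal_flip, shift_right]

def augmented_stream(samples, rng):
    """Yield (image, label) pairs with a randomly chosen transform
    applied as each sample is consumed. The label is unchanged; only
    the input varies, artificially widening the training distribution
    without any separate preprocessing step or extra stored data."""
    for img, label in samples:
        yield rng.choice(TRANSFORMS)(img), label

rng = random.Random(0)
data = [([[1, 2], [3, 4]], "cat")] * 3
augmented = list(augmented_stream(data, rng))
```

Each of the three samples comes out as the identity, flipped, or shifted variant of the original image, which is the "new samples from existing samples" effect the patent describes, here done in host-side Python rather than on the neuromorphic chip.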
 
  • Like
  • Love
  • Fire
Reactions: 21 users

Fox151

Regular
There's going to be about a month-long heavy news cycle regurgitating endless BS about this

I want to ensure that this thread and the BRN forum remain a space focused on the technology and industry, so I can come here knowing I won't be exposed to this story, because like I said, I really don't give a shit about the monarchy. It's going to be unavoidable, and hopefully those who want to talk about the monarchy can respect those of us who wish to avoid discussions about it.
Don't justify it, uiux. You were out of line.
 
  • Like
  • Love
Reactions: 10 users

uiux

Regular
  • Like
Reactions: 2 users

Diogenese

Top 20

Hi equanimous,

8 September 2022
".@NASA has tapped @SiFive (and @MicrochipTech) to create a space-centric RISC-V processor: the High-Performance Spaceflight Computing chip. At heart of the HPSC will be SiFive's X280 64-bit RISC-V cores, which include ML acceleration capabilities."


SiFive does have its own ML, but it is software-based:

https://www.sifive.com/cores/intelligence
SiFive Intelligence is an integrated software + hardware solution that addresses energy efficient inference applications. It starts with SiFive’s industry-leading RISC-V Core IP, adds RISC-V Vector (RVV) support, and then goes a step further with the inclusion of software tools and new SiFive Intelligence Extensions, vector operations specifically tuned for the acceleration of machine learning operations. These new instructions, integrated with a multi-core capable, Linux-capable, dual-issue microarchitecture, with up to 512b wide vectors, and bundled with TensorFlow Lite support, are well-suited for high-performance, low-power inference applications.

We've been friends with SiFive in public since April, so that's five months, plus the clandestine (NDA) period, which would add several months.

https://brainchip.com/brainchip-sifive-partner-deploy-ai-ml-at-edge/
Laguna Hills, Calif. – April 5, 2022 – BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of ultra-low power neuromorphic AI chips and IP, and SiFive, Inc., the founder and leader of RISC-V computing, have combined their respective technologies to offer chip designers optimized AI/ML compute at the edge.

Hardware ML would be an optimization compared to software ML.

We are not listed among Microchip's AI/ML partners, but one of our friends is:

https://www.microchip.com/en-us/solutions/machine-learning/design-partners


Get Expert Help With Your AI/ML Design​


If you need assistance developing an Artificial Intelligence (AI) or Machine Learning (ML) project, we have partnered with industry-leading design companies to provide state-of-the-art AI-based solutions and software tools that support our portfolio of silicon products. These partners have proven capabilities and are uniquely qualified to provide you with the support you need to successfully bring your innovative design to life.



  • Edge Impulse (San Jose, USA; hello@edgeimpulse.com)
    Application area: Smart Predictive Maintenance, Smart HMI
    Focus/strength: Development Software Toolkit
    Microchip products: Arm Cortex-based 32-bit MCUs and MPUs
    Customer prerequisites: AI solution developers
  • Motion Gestures (Waterloo, Canada; info@motiongestures.com)
    Application area: Smart HMI
    Focus/strength: Gesture Recognition Solution
    Microchip products: Arm Cortex-based 32-bit MCUs and MPUs
    Customer prerequisites: No AI experience required
  • SensiML (Oregon, USA; info@sensiml.com, +1 (503) 567-1248)
    Application area: Predictive Maintenance, Gesture Recognition, Process Control
    Focus/strength: Small-Footprint, Low-Power Edge Devices
    Microchip products: Arm Cortex-based 32-bit MCUs and MPUs
    Customer prerequisites: AI solutions developers


So, given we have known SiFive for more than six months, given we are friends with Edge Impulse, given the objective of our partnership with SiFive is producing optimized AI/ML, and given NASA's penchant for energy efficiency and autonomous operation, you'd have to think that we have a reasonable chance of being involved.
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 30 users
Top Bottom