BRN Discussion Ongoing

Boab

I wish I could paint like Vincent
  • Like
  • Love
  • Fire
Reactions: 9 users

Pappagallo

Regular
Hi @Fact Finder
Something I have been pondering is why Mercedes let the world know they were using Akida when they have a strict NDA in place with Brainchip protecting this secret?

Wouldn’t it be in Mercedes best interest to have kept this under wraps until they release a vehicle containing this technology to the market?

The only reason I can think of is perhaps Brainchip asked a favour of Mercedes? Purely speculative but I don’t know why Merc would release details of a new secret ingredient (Akida) to let other car manufacturers know about their competitive advantage.

Either way I am so glad they did disclose our tech!

I think it’s just marketing. Concept cars are a marketing exercise, a glimpse into the next generation of products. Maybe seeing a Mercedes “that thinks like you” today stops you from buying a Tesla tomorrow, especially when MB explicitly states that the concept tech will hit the road in only a couple of years.

Tesla, to their credit, got the jump on everyone with the electric revolution, and all of the legacy car makers are now playing catch-up. Marketing plays a big role in doing that. Hence the concept car. Hence the TV commercial during the Masters specifically highlighting neuromorphic computing. Does a Tesla have this next-level, energy-saving, self-learning architecture? Hmm, maybe I’ll wait for this something better that’s just around the corner.

Mercedes has probably been playing with Akida for two years now, so they already have their lead. The NDA has mostly done its job. Now they’re moving into the next phase of the cycle, which is creating the buzz.
 
  • Like
  • Fire
  • Love
Reactions: 16 users
Can’t open the link unfortunately.
Opened for me, but maybe you are the wrong religion or something. That affects technology, doesn’t it? Or was it metal hip screws I read about on Twitter? 😂🤣😂 You should be able to track it down from this:

“Algorithm-architecture adequacy of spiking neural networks for massively parallel processing hardware
Paul Ferré
To cite this version:
Paul Ferré. Algorithm-architecture adequacy of spiking neural networks for massively parallel processing hardware. Machine Learning [cs.LG]. Université Paul Sabatier - Toulouse III, 2018. English.
NNT: 2018TOU30318. tel-02400657
HAL Id: tel-02400657, https://tel.archives-ouvertes.fr/tel-02400657, submitted on 9 Dec 2019
HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.”

Regards
FF

AKIDA BALLISTA
 
  • Like
  • Haha
Reactions: 4 users

Deadpool

hyper-efficient Ai
Prophesee Gen 4 neuromorphic event-based camera demos posted by Western Sydney University on YouTube over the past few days.



The International Centre for Neuromorphic Systems (ICNS) used their Astrosite Mobile Neuromorphic Observatory to record this striking footage of the full lunar eclipse through a neuromorphic event-based camera. What makes it so remarkable is that you can see both the illuminated and eclipsed portions of the moon clearly in the data from the neuromorphic sensor. The conventional sensor, included in the inset for comparison, can only see the illuminated portion of the moon. It is the neuromorphic camera technology that allows us to do this. The biology-inspired pixels in these cameras work independently of one another, allowing some to see bright parts of the moon whilst others are looking at much darker parts.



Neuromorphic cameras are great at capturing high-speed events with incredible time resolution, and we can see incredible details in our series of rocket launch videos. However, in this video, we play with time to show 4 hours of data from our sensor inside the exclusion zone around the rocket in this incredible varying-timelapse video. We start with the camera on the ground and the recording sped up 300x. You can see the clouds and stars moving through the field of view as the night progresses. Whilst we're playing the data back at 300x real-time, we captured the data with microsecond time resolution. When something interesting happens, such as a shooting star, we can slow time right down and view it in exquisite detail. We then speed up the recording again to wait for the launch. Just as the rocket positions itself for the final countdown, we slow time right back down for the launch. We can track the rocket all the way until the second stage ignites.
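The "playing with time" trick described above can be sketched in a few lines: an event camera outputs a stream of (x, y, timestamp, polarity) tuples (the common DVS convention), and rendering at different speedups just means binning events into frames of different temporal width. A hypothetical sketch, not ICNS code:

```python
def bin_events(events, start_us, end_us, bin_us):
    """Group (x, y, t_us, polarity) events into time bins, one bin per rendered frame."""
    n_bins = max(1, (end_us - start_us) // bin_us)
    frames = [[] for _ in range(n_bins)]
    for x, y, t, p in events:
        if start_us <= t < end_us:
            idx = min((t - start_us) // bin_us, n_bins - 1)
            frames[idx].append((x, y, p))
    return frames

def bin_width_us(speedup, fps=30):
    """At a given playback speedup, one rendered frame spans speedup/fps seconds."""
    return int(speedup * 1_000_000 / fps)

# At 300x playback and 30 fps, each rendered frame covers 10 s of real time,
# yet the underlying events still carry microsecond timestamps, so the same
# stream can be re-binned far finer to slow a shooting star right down.
```

Because the raw events are never thrown away, "slowing down time" is just re-running the binning with a much smaller bin width over the interesting interval.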



In 2021, a team of researchers from the International Centre for Neuromorphic Systems (ICNS) travelled to the remote town of Nhulunbuy in the Northern Territory of Australia. Their mission was to record two rocket launches with our biology-inspired neuromorphic cameras. Each launch happened at night and was recorded with four neuromorphic cameras, including two cameras placed in boxes located a mere 30 metres away from the rocket. Our neuromorphic cameras captured the rockets in striking detail, despite their high velocity and brightness, including detail of the exhaust gas flows, and rocks and polystyrene packing that were flung around the launch pad, yielding far more useful information than conventional sensors.



Neuromorphic cameras don't take pictures. They see the world in a completely different way from a normal camera. In this video, we film a rocket launch with both a neuromorphic camera and a normal camera to show the difference between the two. The launch is spectacular in both cameras, but the real magic happens when we slow down the playback. The neuromorphic camera captures events with incredible timing resolution and isn't affected by the bright burst of light coming from the rocket firing. In fact, we can see all sorts of details in the rocket launch including the debris kicked up by the rockets, all sorts of interesting effects in the rocket plume, and even the spin thruster ignitors flying out and burning up. All this before we even start seeing anything useful on the normal camera!

This is very exciting stuff, and we are only getting started.
 
  • Like
  • Fire
  • Love
Reactions: 11 users

Slade

Top 20
The way that I deal with any doubts that I have about BrainChip during these volatile times is I ask myself, if I had just come across the company and spent a day researching them, would I invest? The answer is hell yes. That being the case it makes no sense to start stressing now. If I sold I would only want to buy back in anyway.
 
  • Like
  • Fire
  • Love
Reactions: 27 users

Sirod69

bavarian girl ;-)
The way that I deal with any doubts that I have about BrainChip during these volatile times is I ask myself, if I had just come across the company and spent a day researching them, would I invest? The answer is hell yes. That being the case it makes no sense to start stressing now. If I sold I would only want to buy back in anyway.
@Slade I quite agree with you. I want to say that back then, when BRN was at EUR 1.70 over here, I should have sold and bought back in much lower. Yes, but unfortunately you don't know any better beforehand.
I don't think any of us would have thought that it would go down that far again. I am also deeply in the red, like probably many here. Nevertheless, I always bought more, and every morning I was sad again that it had gone down again. But yes, as you say, we are simply convinced, and that is why we stay, and we stay long. Nice to be part of a community here that believes in BrainChip, and we have Blind Freddy here.
 
  • Like
  • Love
  • Fire
Reactions: 40 users

Realinfo

Regular
TSExers may know my story with Apple…in a nutshell, I bought in soon after Steve Jobs returned from his hiatus, and sold soon after he died. I was a deadset Jobs fan, and considered it ‘the day the music died’.

Tim Cook was appointed CEO shortly before Steve died, and whilst he was a competent businessman, I thought, like many others, that the loss of such an inspirational leader would cause Apple to begin a gradual slide.

We all know how that decision worked out.

In modern parlance, my learning from this was…it’s wonderful to have a visionary like Steve to create the magic, but there is no substitute for a canny business person like Tim to take that magic and run it for all it’s worth.

Peter has / is creating the magic…can Sean do a Tim Cook?

Just some weekend musings for TSExers to think about.

Can Sean do a Tim Cook?

Imagine the onerous task of taking the helm from a person like Stevie J. Tim Cook doesn’t have the creativity or vision of Jobs, but he’s a first-class leader. He methodically worked his way through all the facets of Apple’s business, coaxing, cajoling and motivating the Apple Army to be as dynamic, productive and efficient as possible. He turned a company that was already blessed into a money-making powerhouse.

If you go back and look at the history of Apple, you will see that it wasn’t always this way. A similar story can be told about most of the technology giants out there today.

I believe our battler is also blessed…that it‘s destined to become a giant, but somebody has to provide the spark to make it happen.

Is Sean Hehir that somebody…can he do a Tim Cook?

I’ve listened to his plan and I think he can. I believe we should give him the time to see it through, but that time cannot be infinite.
 
  • Like
  • Fire
  • Love
Reactions: 19 users
Can Sean do a Tim Cook?

Imagine the onerous task of taking the helm from a person like Stevie J. Tim Cook doesn’t have the creativity or vision of Jobs, but he’s a first-class leader. He methodically worked his way through all the facets of Apple’s business, coaxing, cajoling and motivating the Apple Army to be as dynamic, productive and efficient as possible. He turned a company that was already blessed into a money-making powerhouse.

If you go back and look at the history of Apple, you will see that it wasn’t always this way. A similar story can be told about most of the technology giants out there today.

I believe our battler is also blessed…that it‘s destined to become a giant, but somebody has to provide the spark to make it happen.

Is Sean Hehir that somebody…can he do a Tim Cook?

I’ve listened to his plan and I think he can. I believe we should give him the time to see it through, but that time cannot be infinite.
And as we all know, the CEO Sean Hehir did not ask for infinity, just until the next AGM: a period of 12 months in which to prove his worth and be judged by shareholders.

He has had five months, so seven months to go until he will present himself for judgment.

We have had some interesting developments, some exciting partnerships, great appointments, new patents and interesting, unexpected technology reveals, such as AKIDA doing regression analysis, in the last five months. While income is an over-simplistic benchmark, the benchmark he set was for income to outpace the growth in expenses, along with the ability to revisit with the Board the question of giving guidance as to expected revenue.

The progress with Edge Impulse is deserving of special note as it has moved very quickly and is now clearly driving interest in the AKIDA technology solution.

Some lofty ambitions in these difficult times, but he was aware back in May 2022 of all the potential headwinds. Time will tell if he chose the correct sails to weather the economic and political storms.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 33 users

Diogenese

Top 20
Hi @Diogenese
Could a link to BrainChip be via the New Zealand university where some research work on SNNs was done by one of BrainChip’s current employees?
Regards
FF

AKIDA BALLISTA
Hi FF,

Weta were involved in Lord of the Rings animation, and they claim 1500 employees.

They have about 80 patents.

This is one of several relating to facial expressions, which a couple of our friends are involved in:

US11410370B1 Systems and methods for computer animation of an artificial character using facial poses from a live actor

[patent figure attachment]


Not sure if it only does freckles.


[patent figure attachment]

Embodiments described herein provide an approach of animating a character face of an artificial character based on facial poses performed by a live actor. Geometric characteristics of the facial surface corresponding to each facial pose performed by the live actor may be learnt by a machine learning system, which in turn builds a mesh of a facial rig of an array of controllable elements applicable on a character face of an artificial character.

The neural network is taken as a given, and is not described in detail.

[0078] FIG. 2A illustrates an example neural network system 200 in which scan results are provided to a muscle simulator 202 and a neural network 204, and an anatomical model to the muscle simulator 202. An output of the muscle simulator 202 is dynamic muscle activations for Actor A, which in turn are provided to neural network 204. Neural network 204 then outputs a manifold to manifold storage 208.


There is scope for incorporating Akida.
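For what it's worth, the [0078] dataflow can be mocked up in a few lines to make the plumbing concrete: scans plus an anatomical model feed a muscle simulator, whose activations (together with the scan) feed a neural network that outputs points on a manifold. Every function below is a hypothetical toy stand-in; the patent does not disclose the actual models.

```python
def muscle_simulator(scan, anatomical_model):
    # Toy stand-in: an "activation" per muscle is the scan value scaled
    # by that muscle's weight in the anatomical model.
    return {m: scan.get(m, 0.0) * w for m, w in anatomical_model.items()}

def neural_network(scan, activations):
    # Toy stand-in for the learned mapping from scan + activations
    # to a low-dimensional manifold point.
    return (sum(scan.values()), sum(activations.values()))

def build_manifold(scans, anatomical_model):
    """Mirror FIG. 2A: scans -> muscle simulator -> neural network -> manifold."""
    manifold = []
    for scan in scans:
        activations = muscle_simulator(scan, anatomical_model)
        manifold.append(neural_network(scan, activations))
    return manifold
```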
 
  • Like
  • Love
  • Fire
Reactions: 13 users

Diogenese

Top 20
They spilled the beans with regard to using Akida in the EQXX, but not that they will deploy Akida across their whole EQ range, in all sectors of the vehicle, absolutely destroying the performance of the competition :)

If Mercedes has an NDA in place with BrainChip, which we now know they do, it most certainly is not for a test vehicle, but for actual products being released on a world scale. Just a thought. Maybe letting the world know about Akida in the EQXX does not breach their actual NDA terms?
One way of looking at it is by seeing who has the greatest commercial interest in keeping mum about Akida. Akida will be less than 1% of the cost of the finished product, even if it plays a more important role in performance.

From this perspective, Mercedes clearly has the whip hand.
 
  • Like
  • Love
  • Fire
Reactions: 7 users
They spilled the beans with regard to using Akida in the EQXX, but not that they will deploy Akida across their whole EQ range, in all sectors of the vehicle, absolutely destroying the performance of the competition :)

If Mercedes has an NDA in place with BrainChip, which we now know they do, it most certainly is not for a test vehicle, but for actual products being released on a world scale. Just a thought. Maybe letting the world know about Akida in the EQXX does not breach their actual NDA terms?
I think it is wrong to say they spilled the beans. It was far from an accident; it was a measured dispensing of just enough information to create interest, a bit like giving a cocaine addict a tiny taste of the product you can supply. Perhaps not the usual marketing analogy, but what can I say, it's the way I was brought up. LOL. Anyway, I have said it before: they are the gorilla and BrainChip is the scientist trying to study them. To do so, BrainChip has to play by Mercedes-Benz's rules.

We were on the last day of our kitchen renovation yesterday, and the apprentice said to his boss, "You're in the way." The boss replied, "No, I am the boss. The boss is never in the way. You work around me."

In very simple terms, this one exchange explains everything about BrainChip and Mercedes-Benz.

I believe that one day the roles will be reversed but not just yet. So for Brainchip it will be "You can call me anything you like except late for dinner."

My opinion only DYOR
FF

AKIDA BALLISTA
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 25 users

Diogenese

Top 20
I went looking for this on Socionext’s Twitter and found this article.

The extracted paragraphs do sound like they could have been taken from a Chapter in the Brainchip Book of AKIDA for ADAS & AV:

“SoCs for Electric and Autonomous Car Makers​

By Rick Fiorenzi | Monday, October 17, 2022

Whether ADAS applications will be needed to be successful in the future is not a question of “if” but “when”.

Next-generation autonomous driving platforms require higher levels of performance to make split-second decisions.

A vehicle needs to comprehend, translate, and accurately perceive its surrounding environment and react to changes in the fastest and safest manner possible.

Future ADAS and Autonomous implementations (Figure 1) require higher performance, real-time edge computing with AI processing capabilities, along with high bandwidth interfaces to a host of high-resolution sensors, including radar, LiDAR, and camera.

Improving the “seeing/vision” capabilities of advanced driver assistance systems (ADAS) is extending beyond cameras and LiDARs by incorporating smart sensors to handle complex driving scenarios that the auto industry coins Level 4, or “high” automation.”

My opinion only: it’s just such a pity BrainChip does not have an announced commercial deal with Socionext, only a release from a Socionext Vice President saying they are proud to be helping BrainChip commercialise their entire AKIDA family, or words to that effect.

When you read the full article, it is advertising how Socionext can assist manufacturers to design and manufacture their own custom chips for this purpose using the customer’s and third-party IP. AKIDA IP could come in handy, you would like to think.

That is really weird: I just went back to copy a link to this article and it has disappeared. It will have to be a Google search if you are interested.😇

My opinion only DYOR
FF

AKIDA BALLISTA
... or just click on Rick.
 
  • Fire
  • Love
Reactions: 3 users

Pandaxxx

Regular
Just thinking about whether the release of Akida 2 will move the SP? I’d think not…revenue announcements are what we need.

Thoughts?
 
  • Like
  • Thinking
Reactions: 5 users

Diogenese

Top 20
Not sure if we have touched on the OrCam MyEye Pro before, but it does a lot of what Akida is capable of???

No Internet connection!

Hey OrCam voice assistant!

The OrCam MyEye Pro can also help identify products and even recognize people!






As with every year, the CES Innovation Award goes to a gadget that can make life a lot better.

This year, at CES 2022, the OrCam MyEye Pro is the winner and it’s easy to see why.

This product clips on to glasses and helps the visually impaired or blind by reading out text, whether it’s printed or digital.

With the “Smart Reading” feature, it’s like having a Ctrl+F function on the PC – the wearers can find specific information fast using text-to-speech.


The OrCam MyEye Pro can also help identify products and even recognize people.

“We are living in uncertain times, yet… our users’ challenges related to access have not stopped during the pandemic. If anything, they have intensified,” said OrCam co-founder and co-chairman Prof. Amnon Shashua.

The glasses accessory can be completely controlled by the Hey OrCam voice assistant, so it’s a great day-to-day gadget that can make a lot of activities accessible to the visually impaired.

With one of these, you could read as fast as @Fact Finder.
 
  • Haha
  • Like
  • Love
Reactions: 9 users

buena suerte :-)

BOB Bank of Brainchip
  • Haha
  • Love
  • Like
Reactions: 5 users

M_C

Founding Member
Possible ...............


What Can You Do with a 150 MHz MCU? Much More Than You Can Imagine​

Delivering multitasking performance and advanced neural networking and ML capabilities with a low-power 150 MHz MCU may seem extraordinarily difficult, but the MCX N94x and MCX N54x devices are not ordinary MCUs. They are marvels of multicore design and peripheral integration.

The MCX N94x and MCX N54x are based on dual high-performance Arm® Cortex®-M33 cores running up to 150 MHz, along with 2MB of flash with optional full ECC RAM, smart DMA, a DSP co-processor, a secure subsystem and an integrated NXP-designed NPU. Developers can use any combination of these cores and accelerators for specific tasks without driving up the MCU clock speed or increasing power consumption.

The on-chip accelerators enable the MCX N MCUs to juggle multiple complex tasks very efficiently within a low power budget while keeping the system secure.
The multicore design delivers improved system performance and reduced power consumption by enabling smart, efficient distribution of workloads to the analog and digital peripherals. As a result, the MCUs consume less than 45 μA/MHz active current, less than 2.5 μA in power-down mode with the real-time clock (RTC) enabled and 8 KB SRAM retention, and less than 1 μA in deep power-down mode with the RTC active and 8 KB SRAM retention.

The dual-core architecture pairs a full-featured Cortex-M33 core with a streamlined M33 core to manage control functions, enabling developers to run applications in parallel or reduce overall power consumption by turning off individual cores as needed. For example, during secure over-the-air (OTA) updates for IoT devices, the main M33 core can handle system security, while the second streamlined core executes control functionality.

The MCX N MCUs include NXP’s first instantiation of its specialized NPU designed to enable high-performance, low-power intelligence at the edge. The on-chip NPU delivers up to 30x faster ML throughput compared to using a CPU core alone.
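As a quick sanity check on those quoted power figures, here is a back-of-envelope calculation. The article does not state the supply voltage, so the 1.8 V core rail used below is an assumption for illustration:

```python
# Quoted figures from the article
ACTIVE_UA_PER_MHZ = 45     # active current, uA per MHz
CLOCK_MHZ = 150            # maximum clock
POWER_DOWN_UA = 2.5        # power-down mode, RTC + 8 KB SRAM retention
DEEP_DOWN_UA = 1.0         # deep power-down mode, RTC active

VDD = 1.8  # assumed supply voltage (not stated in the article)

# Active current at full clock: 45 uA/MHz * 150 MHz = 6.75 mA
active_ma = ACTIVE_UA_PER_MHZ * CLOCK_MHZ / 1000

# Corresponding power at the assumed rail
active_mw = active_ma * VDD          # roughly 12 mW while running flat out
power_down_uw = POWER_DOWN_UA * VDD  # a few microwatts asleep
deep_down_uw = DEEP_DOWN_UA * VDD    # under 2 uW in deep power-down
```

Even under this rough assumption, the headline claim holds up: the sleep modes sit three to four orders of magnitude below the full-speed power draw.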
 
  • Like
  • Thinking
  • Fire
Reactions: 13 users

Newk R

Regular
@Slade I quite agree with you. I want to say that back then, when BRN was at EUR 1.70 over here, I should have sold and bought back in much lower. Yes, but unfortunately you don't know any better beforehand.
I don't think any of us would have thought that it would go down that far again. I am also deeply in the red, like probably many here. Nevertheless, I always bought more, and every morning I was sad again that it had gone down again. But yes, as you say, we are simply convinced, and that is why we stay, and we stay long. Nice to be part of a community here that believes in BrainChip, and we have Blind Freddy here.
I like to think of it this way. When the price got to $2 plus, did I sell? No. Why? Because I believe the long-term gains will be far higher.
So if the price was $2 plus now, would I sell? No. So what I had when the price was $2 plus is precisely what I have now: a portfolio of BRN shares that will change my life in the future. Exactly nothing has changed except my stress levels.
 
  • Like
  • Love
  • Haha
Reactions: 38 users
... or just click on Rick.
WOW.

How cool is that? Just clicking on Rick; too obvious by half.

Must be some other ‘ick’ word you should never click on.

Thanks @Diogenese once again to the rescue.

Regards
FF

AKIDA BALLISTA
 
  • Haha
  • Like
Reactions: 2 users
Not sure if this has been posted before. Apologies if it has.

What is the Real Promise of Artificial Intelligence? | Semico Research.

Where Does BrainChip Enter the Picture?​

Figure 2 shows some current results of running the BrainChip Akida neuromorphic processing architecture for various workloads using 1-4-bit weight quantization, compared to other architectures running 8-bit quantization for the same workloads.

https://semico.com/content/what-real-promise-artificial-intelligence
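For readers wondering what "1-4-bit weight quantization" means in practice, here is a minimal sketch of generic uniform symmetric quantization. This illustrates the general technique only; it is not BrainChip's actual scheme, which the article does not detail.

```python
def quantize(weights, bits):
    """Map floats to signed integers in [-(2**(bits-1)-1), 2**(bits-1)-1] with one shared scale."""
    qmax = 2 ** (bits - 1) - 1
    scale = max(abs(w) for w in weights) / qmax or 1.0  # avoid scale of 0 for all-zero weights
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the integer codes."""
    return [v * scale for v in q]
```

At 4 bits each weight needs roughly half the storage of an 8-bit weight (and a sixteenth of a 32-bit float), which is the kind of memory and compute saving the Figure 2 comparison is getting at, at the cost of coarser weight values.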

Regards
 
  • Like
  • Love
  • Fire
Reactions: 34 users