BRN Discussion Ongoing

Well looks like Brainchip is no longer revolutionary, high risk, cutting edge:

“While the demise of Moore’s Law has been somewhat circumvented by accelerated computing/domain-specific computing,
it’s still a tricky balance between flexibility in the computing architecture and performance, especially for quickly evolving workloads like AI.

Taking our cue from the most efficient computer ever known — the human brain — and using the results of millions of years of evolution as a starting point feels like a safe bet. ■

Sally Ward-Foxton is editor-in-chief of EE Times Weekend”

Feeling let down. I enjoyed being invested in an outlier shunned by the WANCAs, and now with a few key strokes Brainchip’s technology has become a safe bet. 😞🤡

WHY DID SALLY HAVE TO PUBLISH THIS AT A TIME OF GLOBAL TURMOIL. WE WILL NOW HAVE THOUSANDS OF BORING SAFE HARBOUR INVESTORS TRYING TO BUY BRNASX.

My opinion only DYOR
FF


AKIDA BALLISTA
The above quote comes from one of many articles in the following linked European Edition of EE Times:


EE Times Europe
https://www.eetimes.eu › ...PDF
EMC Filter Design at the Push of a Button - EE Times Europe

13 Sept 2022 — BrainChip sees its neuromorphic processor next to every sensor in a car. ... Mercedes used BrainChip's Akida…”

Some of the articles have been posted but this is a convenient all in one place link.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 32 users

AARONASX

Holding onto what I've got
It's a shit feeling when we strive to be the black sheep and someone now says we're a beautiful swan 🤣🤣🤣
 
  • Like
  • Haha
  • Fire
Reactions: 28 users
Next thing we will have posters boring us to tears with PE ratios, dividends, five year income projections, share buy back schemes and Nasdaq listings.

I’ll have to walk away.😂🤣😂😎🤡

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Haha
  • Like
  • Love
Reactions: 48 users

Xhosa12345

Regular
View attachment 17524
Damn shorters killing us 🤬😤😖

I just think it's something we are going to have to learn to live with, unfortunately, until the disconnect between earnings and the market cap comes back to something manageable, and this is going to take a long time. The non-announcements don't help, but maybe in the long term they will, because when there is one, it's going to have a huge impact. Fluff announcements would potentially make it worse....

As I said earlier, the company has not released anything contradicting where we are headed... so that's a great sign.

We can only hope they get burned one day! I wish we were 80,000 eyes instead of 1,000, given there are 40K-odd shareholders, in which case the mums and dads would hopefully not panic sell, which adds to the 'trading pot'.

plus the world sux still, old PUTO, what a toss bag, he will keep doubling down until the end - like a typical aggressive male.
 
  • Like
  • Love
Reactions: 8 users

TechGirl

Founding Member
This video is back from 2016 - iniVation

DVS + Brainchip: Unsupervised learning for highway monitoring​

A DAVIS240B monitoring a highway is input to Brainchip’s spiking learning neural network emulator, performing unsupervised learning. Video credit: Brainchip



This video is still displayed on their website (screenshot below)

zzzzzzz.jpg





And this company, iniVation, won an award in June 2022 ("Best of Sensors award winners announced at Sensors Converge"). Do we know if we still have a connection?


The 2022 Innovative Product of the Year Award categories and winners included:

• Aerospace/Space: Dynamic Vision Sensor (DVS) by iniVation


 
  • Like
  • Fire
  • Love
Reactions: 29 users
Seriously, I know @DingoBorat’s heart was in the right place, but what good can come of Brainchip being advertised across Australia as the Number 1 Gainer on the ASX200?

Who knows what sort of bean counter type bores will see that and do a deep dive into the company and discover what a sleeping giant it is becoming across the technology world.

Next thing they will be recommending BRNASX to golf partners and dinner guests and worse still clients.

We will then have new posters telling us how they heard about BRNASX from their accountants.😞

Mark my words no good at all will come from this.

MF is doing its best to bury these stories and facts but try as they might I think it will now be too little too late.

Whether we like it or not we are likely going to make a lot of money.

My very sad opinion only so DYOR
(Try to ignore the note at the bottom of the page that the share price went up 7.23% yesterday.)
FF

AKIDA BALLISTA
 
  • Like
  • Haha
  • Love
Reactions: 68 users

TechGirl

Founding Member

God I love your twisted sense of humor :ROFLMAO::LOL::ROFLMAO:

 
  • Like
  • Haha
  • Love
Reactions: 34 users

wilzy123

Founding Member
Home (1).jpg
sad-baby.gif
 
  • Haha
  • Like
  • Love
Reactions: 12 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Whacko-the-diddle-oh! This looks interesting!


The KI Delta Learning project is being funded by the German Federal Ministry for Economic Affairs and Energy. In addition to Porsche Engineering, partners include BMW, CARIAD and Mercedes-Benz, major suppliers such as Bosch, and nine universities, including the Technical University of Munich and the University of Stuttgart.

A nice Christmas present hopefully.



Screen Shot 2022-09-28 at 9.48.16 am.png
 
  • Like
  • Fire
Reactions: 28 users

FlipDollar

Never dog the boys
  • Like
  • Haha
  • Fire
Reactions: 11 users

stan9614

Regular
Anyone record Anil's presentation on tinyML via Zoom last night?
 
  • Like
Reactions: 6 users

Kachoo

Regular
You crack me up.
 
  • Haha
  • Like
Reactions: 6 users
Was checking in on our patents again...you know...anything new yet....unfortunately, not yet but did find something I didn't know.


I posted this site before; it has some additional functions for quick snapshots, at the end of the post.

Spotted a name on one of our more recent patents as below:

1664327840333.png


Thought ok, don't recognise you...yes, I know now that many others here probs do haha but anyway.

Thought I'd check out what else he's been involved in. Didn't realise his prior role with Samsung for 6-odd years, as an example:

1664328052218.png


So, well versed in our area obviously and appears to be a great pick up back in 2021.

A previous connection like that is nice to have, and it'd be great if he has access to some doors we would be allowed to knock on :unsure:

Intellectual Property Manager, Innovation and IP, R&D Strategy​

Samsung R&D Institute India - Delhi (SRI-DELHI) (Samsung India Electronics Pvt. Ltd.)​

Sep 2012 – Jul 2019 · 6 years 11 months
Noida Area, India
Headed IP division and managed innovation and IP operations at Samsung. Spurred creativity, inventions & innovations in the organization. Provided consultation in innovation and Intellectual Property strategy and commercial advice. Reviewing tech strategy and inventions, worked with researchers to develop inventions and consumer electronics product features utilising engineering skills and patenting new technology solutions/ inventions. Managed patents' life cycle from drafting & filing of inventions to filing responses of office actions during prosecution and to maintain granted patents, including handling various formalities at different patent offices world-wide. Prepared patent maps and patent reports. Performed white space and patent acquisition studies.

Also, was involved in startups and their IP sensing to provide collaboration and M&A recommendations.

Handled multiple IP attorney vendor firms for availing various IP services for the organization. Handled financial forecasting, budgeting, and execution of IP services. Was involved in audit & process related aspects. Reviewed research papers for international conferences and publications being shared outside the organization from quality and IP leakage perspective.

-Conducted self research to identify challenges and resolve design and operation problems
-Was responsible for designing embedded solution for consumer devices
-Was responsible for feasibility check of next generation electronic systems and applications along with bench marking for commercialization
-Led researchers in innovation and in analysis and reverse engineering of embedded systems and applications
-Analysed and enhanced efficiency, stability and scalability of solutions
-Guided researchers in the organization for advanced research & commercialise-able concept development

1664328515907.png


1664328533897.png


1664328585219.png
 
  • Like
  • Love
  • Fire
Reactions: 37 users

Proga

Regular
Breaking news on Bloomberg TV - Apple ditches production increase as demand falters

On the flipside, it should free up some fab slots for anyone trying to get to market using Akida IP.

I've owned many a share on the top 5 gainers list 1 day, only to be on the top 5 losers list the next.
 
Last edited:
  • Like
Reactions: 9 users

TechGirl

Founding Member
Lots of interesting connections where Anil is speaking

tinyML Neuromorphic Technical Program Committee

Charlotte Frenkel, Chair, Delft University of Technology
Christoph Posch, PROPHESEE
Jae-sun Seo, Arizona State University

Priya Panda, Yale University
Sadique Sheik, SynSense AG
Yulia Sandamirskaya, Intel
Friedemann Zenke, University of Basel
Andre van Schaik, Western Sydney University
Evgeni Gousev, Qualcomm

Ira Feldman, tinyML Foundation
Bette Cooper, tinyML Foundation
Olga Goremichina, tinyML Foundation

zz.jpg

Open below link for what each speaker is discussing

 
  • Like
  • Fire
  • Love
Reactions: 24 users

Diogenese

Top 20
Anyone record this?
No, but the BrainChip promo is interesting:
https://brainchip.com/tinyml-neuromorphic-engineering-forum-tuesday-september-27-2022-virtual/
"... He will highlight how hardware design choices such as the event-based computing paradigm, low-bit width precision computation, the co-location of processing and memory, distributed computation, and support for efficient, on-chip learning algorithms enable low-power, high-performance ML execution at the edge. Finally, Mankar will discuss how this architecture supports next-generation SNN algorithms such as binarized CNNs and algorithms that efficiently utilize temporal information to increase accuracy."

"Utilizing temporal information to increase accuracy" sounds like Anil may have splashed a bit of the secret sauce about - the discovery by Simon Thorpe's group that most of the relevant information is contained in the early-arriving spikes, leading to N-of-M coding.



1664329513734.png


1664329289448.png




1664329376246.png


1664329456228.png





It was more than just serendipity that PvdM was the only person in the world who recognized the practical implications of this and had the hardware to implement it.
 
  • Like
  • Fire
  • Love
Reactions: 41 users

VictorG

Member
What he said 👍
 
  • Haha
  • Like
  • Fire
Reactions: 17 users
Thanks for trying to cheer me up.

But alas that artless one million dollar buy at 86 cents had boring lawyer type written all over it.

I know the type just buying so he can mention it at the golf club.

A lawyer I knew years ago, when he had new offices built, rang up the law book company and ordered a metre of leather-bound law books with gold leaf and dark green on the spine to go on the bookcase behind his desk. He didn't care what law they covered, as it was for appearance.

My son tried to cheer me up by saying don’t worry Dad it could have been a short covering his position.

But what short stands in the sunlight for twenty minutes before open and then just lets his order stand?

Apart from anything else, after twenty minutes in direct sunlight he would turn to dust.

No I am afraid it’s too late. We’re becoming cardigan wearing blue chip investors and there is nothing anyone can do to stop it.

Regards
FF

AKIDA BALLISTA 😞
 
  • Haha
  • Like
  • Fire
Reactions: 36 users

alwaysgreen

Top 20
Here we go. Standard ASX bullshit with BRN. Up 7% one day then back down the next. 🤦
 
  • Like
  • Sad
Reactions: 8 users
This article from last year on Huawei's thoughts on neuromorphic computing was interesting.

Though we're not mentioned, it appears Huawei finally cottoned on to a couple of the key aspects - highlighted.



30 March 2021

Huawei embraces neuromorphic computing for IoT​


By Phil Hunter
The convention of IoT devices being lightweight in processing capability is being turned on its head by the rise of neuromorphic computing.
The aim is to mimic the plasticity of the human brain in a new generation of chips optimized for data analytics, employing algorithms under the banners of AI and machine learning. This is being driven by several factors, including demand for ultra-low latency edge computing and desire to save network bandwidth by cutting down on data transmission between end IoT devices and the cloud or centralized data centers.
It is true that edge computing can be deployed in distributed servers, but this itself imposes an overhead and cost, as well as requiring a lot of local bandwidth in some cases.

The sticking point might appear to be power consumption, given that many IoT devices are deployed for long time periods in locations that are not convenient to visit frequently for battery changes. By a similar token, direct connections to the electricity grid are usually either unavailable or impractical, while having dedicated solar or wind panels would elevate costs per device too much in most use cases.

But this calculation ignores the high power consumption of radios, as we were reminded when talking recently to Henk Koopmans, CEO of R&D at Huawei UK. He actually cited the desire to boost battery life as a motivation for massive increases in IoT device processor capabilities, alongside need to reduce latency and save on data transfers to the cloud.

“As many IoT devices are battery powered, often in hard-to-access places, replacing the batteries is time-consuming and affects the cost efficiency of the business model,” Koopmans noted. “Local processing reduces the need for wireless transmissions, the part of the device using the most energy, thereby greatly extending the battery life.”

But this assumes that such a hike in local processing power can be achieved affordably without offsetting the energy gains through cutting wireless transmission drastically. As Koopmans put it, “The challenge, therefore, is to come up with a new type of processor, capable of a level of artificial intelligence to enable the device to locally analyze the data and locally make decisions, while still retaining the very low power consumption level required for IoT devices.”

Koopmans, and Huawei, are convinced that such capability will be achieved through the emerging field of neuromorphic computing, or the third generation of AI as it is sometimes dubbed. The first generation of AI, sometimes called expert systems, emerged over 40 years ago in the 1970s in rule-based systems that emulated classical logical processes to draw reasoned conclusions within a specific, narrowly defined problem domain or field of expertise.

The poster child of this first generation was a medical diagnostic system called Mycin developed at Stanford University in the early 1970s, which demonstrated the genre well but was limited in scope and gained little traction in the clinic. Indeed, it was initially confined to identification of bacteria causing severe infections, such as meningitis, and then recommend appropriate antibiotics with dosages adjusted for the patient’s body weight.

Then after a prolonged lull in the AI field, the second generation emerged during the noughties, brought on by the phenomenal advances in computational power that enabled application of sophisticated statistical regression at scale to very large data sets. This enabled pattern matching and identification at far higher resolution and granularity, leading to valuable applications in sensing and perception under the banners of neural networks and deep learning.

The ability to identify video streams on the basis of objects within individual frames, as well as to diagnose medical conditions such as some cancers automatically through analysis of X-ray or MRI scanned images, are examples of proven applications.

This second generation has been said to be modelled on the structure and processes of the human brain, but in reality it has just been loosely inspired by that. The neuroscience behind human cognition was just not well enough understood for direct translation into AI algorithms.
The mantra of mimicking the human brain is still being used for the third generation of AI, or neuromorphic computing, but with rather more humility, or perhaps reality. There is much talk of incorporating aspects of biological neural networks more directly into electronic circuits, but with admission that this is as much to provide tools for neuroscientists to develop and test theories of how human brains operate in more detail, as in turn to take inspiration from the brain in cognitive computing.

Indeed, this is already proving to be a two way process with neuroscientists working alongside cognitive computing specialists. It is already clear that even if biomorphic computing does not mimic the brain exactly, an approach in which complex multilayered networks are embodied directly in the architecture of Very Large Scale Integration (VLSI) systems containing electronic analog circuits can greatly accelerate machine learning processes with higher efficiency and much reduced power consumption.

It can also mimic some of the flexibility or plasticity of the human brain, with ability to reconfigure rapidly in near real time to tackle problems more adaptively in response to feedback. Such a structure is also more resilient against failures in the system. Finally, there are also possible security gains, as Koopmans noted, by retaining personal data at a local level, rather than being sent to a cloud where it could be used in an unintended way.

A critical aspect of research therefore lies in investigating how the morphology, or structure, of individual neurons, circuits, applications, and large-scale architectures enables the desired level and type of computation, achieved through available fundamental components such as transistors and spintronic memories.

It could be said to be the usual suspects engaging in such research beyond Huawei, notably leading chipmakers. Intel has developed a chip called Loihi, which it describes as its fifth generation self-learning neuromorphic research test chip, introduced in November 2017. This is a 128-core design based on a specialized architecture that is fabricated on 14-nanometer process technology. The key design feature is operation around spiking neural networks (SNNs), developed specifically for arranging logic elements to emulate neural networks as understood to exist in brains of humans and indeed many animals.

The key property is adaptability or plasticity, the ability to learn from experience at the silicon level so that the overall networks become more capable, or smarter, over time. This is achieved by adding a concept that is known to exist in animal brains, that of activation whereby neurons fire only when their membrane electric charge exceeds a set threshold. At this point the neuron generates and transmits a signal which causes other neurons receiving it either to increase or decrease their own potentials as a result. This leads to coordinated activations and firings that can correspond with or execute cognitive processes.
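
The threshold-and-fire behaviour described there can be sketched as a leaky integrate-and-fire neuron — a textbook toy model, not Loihi's actual circuit; the leak and threshold values below are made up for illustration:

```python
def lif_step(v, i_in, leak=0.9, threshold=1.0):
    """One step of a leaky integrate-and-fire neuron.

    The membrane potential v leaks toward zero, integrates the input
    current, and emits a spike (resetting to zero) when it crosses
    the threshold -- the 'activation' behaviour the article describes.
    """
    v = v * leak + i_in
    if v >= threshold:
        return 0.0, 1  # reset potential, spike emitted
    return v, 0

# Drive the neuron with a constant input and watch it fire periodically.
v, spikes = 0.0, []
for t in range(10):
    v, s = lif_step(v, 0.4)
    spikes.append(s)
print(spikes)
```

Downstream neurons receiving such spikes raise or lower their own potentials in turn, which is the coordinated activate-and-fire dynamic the paragraph above is getting at.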

It can be seen then that such a system is a valuable tool for neuroscientists to investigate hypotheses, as well as a vehicle for cognitive computing R&D. There are various research projects working with such ideas, including the European Human Brain Project, which has designed its own chip and is working on a project called ‘BrainScaleS-2’.

The key point for Koopmans is that the underlying concepts are being proven and that the prizes are huge. “By trying to figure out whether processors can in some way copy the functions of the brain would, even on a small scale, represent a major advance,” said Koopmans. “For example, by replacing the commonly accepted processor architecture, with its separation between CPU and memory, the interconnection between the two being a major bottleneck in processor speeds, with in-memory processing, would be revolutionary.” This is why much of the R&D effort is focused on this area.

The biggest challenge facing this field is not so much at the level of technical design but scaling up for commercial deployment in the field. It is hard to overestimate the importance of, and dependence on, the testing and development ecosystems that have grown up around conventional chip development and manufacture. “Silicon processor chips are designed using CAD (computer-aided design) tools,” said Koopmans. “These tools don’t just allow for the design of the chip, they are also capable of simulating the performance. The investment in such tools is enormous because chip design complexity is increasing all the time.”

As a result, Koopmans admitted that despite the optimism, large scale deployment is a long way off. “What is clear is that the first step is to create the tools to both design and simulate these new chips, which can take years, and we’re still in the research stage.”
 
  • Like
  • Fire
  • Love
Reactions: 19 users