BRN Discussion Ongoing

Gies

Regular
I'll be watching out for a flying green baby from 40k ft this morning. It'll be my only indicator that the day is going well 🤪😂

Happy Christmas all.
I'll enjoy a pint or six for you all while at home in Ireland.

Akida Ballista baby 🔥🔥
66D59D3F-7743-4409-A256-A5B992783BF9.jpeg
 
  • Like
  • Haha
  • Fire
Reactions: 15 users

Esq.111

Fascinatingly Intuitive.
Good Morning Chippers,

Feeling good about today.

Australia, America & Europe have had time to digest and quantify all the great news last week.

Wooo hooo.

Regards,
Esq.
 
  • Like
  • Love
  • Fire
Reactions: 32 users

IloveLamp

Top 20
GREAT FIND GENEROUSLY SHARED, Mt09, MANY THANKS.

So Renesas, in a sign of deep and long-term commitment to Brainchip’s AKIDA technology, is specifically acquiring the tools to support AKIDA:

“Renesas Electronics is to acquire US startup Reality Analytics Inc. (Reality AI) for its embedded edge AI tools in an all-cash transaction”

As a result, as Anil Mankar has mentioned many times, Renesas is taping out a 22nm chip via a third-party foundry, with the intention to test the market at 40nm and above in house depending on market demand:

“Our next move is to more advanced technology nodes to push the microcontrollers into the gigahertz regime and that’s where there is overlap with microprocessors. The way I look at it is all about the system performance.”


“Now you have accelerators for driving AI with neural processing units rather than a dual core CPU. We are working with a third party taping out a device in December on 22nm CMOS,” said Chittipeddi.

Brainchip and Renesas signed a deal in December 2020 to implement the spiking neural network technology. Tools are vital for this new area. “The partner gives us the training tools that are needed,”

So the journey to every sensor being made intelligent continues.

My opinion only DYOR
FF

AKIDA BALLISTA
Nikkei Asia: TSMC fab in Japan at center of Sony's image sensor kingdom.

The move is just the first step in Sony's grand plan to bolster output of CMOS image sensors across Kyushu. At the group's Nagasaki facility, a fab that just opened last year is already undergoing a second expansion to bring additional facilities online as early as next year.

Sony also operates chip plants in Oita and Kagoshima prefectures, also in Kyushu.

This production network will rely on logic chips from TSMC's new fab, slated to begin full production in 2025 and to be run by a joint venture with Sony and Denso.

With the smartphone market having peaked in 2016 and 2017, and many consumers having upgraded to 5G handsets, demand for sensors in phones may slow, forcing Sony to cultivate new buyers for sensors.

The planned plant is expected to churn out sensors for smartphones, but it may have to start producing sensors for autonomous driving and factory automation applications down the road.

On top of this, setting up local infrastructure such as housing and schools for employees and their families will be important, as is talent acquisition.

Japan's chip sector fell behind foreign rivals after failing to keep investing in the development of next-generation products in the 2010s. But the global supply chain crises brought on by the pandemic have built momentum for a domestic revival with the newly formed Rapidus, a government-backed chipmaker set up by such top companies as NEC, Toyota Motor and Sony.

How public-private efforts can develop Kyushu into Japan's "Silicon Island" will hold the key to whether the country's much-anticipated chip-industry renaissance will materialize.
 
  • Like
  • Fire
  • Love
Reactions: 22 users
Hi
I know Samsung has been spoken about at various times but I hadn't actually seen a direct neuromorphic study or reference by them till now.

Maybe has been posted before(?) or not.

I just found they have their own R&D division...SAIT.

Here I found one section which clearly spells out essentially what we can offer them quite quickly for testing etc.

I wonder.


View attachment 24778
@Fullmoonfever

Over at the other place in 2020/21, after the CEO said a large South Korean technology company was extensively testing AKIDA, I proposed it was Samsung, because at that time they were the only ones who had a unit as you have referenced above, with over 100 engineers devoted to neuromorphic research back then.

Then fuel was added to this when Anil Mankar publicly thanked Samsung for lending Brainchip a DVS camera for research purposes and then Samsung white goods appeared in Brainchip promotional videos.

I personally believe Samsung will eventually be revealed as a customer, and would have been by now but for the imprisonment of their CEO by a corrupt government that was trying to take over Samsung. They were forced to release him and put him back in charge after Samsung employees basically went on what we would call a go-slow.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 55 users

IloveLamp

Top 20
Screenshot_20221219_081928_LinkedIn.jpg
Screenshot_20221219_081918_LinkedIn.jpg
 
  • Like
  • Fire
  • Love
Reactions: 59 users

Deadpool

hyper-efficient Ai
Nearly had a coronary episode when I saw this headline.
Obviously not us but just maybe we might have some collaboration??
Another fine example of an Aussie company kicking ass.

 
  • Like
  • Haha
  • Wow
Reactions: 25 users
How about a little nostalgia:

“As a leading provider of ASICs worldwide, we are pleased to offer our customers advanced technologies driving new innovations,” said Noriaki Kubo, Corporate Executive Vice President of Socionext Inc. “The Akida family of products allows us to stay at the forefront of the burgeoning AI market. BrainChip and Socionext have successfully collaborated on the Akida IC development and together, we aim to commercialize this product family and support our increasingly diverse customer base.”


My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 38 users

Mccabe84

Regular
I finally reached my goal on Friday for the number of shares in BRN I wanted to own, but now with all this information coming out I’m trying to decide if I should buy more 🤔. I also agree that I see more value in the shares now than when they hit $2.34, with so many great companies coming aboard.
 
  • Like
  • Fire
  • Love
Reactions: 52 users

TopCat

Regular
Nearly had a coronary episode when I saw this headline.
Obviously not us but just maybe we might have some collaboration??
Another fine example of an Aussie company kicking ass.

Maybe this could have something to do with it??

Abstract.
The use of robotic arms in various fields of human endeavor has increased over the years, and with recent advancements in artificial intelligence enabled by deep learning, they are increasingly being employed in medical applications such as assistive robots for paralyzed patients with neurological disorders, welfare robots for the elderly, and prostheses for amputees. However, robot arms tailored towards such applications are resource-constrained. As a result, they cannot handle deep learning with conventional artificial neural networks (ANNs), which are often run on GPUs with high computational complexity and high power consumption. Neuromorphic processors, on the other hand, leverage spiking neural networks (SNNs), which have been shown to be less computationally complex and to consume less power, making them suitable for such applications. Also, most robot arms, unlike living agents that combine different sensory data to accurately perform a complex task, use uni-modal data, which affects their accuracy. Conversely, multi-modal sensory data has been demonstrated to reach high accuracy and can be employed to improve accuracy in such robot arms. This paper presents the study of a multi-modal neurorobotic prosthetic arm control system based on a recurrent spiking neural network. The robot arm control system uses multi-modal sensory data from visual (camera) and electromyography sensors, together with spike-based data processing on our previously proposed R-NASH neuromorphic processor, to achieve robust, accurate control of a robot arm with low power. The evaluation results using both uni-modal and multi-modal input data show that the multi-modal input achieves a more robust performance at 87%, compared to the uni-modal input.
Keywords: Multi-modal, Deep Learning, Neurorobotic, Prosthetic Arm, Control System, Electromyography, Spiking Neural Network, Neuromorphic.
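
For anyone curious what "multi-modal input into a recurrent spiking network" might look like in code, here is a minimal, purely illustrative sketch in plain Python/NumPy. It is not the paper's R-NASH implementation (nor Akida's API); the rate coding, layer sizes, thresholds and random weights are all my own assumptions.

```python
# Minimal illustrative sketch (not the paper's R-NASH processor):
# rate-code camera and EMG feature vectors into spikes, feed them to a
# small recurrent layer of leaky integrate-and-fire (LIF) neurons, and
# read out a command class from the accumulated output activity.
import numpy as np

rng = np.random.default_rng(0)

N_CAM, N_EMG, N_HIDDEN, N_CLASSES, T_STEPS = 64, 8, 100, 5, 50
TAU, V_TH = 20.0, 1.0          # membrane time constant (in steps) and spike threshold

# Random weights stand in for trained ones.
w_in  = rng.normal(0, 0.3, (N_HIDDEN, N_CAM + N_EMG))
w_rec = rng.normal(0, 0.1, (N_HIDDEN, N_HIDDEN))
w_out = rng.normal(0, 0.3, (N_CLASSES, N_HIDDEN))

def encode_rate(features, t_steps):
    """Poisson-style rate coding: spike probability proportional to feature value."""
    p = np.clip(features, 0.0, 1.0)
    return (rng.random((t_steps, p.size)) < p).astype(float)

def run_snn(cam_feat, emg_feat):
    # Fuse the two modalities by presenting both spike trains to the same layer.
    spikes_in = np.hstack([encode_rate(cam_feat, T_STEPS),
                           encode_rate(emg_feat, T_STEPS)])
    v = np.zeros(N_HIDDEN)
    prev = np.zeros(N_HIDDEN)
    out_counts = np.zeros(N_CLASSES)
    for t in range(T_STEPS):
        v = v * (1 - 1 / TAU) + w_in @ spikes_in[t] + w_rec @ prev
        prev = (v >= V_TH).astype(float)     # spike where threshold crossed
        v[prev == 1] = 0.0                   # reset fired neurons
        out_counts += w_out @ prev
    return int(np.argmax(out_counts))        # predicted arm command / gesture

# Example: normalised camera and EMG feature vectors for one time window.
cam = rng.random(N_CAM)
emg = rng.random(N_EMG)
print("predicted command class:", run_snn(cam, emg))
```

The point of the sketch is simply that the camera and EMG features are turned into spike trains and fed into the same recurrent layer, so the readout sees both modalities at once, which is what the abstract credits for the more robust 87% result.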

D36E865F-A487-4E64-9FEA-20949A4C49EA.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 13 users

VictorG

Member
If Renesas, Intel and Brainchip are excited, have we finally reached a tipping point on this journey to commercial success???

My opinion only DYOR
FF

AKIDA BALLISTA
I think so. I also believe many big-name companies will now declare their love for BRN.
We've had a lot of wins throughout '22, but it is the Intel domino that will force all the others to fall in line.

BRN's momentum is now unstoppable.
 
  • Like
  • Fire
  • Love
Reactions: 43 users

buena suerte :-)

BOB Bank of Brainchip
I'll be watching out for a flying green baby from 40k ft this morning. It'll be my only indicator that the day is going well 🤪😂

Happy Christmas all.
I'll enjoy a pint or six for you all while at home in Ireland.

Akida Ballista baby 🔥🔥
St Patricks Day Party GIF by Guinness US
Cheers @AusEire ... Enjoy your 'cold' Xmas nights in front of the fire in the local pub, and maybe a 'lock-in' or two with a few musos! 🎸🎸 ;)🎄🤶🎅
 
Last edited:
  • Like
  • Love
Reactions: 11 users

Gies

Regular
Hi @Gies

Are you showing us how they make Heineken, or that celebrating drunk German Brainchip shareholders are leaving beer bottles all over the ski fields? 😂🤣😂🤡🤣😂

Merry Christmas.

Regards
FF

AKIDA BALLISTA


This is how the Dutch make Northern Light. AKIDA must be inside.
 
  • Haha
  • Love
  • Like
Reactions: 7 users

AARONASX

Holding onto what I've got
Has this been shared? I haven't come across these guys before, but we get a small mention.

Giant.AI, Inc.

US 11478927 B1 (not BrainChips patent)

Hybrid Computing Architectures With Specialized Processors To Encode/decode Latent Representations For Controlling Dynamic Mechanical Systems​

(109) Based on the matching, a hardware machine-learning accelerator may be configured to execute operations of a machine learning model upon inputs received from one or more sensors or encoders. For example, some embodiments of robots and other controlled dynamic mechanical systems described herein may include a plurality of sensors of a modular system hardware design such that each sensor (or a grouping of sensors) is coupled (directly, in some examples) with special-purpose chipsets for performing a space (e.g., like a sub-space or latent-space) or other encoding of sensor data prior to downstream digestion by a higher-level component or model of the system. Moreover, one or more intermediate or downstream models, like various models for encoding inputs, may operate on those encoded outputs to combine sub-spaces into broader representations (which is not to suggest that the broader representation need be of higher dimensionality or size, but rather that it accounts for more properties in aggregate that are reported by sensors of the sensor layer). One or more of the upstream, intermediate (or downstream) encoders may be implemented within one or more hardware ML Accelerators like, but not limited to, Movidius chips, tensorflow edge compute devices, Nvidia Drive PX and Jetson TX1/TX2 Module, Intel Nervana processors, Mobileye EyeQ processors, Habana processors, Qualcomm's Cloud AI100 processors and SoC AI engines, IBM's TrueNorth processors, NXP's S32V234 and S32 chips, AWS Inferentia chips, Microsoft Brainwaive chips, Apple's Neural Engine, ARM's Project Trillium based processors, Cerebras's processors, Graphcore processors, PEZY Computing processors, Tenstorrent processors, Blaize processors, Adapteva processors, Mythic processors, Kalray's Massively Parallel Processor Array, BrainChip's spiking neural network processors, Almotiv's neural network acceleration core, Hailo-8 processors, and various neural network processing units from other vendors. Different ones of these ML Accelerators may be used to implement different ones of the aforementioned models upon sensor data (or upstream encoder output data), such as based on matching of model performance on a given accelerator for given sensor output.
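
For what it's worth, the architecture paragraph (109) describes, a dedicated encoder per sensor producing a latent (sub-space) representation, with a downstream model combining those sub-spaces into a broader representation, can be sketched roughly as below. This is only my reading of the claim: the class names, dimensions and linear encoders are illustrative assumptions, and in the patent each encoder would be mapped onto a matched hardware ML accelerator (Akida, Movidius, etc.) rather than run in plain NumPy.

```python
# Rough sketch of the hybrid architecture described in paragraph (109):
# each sensor gets its own encoder that maps raw readings into a small
# latent vector; a downstream combiner concatenates those sub-spaces into
# a broader representation for the controller. Class names, dimensions
# and the linear encoders are illustrative assumptions, not the patent's.
import numpy as np

rng = np.random.default_rng(1)

class SensorEncoder:
    """Stands in for a special-purpose chipset coupled to one sensor."""
    def __init__(self, in_dim: int, latent_dim: int):
        self.w = rng.normal(0, 0.1, (latent_dim, in_dim))

    def encode(self, reading: np.ndarray) -> np.ndarray:
        return np.tanh(self.w @ reading)      # latent / sub-space encoding

class DownstreamCombiner:
    """Combines per-sensor sub-spaces into one broader representation."""
    def combine(self, latents: list) -> np.ndarray:
        return np.concatenate(latents)

# One encoder per sensor modality on a hypothetical robot arm.
encoders = {
    "joint_camera": SensorEncoder(in_dim=1024, latent_dim=16),
    "force_torque": SensorEncoder(in_dim=6,    latent_dim=4),
    "joint_angles": SensorEncoder(in_dim=7,    latent_dim=4),
}

readings = {
    "joint_camera": rng.random(1024),
    "force_torque": rng.random(6),
    "joint_angles": rng.random(7),
}

latents = [encoders[name].encode(readings[name]) for name in encoders]
state = DownstreamCombiner().combine(latents)
print("broader representation for the controller:", state.shape)   # (24,)
```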
 

Attachments

  • US_11478927_B1_I.pdf
    916.7 KB · Views: 82
  • Like
  • Fire
  • Love
Reactions: 35 users

alwaysgreen

Top 20
  • Like
  • Fire
Reactions: 8 users
Has this been shared? I haven't come across these guys before, but we get a small mention.

Giant.AI, Inc.

US 11478927 B1 (not BrainChips patent)

Hybrid Computing Architectures With Specialized Processors To Encode/decode Latent Representations For Controlling Dynamic Mechanical Systems​

(109) Based on the matching, a hardware machine-learning accelerator may be configured to execute operations of a machine learning model upon inputs received from one or more sensors or encoders. For example, some embodiments of robots and other controlled dynamic mechanical systems described herein may include a plurality of sensors of a modular system hardware design such that each sensor (or a grouping of sensors) is coupled (directly, in some examples) with special-purpose chipsets for performing a space (e.g., like a sub-space or latent-space) or other encoding of sensor data prior to downstream digestion by a higher-level component or model of the system. Moreover, one or more intermediate or downstream models, like various models for encoding inputs, may operate on those encoded outputs to combine sub-spaces into broader representations (which is not to suggest that the broader representation need be of higher dimensionality or size, but rather that it accounts for more properties in aggregate that are reported by sensors of the sensor layer). One or more of the upstream, intermediate (or downstream) encoders may be implemented within one or more hardware ML Accelerators like, but not limited to, Movidius chips, tensorflow edge compute devices, Nvidia Drive PX and Jetson TX1/TX2 Module, Intel Nervana processors, Mobileye EyeQ processors, Habana processors, Qualcomm's Cloud AI100 processors and SoC AI engines, IBM's TrueNorth processors, NXP's S32V234 and S32 chips, AWS Inferentia chips, Microsoft Brainwaive chips, Apple's Neural Engine, ARM's Project Trillium based processors, Cerebras's processors, Graphcore processors, PEZY Computing processors, Tenstorrent processors, Blaize processors, Adapteva processors, Mythic processors, Kalray's Massively Parallel Processor Array, BrainChip's spiking neural network processors, Almotiv's neural network acceleration core, Hailo-8 processors, and various neural network processing units from other vendors. Different ones of these ML Accelerators may be used to implement different ones of the aforementioned models upon sensor data (or upstream encoder output data), such as based on matching of model performance on a given accelerator for given sensor output.
Hi @AARONASX

In their video they say their robots are cheaper and easier to train. If you throw in the general industry requirement for the lowest power possible to give usable battery life, then of the huge list of accelerators, GPUs and CPUs, only one matches these criteria, and no prize for guessing which one.

GREAT PICK-UP. Will be watching closely in 2023.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
Reactions: 24 users

Sam

Nothing changes if nothing changes
If Renesas, Intel and Brainchip are excited, have we finally reached a tipping point on this journey to commercial success???

My opinion only DYOR
FF

AKIDA BALLISTA
I’ll go out on a limb and say yes! 😉🤯
 
  • Like
  • Fire
  • Love
Reactions: 15 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Hi

@Fullmoonfever

Over at the other place in 2020/21, after the CEO said a large South Korean technology company was extensively testing AKIDA, I proposed it was Samsung, because at that time they were the only ones who had a unit as you have referenced above, with over 100 engineers devoted to neuromorphic research back then.

Then fuel was added to this when Anil Mankar publicly thanked Samsung for lending Brainchip a DVS camera for research purposes and then Samsung white goods appeared in Brainchip promotional videos.

I personally believe Samsung will eventually be revealed as a customer, and would have been by now but for the imprisonment of their CEO by a corrupt government that was trying to take over Samsung. They were forced to release him and put him back in charge after Samsung employees basically went on what we would call a go-slow.

My opinion only DYOR
FF

AKIDA BALLISTA

Very cool FF. You know I only just realised this morning that Harman is a Samsung subsidiary. It just so happens that Harman and SoundHound have teamed up to deliver advanced voice AI. I really like the sound of this part of the announcement from November 2022 where it says,

"The deal grew out of SoundHound’s recent contract with DPCA’s parent brand Stellantis in Europe and incorporates its custom wake words and Edge+Cloud connectivity options combining on-device and cloud processing. The company also counts Hyundai, Kia, and Mercedes among its vehicular clients. The company also opened up its hybrid model to more than just cars, offering tools to embed a voice assistant with a range of cloud or on-device processing options in any compatible device."

It's not a matter of if, but when. And all the dots will join into a big conga-line of epic proportions IMO.
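
Purely to illustrate what "Edge+Cloud connectivity options combining on-device and cloud processing" tends to mean in practice, here is a toy routing sketch. The function names, thresholds and behaviour are hypothetical; this is not SoundHound's (or anyone's) actual API.

```python
# Toy sketch of a hybrid edge+cloud voice pipeline: the wake word is
# detected on-device; the follow-up query is answered locally when a
# small on-device model is confident (or the network is down), and sent
# to the cloud otherwise. All names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class EdgeResult:
    text: str
    confidence: float

def detect_wake_word(audio_frame: bytes, wake_word: str = "hey car") -> bool:
    # Placeholder: a real system runs a tiny always-on model here.
    return wake_word.encode() in audio_frame

def transcribe_on_device(audio: bytes) -> EdgeResult:
    # Placeholder for a compact on-device speech model.
    return EdgeResult(text="turn on the heater", confidence=0.92)

def transcribe_in_cloud(audio: bytes) -> str:
    # Placeholder for a round trip to a cloud ASR/NLU service.
    return "turn on the heater and set it to 22 degrees"

def handle_utterance(audio: bytes, online: bool, edge_threshold: float = 0.85) -> str:
    local = transcribe_on_device(audio)
    if local.confidence >= edge_threshold or not online:
        return f"[edge] {local.text}"
    return f"[cloud] {transcribe_in_cloud(audio)}"

frame = b"... hey car turn on the heater ..."
if detect_wake_word(frame):
    print(handle_utterance(frame, online=True))
```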


fraggle-rock-conga-line.gif






Screen Shot 2022-12-19 at 9.45.52 am.png
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 53 users
Very cool FF. You know I only just realised this morning that Harman is a Samsung subsidiary. It just so happens that Harman and SoundHound have teamed up to deliver advanced voice AI. I really like the sound of this part of the announcement from November 2022 where it says,

"The deal grew out of SoundHound’s recent contract with DPCA’s parent brand Stellantis in Europe and incorporates its custom wake words and Edge+Cloud connectivity options combining on-device and cloud processing. The company also counts Hyundai, Kia, and Mercedes among its vehicular clients. The company also opened up its hybrid model to more than just cars, offering tools to embed a voice assistant with a range of cloud or on-device processing options in any compatible device."

It's not a matter of if, but when. And all the dots will join into a big conga-line of epic proportions IMO.


View attachment 24799





View attachment 24794
You are definitely running with this SoundHound bizzzo. I'll have to pull my finger out and do some digging and see what I can unearth 💩🤔🔥
 
  • Like
  • Haha
  • Fire
Reactions: 10 users