BRN Discussion Ongoing

zeeb0t

Administrator
Staff member
Hello everyone, sorry to interrupt the conversation. I was just wondering how things have been going since Dreddb0t was introduced. I've noticed that it has dealt with all reported posts since inception.

Also, please keep in mind that if you come across any content that you think violates the rules, you can click on the report button and Dreddb0t will make a decision within a minute, regardless of the time or day.
 
  • Like
  • Love
  • Fire
Reactions: 57 users

Diogenese

Top 20
Hello everyone, sorry to interrupt the conversation. I was just wondering how things have been going since Dreddb0t was introduced. I've noticed that it has dealt with all reported posts since inception.

Also, please keep in mind that if you come across any content that you think violates the rules, you can click on the report button and Dreddb0t will make a decision within a minute, regardless of the time or day.
Like cricket umpires, they are doing a good job if no one notices them.
 
  • Like
  • Haha
  • Love
Reactions: 22 users
Come on Littleshort, tell us what Michael said???

Tell us please, the suspense is killing me. 😂🤣😂

Hi @Fact Finder - his comment was simply tagging in Blake Eastman from Nonverbal Group, which in itself is interesting in terms of the founder of this particular company being tagged into a BrainChip post.

More interesting is the inclusion of Kyongsik Yun on the Nonverbal Group website.

Kyongsik specialises in machine learning at NASA Jet Propulsion Laboratory.

Kyongsik Yun is a technologist at the Jet Propulsion Laboratory, California Institute of Technology, and a senior member of the American Institute of Aeronautics and Astronautics (AIAA).

His research focuses on building brain-inspired technologies and systems, including deep learning computer vision, natural language processing, brain-computer interfaces, and noninvasive remote neuromodulation.


What does it all mean? What is the link? I have no idea. The website is not phone friendly; maybe it is clearer viewed on a PC.


[Screenshots of the Nonverbal Group website]
 
  • Like
  • Fire
Reactions: 40 users

HopalongPetrovski

I'm Spartacus!
Hello everyone, sorry to interrupt the conversation. I was just wondering how things have been going since Dreddb0t was introduced. I've noticed that it has dealt with all reported posts since inception.

Also, please keep in mind that if you come across any content that you think violates the rules, you can click on the report button and Dreddb0t will make a decision within a minute, regardless of the time or day.
All good Z.
Can you see if he can do anything for the share price?
Maybe smack down some of the shortees? 🤣
 
  • Haha
  • Like
  • Fire
Reactions: 21 users
Hi @Fact Finder - his comment was simply tagging in Blake Eastman from Nonverbal Group, which in itself is interesting in terms of the founder of this particular company being tagged into a BrainChip post.

More interesting is the inclusion of Kyongsik Yun on the Nonverbal Group website.

Kyongsik specialises in machine learning at NASA Jet Propulsion Laboratory.

Kyongsik Yun is a technologist at the Jet Propulsion Laboratory, California Institute of Technology, and a senior member of the American Institute of Aeronautics and Astronautics (AIAA).

His research focuses on building brain-inspired technologies and systems, including deep learning computer vision, natural language processing, brain-computer interfaces, and noninvasive remote neuromodulation.


What does it all mean? What is the link? I have no idea. The website is not phone friendly; maybe it is clearer viewed on a PC.


[Screenshots of the Nonverbal Group website]
Thanks @thelittleshort

In desperation I resorted to following the science, which has served me well across my investments.

As a result I found this very recently published paper out of China.

While I have included the link, I don’t think one needs to read more than the enclosed abstract to understand why AKIDA second generation with vision transformers has excited so many, particularly when you appreciate that until AKIDA 2nd gen even Brainchip said that the reason they included CNN in their SNN design was that CNN had a clear advantage in vision processing.

According to this research paper, bringing SNNs together with vision transformers means this is no longer the case:

Deep Spiking Neural Networks with High Representation Similarity Model Visual Pathways of Macaque and Mouse

Liwei Huang, Zhengyu Ma, Liutao Yu, Huihui Zhou, Yonghong Tian
arXiv preprint arXiv:2303.06060, 2023
Deep artificial neural networks (ANNs) play a major role in modeling the visual pathways of primate and rodent. However, they highly simplify the computational properties of neurons compared to their biological counterparts. Instead, Spiking Neural Networks (SNNs) are more biologically plausible models since spiking neurons encode information with time sequences of spikes, just like biological neurons do. However, there is a lack of studies on visual pathways with deep SNNs models. In this study, we model the visual cortex with deep SNNs for the first time, and also with a wide range of state-of-the-art deep CNNs and ViTs for comparison. Using three similarity metrics, we conduct neural representation similarity experiments on three neural datasets collected from two species under three types of stimuli. Based on extensive similarity analyses, we further investigate the functional hierarchy and mechanisms across species. Almost all similarity scores of SNNs are higher than their counterparts of CNNs with an average of 6.6%. Depths of the layers with the highest similarity scores exhibit little differences across mouse cortical regions, but vary significantly across macaque regions, suggesting that the visual processing structure of mice is more regionally homogeneous than that of macaques. Besides, the multi-branch structures observed in some top mouse brain-like neural networks provide computational evidence of parallel processing streams in mice, and the different performance in fitting macaque neural representations under different stimuli exhibits the functional specialization of information processing in macaques. Taken together, our study demonstrates that SNNs could serve as promising candidates to better model and explain the functional hierarchy and mechanisms of the visual system.
View at arxiv.org
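For anyone wondering what a "neural representation similarity" score actually measures, below is a minimal sketch of linear CKA (centred kernel alignment), one common metric of this family. I don't know which three metrics the authors actually used, so treat the function, names and shapes here as illustrative only.

```python
# Minimal sketch of linear CKA between a model layer's activations and
# recorded neural responses. Purely illustrative; not the paper's code.
import numpy as np

def linear_cka(X: np.ndarray, Y: np.ndarray) -> float:
    """X: (n_stimuli, n_model_units), Y: (n_stimuli, n_recorded_neurons)."""
    X = X - X.mean(axis=0)          # centre each feature
    Y = Y - Y.mean(axis=0)
    # ||Y^T X||_F^2, normalised by each representation's self-similarity
    num = np.linalg.norm(Y.T @ X, ord="fro") ** 2
    den = (np.linalg.norm(X.T @ X, ord="fro")
           * np.linalg.norm(Y.T @ Y, ord="fro"))
    return float(num / den)

# Toy usage: two random "representations" of 100 stimuli.
rng = np.random.default_rng(0)
acts_snn = rng.standard_normal((100, 256))   # hypothetical SNN layer activations
acts_brain = rng.standard_normal((100, 80))  # hypothetical recorded responses
print(linear_cka(acts_snn, acts_brain))      # near 0 for unrelated data, 1 if identical
```

A higher score means the model's layer carries stimulus information organised more like the recorded brain region, which is the sense in which the paper says SNN scores beat CNN scores by an average of 6.6%.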

This reveal of course does more than just confirm the reason for such excitement; it also gives weight to Peter van der Made’s statement that with the release of AKIDA 2nd gen the roughly 3 year lead would extend out to about 5 YEARS.

Once Edge Impulse starts to publicly demonstrate the vision leap made possible by AKIDA 2nd gen things will get very exciting I suspect.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 37 users

Deleted member 118

Guest
Missed day 1 and 2 highlights



 
  • Like
Reactions: 6 users
An article about TSMC's thoughts on China vs the US and its impact on the rest of the world, and how Taiwan is caught up in the middle of it all:


"China’s semiconductor industry still lags behind that of Taiwan by about five or six years in terms of technology, according to Chang." Morris Chang is the founder of TSMC.
 
  • Like
  • Fire
Reactions: 20 users

TopCat

Regular
So much great information and links being posted here lately that I’m sure this has already been posted and I just can’t keep up, but just in case it hasn’t been:


DALLAS, March 15, 2023 /PRNewswire/ -- To build on innovations that advance intelligence at the edge, Texas Instruments (TI) (Nasdaq: TXN) today introduced a new family of six Arm® Cortex®-based vision processors that allow designers to add more vision and artificial intelligence (AI) processing at a lower cost, and with better energy efficiency, in applications such as video doorbells, machine vision and autonomous mobile robots.

This new family, which includes the AM62A, AM68A and AM69A processors, is supported by open-source evaluation and model development tools, and common software that is programmable through industry-standard application programming interfaces (APIs), frameworks and models. This platform of vision processors, software and tools helps designers easily develop and scale edge AI designs across multiple systems while accelerating time to market. For more information, see www.ti.com/edgeai-pr.

"In order to achieve real-time responsiveness in the electronics that keep our world moving, decision-making needs to happen locally and with better power efficiency," said Sameer Wasson, vice president, Processors, Texas Instruments. "This new processor family of affordable, highly integrated SoCs will enable the future of embedded AI by allowing for more cameras and vision processing in edge applications."
 
  • Like
  • Fire
  • Love
Reactions: 20 users

Diogenese

Top 20
Thanks @thelittleshort

In desperation I resorted to following the science, which has served me well across my investments.

As a result I found this very recently published paper out of China.

While I have included the link, I don’t think one needs to read more than the enclosed abstract to understand why AKIDA second generation with vision transformers has excited so many, particularly when you appreciate that until AKIDA 2nd gen even Brainchip said that the reason they included CNN in their SNN design was that CNN had a clear advantage in vision processing.

According to this research paper, bringing SNNs together with vision transformers means this is no longer the case:

Deep Spiking Neural Networks with High Representation Similarity Model Visual Pathways of Macaque and Mouse

Liwei Huang, Zhengyu Ma, Liutao Yu, Huihui Zhou, Yonghong Tian
arXiv preprint arXiv:2303.06060, 2023
Deep artificial neural networks (ANNs) play a major role in modeling the visual pathways of primate and rodent. However, they highly simplify the computational properties of neurons compared to their biological counterparts. Instead, Spiking Neural Networks (SNNs) are more biologically plausible models since spiking neurons encode information with time sequences of spikes, just like biological neurons do. However, there is a lack of studies on visual pathways with deep SNNs models. In this study, we model the visual cortex with deep SNNs for the first time, and also with a wide range of state-of-the-art deep CNNs and ViTs for comparison. Using three similarity metrics, we conduct neural representation similarity experiments on three neural datasets collected from two species under three types of stimuli. Based on extensive similarity analyses, we further investigate the functional hierarchy and mechanisms across species. Almost all similarity scores of SNNs are higher than their counterparts of CNNs with an average of 6.6%. Depths of the layers with the highest similarity scores exhibit little differences across mouse cortical regions, but vary significantly across macaque regions, suggesting that the visual processing structure of mice is more regionally homogeneous than that of macaques. Besides, the multi-branch structures observed in some top mouse brain-like neural networks provide computational evidence of parallel processing streams in mice, and the different performance in fitting macaque neural representations under different stimuli exhibits the functional specialization of information processing in macaques. Taken together, our study demonstrates that SNNs could serve as promising candidates to better model and explain the functional hierarchy and mechanisms of the visual system.
View at arxiv.org

This reveal of course does more than just confirm the reason for such excitement; it also gives weight to Peter van der Made’s statement that with the release of AKIDA 2nd gen the roughly 3 year lead would extend out to about 5 YEARS.

Once Edge Impulse starts to publicly demonstrate the vision leap made possible by AKIDA 2nd gen things will get very exciting I suspect.

My opinion only DYOR
FF

AKIDA BALLISTA
Rule 1 of flat-pack assembly:
When you have tried everything else, read the instructions.
 
Last edited:
  • Haha
  • Like
  • Love
Reactions: 21 users
Hi @Fact Finder - his comment was simply tagging in Blake Eastman from Nonverbal Group, which in itself is interesting in terms of the founder of this particular company being tagged into a BrainChip post.

Confirmation from Michael below that it’s simply a case of him being mates with both Adnan Boz from NVIDIA and Blake Eastman from Nonverbal Group.

The silver lining is Michael has over 16,000 followers - so the BrainChip post will be seen by many who potentially would not have seen it otherwise.


[Screenshot of Michael's reply]
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 31 users
These researchers in the Netherlands set out to create an autonomous drone using Loihi.

They succeeded and published this paper on 13.3.23, which is great, but it also proves by implication that Loihi is no match for AKIDA 1, 1.5 or 2. As a side note, they also rule out Nvidia's Jetson Nano as ever being in the race for a whole lot of reasons, not least its need for 5 to 10 watts of power. They also pour icy water on analogue SNNs coming to their rescue:

DISCUSSION AND CONCLUSION
We presented the first fully neuromorphic vision-to-control pipeline for controlling a freely flying drone. Specifically, we trained a spiking neural network that takes in high-dimensional raw event-based camera data and produces low-level control commands. Real-world experiments demonstrated a successful sim-to-real transfer: the drone can accurately follow various ego-motion setpoints, performing hovering, landing, and lateral maneuvers—even under constant yaw rate.
Our study confirms the potential of a fully neuromorphic vision-to-control pipeline by running on board with an execution frequency of 200 Hz, spending only 27 μJ per network inference. However, there are still important hurdles on the way to reaping the full system benefits of such a pipeline, embedding it on extremely lightweight (e.g., <30 g) drones.
For reaching the full potential, the entire drone sensing, processing, and actuation hardware should be neuromorphic, from its accelerometer sensors to the processor and motors. Such hardware is currently not available, so we have limited ourselves to the vision-to-control pipeline, ending at thrust and attitude commands. Concerning the neuromorphic processor, the biggest advancement could come from improved I/O bandwidth and interfacing options. The current processor could not be connected to the event-based camera directly via AER, and with our advanced use case, we reached the limits of the number of spikes that can be sent to and received from the neuromorphic processor at the desired high execution frequency. This is also the reason that we have limited ourselves to a linear network controller: the increase in input spikes needed to encode the setpoint and attitude inputs would substantially reduce the execution frequency of the pipeline. Ultimately, further gains in terms of efficiency could be obtained when moving from digital neuromorphic processors to analog hardware, but this will pose even larger development and deployment challenges.
Despite the above-mentioned limitations, the current work presents a substantial step towards neuromorphic sensing and processing for drones. The results are encouraging, because they show that neuromorphic sensing and processing may bring deep neural networks within reach of small autonomous robots. In time this may allow them to approach the agility, versatility and robustness of animals such as flying insects.


https://arxiv.org/pdf/2303.08778
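As a rough sanity check on those figures, 27 μJ per inference at 200 Hz works out to only a few milliwatts for the network itself, versus the 5 to 10 watts the authors cite for the Jetson Nano. A few lines of arithmetic, using only the numbers quoted above:

```python
# Back-of-the-envelope power comparison from the paper's own figures.
energy_per_inference_j = 27e-6   # 27 µJ per network inference, from the paper
rate_hz = 200                    # inferences per second, from the paper

power_w = energy_per_inference_j * rate_hz
print(f"SNN controller: {power_w * 1e3:.1f} mW")   # 5.4 mW

for jetson_w in (5.0, 10.0):
    print(f"vs Jetson Nano at {jetson_w:.0f} W: "
          f"{jetson_w / power_w:,.0f}x more power")  # roughly 900x to 1900x
```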

Add what ANT61 is publishing and Brainchip has truly crushed the opposition; however, most have not yet received the memo to present themselves to the crematorium.

While AKIDA 1.0, 1.5 & 2.0 all address the shortcomings that Loihi posed, including direct compatibility with DVS camera feeds, just imagine what AKIDA technology and Prophesee's vision sensor will be capable of. And this is before the AKIDA with vision transformers steps up to the plate.
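To make the DVS-to-SNN fit concrete, here is a toy leaky integrate-and-fire layer driven by sparse event-style input: events arrive as spikes and can drive the neuron state directly, with no frame reconstruction in between. All parameters are made up for illustration; this is not Akida's or Loihi's actual neuron model.

```python
# Toy leaky integrate-and-fire (LIF) layer fed by sparse, event-camera-style
# input. Illustrative only; parameters and shapes are arbitrary.
import numpy as np

def lif_step(v, spikes_in, weights, decay=0.9, threshold=1.0):
    """One timestep: leak, integrate weighted input spikes, fire, reset."""
    v = decay * v + weights @ spikes_in      # leak + integrate
    fired = v >= threshold                   # spike where threshold crossed
    v = np.where(fired, 0.0, v)              # reset neurons that fired
    return v, fired.astype(float)

rng = np.random.default_rng(1)
n_in, n_out = 64, 8
w = rng.standard_normal((n_out, n_in)) * 0.3
v = np.zeros(n_out)
for t in range(100):
    events = (rng.random(n_in) < 0.05).astype(float)  # sparse DVS-like events
    v, out_spikes = lif_step(v, events, w)
```

Because the input is mostly zeros at any timestep, a spiking processor only does work where events occur, which is where the energy numbers quoted above come from.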

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 48 users

Diogenese

Top 20
So much great information and links being posted here lately that I’m sure this has already been posted and I just can’t keep up, but just in case it hasn’t been:


DALLAS, March 15, 2023 /PRNewswire/ -- To build on innovations that advance intelligence at the edge, Texas Instruments (TI) (Nasdaq: TXN) today introduced a new family of six Arm® Cortex®-based vision processors that allow designers to add more vision and artificial intelligence (AI) processing at a lower cost, and with better energy efficiency, in applications such as video doorbells, machine vision and autonomous mobile robots.

This new family, which includes the AM62A, AM68A and AM69A processors, is supported by open-source evaluation and model development tools, and common software that is programmable through industry-standard application programming interfaces (APIs), frameworks and models. This platform of vision processors, software and tools helps designers easily develop and scale edge AI designs across multiple systems while accelerating time to market. For more information, see www.ti.com/edgeai-pr.

"In order to achieve real-time responsiveness in the electronics that keep our world moving, decision-making needs to happen locally and with better power efficiency," said Sameer Wasson, vice president, Processors, Texas Instruments. "This new processor family of affordable, highly integrated SoCs will enable the future of embedded AI by allowing for more cameras and vision processing in edge applications."
At least up to mid-2021, with one exception, TI's patents indicate a conviction that ML was a software thing.

The exception, US2022108203A1 MACHINE LEARNING HARDWARE ACCELERATOR 2020-10-01, was only a little more enlightened, still adhering to the von Neumann architecture:

[Patent drawing from US2022108203A1]



In a memory device, a static random access memory (SRAM) circuit includes an array of SRAM cells arranged in rows and columns and configured to store data. The SRAM array is configured to: store a first set of information for a machine learning (ML) process in a lookup table in the SRAM array; and consecutively access, from the lookup table, information from a selected set of the SRAM cells along a row of the SRAM cells. A memory controller circuit is configured to select the set of the SRAM cells based on a second set of information for the ML process.

[0004] In another aspect, a system includes one or more microprocessors coupled to a memory circuit. The memory circuit includes static random access memory (SRAM) circuit including an array of SRAM cells arranged in rows and columns and configured to store data, the SRAM array configured to: store a first set of information for a machine learning (ML) process in a lookup table in the SRAM array; and consecutively access, from the lookup table, information from a selected set of the SRAM cells along a row of the SRAM cells. A memory controller circuit is configured to select the set of the SRAM cells based on a second set of information for the ML process.
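To see why a lookup table can stand in for multiply hardware, here is a minimal sketch of the general idea: with low-bit quantised weights and activations, every possible product can be precomputed and stored in SRAM, so a dot product becomes table reads plus additions. This is purely illustrative and does not reproduce the patent's actual cell organisation.

```python
# Lookup-table style "multiplication": precompute all products of 4-bit codes
# (the table plays the role of the patent's SRAM lookup table), then do dot
# products with reads and adds only. Illustrative sketch, not the patent.
import numpy as np

BITS = 4
LEVELS = 2 ** BITS

# All possible weight x activation products, stored once.
lut = np.outer(np.arange(LEVELS), np.arange(LEVELS))  # (16, 16) table

def lut_dot(w_idx: np.ndarray, a_idx: np.ndarray) -> int:
    """Dot product of 4-bit codes done purely by table lookups and adds."""
    return int(lut[w_idx, a_idx].sum())

rng = np.random.default_rng(2)
w_codes = rng.integers(0, LEVELS, size=32)   # quantised weight codes
a_codes = rng.integers(0, LEVELS, size=32)   # quantised activation codes
assert lut_dot(w_codes, a_codes) == int((w_codes * a_codes).sum())
```

The point Diogenese is making stands either way: the data still shuttles between memory and processor every cycle, which is exactly the von Neumann bottleneck neuromorphic designs avoid.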
 
  • Like
  • Love
Reactions: 10 users
More interestingly is the inclusion of Kyongsik Yun on the Nonverbal Group website

Kyongsik specialises in machine learning at NASA Jet Propulsion Laboratory.



*rolls up sleeves…
 
  • Like
  • Haha
  • Fire
Reactions: 24 users

Deleted member 118

Guest
So much great information and links being posted here lately that I’m sure this has already been posted and I just can’t keep up, but just in case it hasn’t been:


DALLAS, March 15, 2023 /PRNewswire/ -- To build on innovations that advance intelligence at the edge, Texas Instruments (TI) (Nasdaq: TXN) today introduced a new family of six Arm® Cortex®-based vision processors that allow designers to add more vision and artificial intelligence (AI) processing at a lower cost, and with better energy efficiency, in applications such as video doorbells, machine vision and autonomous mobile robots.

This new family, which includes the AM62A, AM68A and AM69A processors, is supported by open-source evaluation and model development tools, and common software that is programmable through industry-standard application programming interfaces (APIs), frameworks and models. This platform of vision processors, software and tools helps designers easily develop and scale edge AI designs across multiple systems while accelerating time to market. For more information, see www.ti.com/edgeai-pr.

"In order to achieve real-time responsiveness in the electronics that keep our world moving, decision-making needs to happen locally and with better power efficiency," said Sameer Wasson, vice president, Processors, Texas Instruments. "This new processor family of affordable, highly integrated SoCs will enable the future of embedded AI by allowing for more cameras and vision processing in edge applications."
 
  • Like
  • Fire
Reactions: 4 users
Hi @Diogenese, you might find this interesting if you have not seen it yet:


You might end up being able to use your ceramic bowl for fruit again.

My opinion only DYOR - just in case you try to blame me for your house burning down. :ROFLMAO:😂🤣
FF

AKIDA BALLISTA
 
  • Like
  • Haha
Reactions: 5 users

Steve10

Regular
Sony Spresense AI MCUs can be developed on the Edge Impulse platform.



Spresense 6-core microcontroller board with ultra-low power consumption

High-performance microcontroller board with hi-res audio, camera input, internal GPS and Edge AI support.

 
  • Like
  • Fire
  • Love
Reactions: 6 users
I have just had another look at the new Brainchip website and on the first page under the heading "IN GOOD COMPANY" appears the following list and a bar to click if you too would like to become a "Partner."

The list on the home page is a banner that revolves but now lists the following companies:

"Prophesee, Renesas, Valeo, NASA, MegaChips. NVISO, EMOTION3D, Edge Impulse, Ai Labs, SiFive, Mercedes Benz, ARM, Intel"

As it is a banner, no one is first in line.

When you click on 'Partner' this is what now appears:


Technology Partners




Technology partners improve vertical value with demonstrated pre-integration interoperability. Our neuromorphic IP is processor and OS independent. Let’s demonstrate the power of Essential AI together.
We work with our partners to support integration and market-relevant use cases.
BrainChip is a member of the Arm AI Partner Program, an ecosystem of hardware and software specialists enabling the next generation of intelligent AI solutions.

BrainChip is a partner with Intel Foundry Services – IP Alliance. Partners in this alliance collaborate with IFS to enable designers to access high-quality IPs, supporting their design needs and project schedule, while optimizing for performance, power and area. Building upon Intel’s advanced technology offering.

BrainChip partners with Prophesee driving the optimization for computer vision AI performance and efficiency to deliver next generation intelligent platforms for OEM’s looking to integrate event-based vision systems with high levels of AI performance coupled with an ultra-low power framework.

Integrating BrainChip’s Akida technology and SiFive’s multi-core capable RISC-V processors will provide a highly efficient solution for integrated edge AI compute.


Enablement Partners




Enablement partners optimize the performance of their AI/ML development, modeling, or inference software with Akida integration. Pre-integration provides a vertically complete solution to simplify evaluation and implementation.
We work with our partners to create integrated AI solutions that are rich in capability with unparalleled performance.


The Minsky AI Engine from AI Labs and the ultra efficient sensory inference capabilities of BrainChip’s Akida™ provides a compelling and cost-effective solution in system health monitoring in industrial efficiency and productivity. Detecting anomalies, identifying challenges, analyzing impacts are easily addressed with AI Labs and BrainChip working seamlessly together.

Combining BrainChip’s Akida technology and the Edge Impulse platform, tools, and services allows current and future customers to achieve their ML objectives with fast and efficient development cycles to get to market quicker and achieve a competitive advantage.

BrainChip’s technology combined with emotion3D’s state-of-the-art computer vision and machine learning software for image-based analysis of in-cabin environments enables our mutual customers to achieve an ultra-low-power working environment with on-chip learning while processing everything locally on the device within the vehicle to ensure data privacy.

BrainChip and NVISO are targeting battery-powered applications in robotics and mobility devices addressing the need for high levels of AI performance in ultra low power environments. Implementing NVISO’s AI solutions with BrainChip’s Akida drives next generation solutions.

Integration Partners




Integration partners utilize BrainChip’s technology by designing the Akida IP into System on Chip (SoC) products that will be built into ready-to-use systems, or by implementing Akida silicon in ready-to-use modules.

Partnering with BrainChip, MegaChips is incorporating the Akida technology into its ASIC solutions service, enabling the development and support required to design and manufacture integrated circuits and systems on chips with intelligence that will drive AI into the next generation of edge-based devices.

Teksun focuses on end-to-end IoT product development and enabling intelligent solutions, such as predictive and preventative maintenance devices, analytics and diagnostics for portable healthcare, and vision-based devices for security and surveillance. The partnership between BrainChip and Teksun proliferates intelligence through the Teksun product development channels.


University AI Accelerator Program




BrainChip is bringing its neuromorphic technology into higher education institutions via the BrainChip University AI Accelerator Program, which shares technical knowledge, promotes leading-edge discoveries and positions students to be next-generation technology innovators.
BrainChip’s University AI Accelerator Program provides hardware, training, and guidance to students at higher education institutions with existing AI engineering programs. BrainChip’s products can be leveraged by students to support projects in any number of novel use cases or to demonstrate AI enablement. Students participating in the program will have access to real-world, event-based technologies offering unparalleled performance and efficiency to advance their learning through graduation and beyond.
Current university participants include:

[University logos, including Rochester Institute of Technology]

By partnering with BrainChip’s AI Accelerator Program, universities are able to ensure that students have the tools and resources needed to encourage development of cutting-edge technologies that will continue to usher in an era of essential AI solutions.
Have your university become AI smarter, join the University AI Accelerator Program today."



The other day I posted about the wording approved by Brainchip on the Teksun website after it took down the reveal of "Cisco, Toshiba" and another company, and I suggested it revealed certain things about Renesas and Mercedes Benz. Well, as you will see from the above, they form their own unique category.

What makes Renesas and Mercedes Benz not a Technology Partner, an Enablement Partner or an Integration Partner? Well, Renesas has bought IP and software from Brainchip to produce its own chip/MCU.

We know Mercedes Benz has publicly stated it is building its own chip, and of course, as Blind Freddie says, you need IP to do that.

On the Teksun website Mercedes Benz has partnered, just like Renesas, to drive "intelligence into next generation devices". Again, this is a bit hard without IP already in your hands and guaranteed to be available once you go into production.

We know that this very same entry included ARM with Mercedes Benz and Renesas as partnering with Brainchip to do the same thing, yet ARM is described above as a Technology Partner and no longer stands with those two companies.

So it is up to you to decide: was one of the IP licences sold to a third party by MegaChips, who was originally announced as looking to use AKIDA technology in the automotive space, in fact sold to Mercedes Benz?

In my opinion there is now some logic to support this conclusion.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 80 users

Diogenese

Top 20
Hi @Diogenese, you might find this interesting if you have not seen it yet:


You might end up being able to use your ceramic bowl for fruit again.

My opinion only DYOR - just in case you try to blame me for your house burning down. :ROFLMAO:😂🤣
FF

AKIDA BALLISTA
I always keep watermelons in the fruit bowl just in case.
 
  • Haha
  • Like
  • Love
Reactions: 11 users