BRN Discussion Ongoing

Esq.111

Fascinatingly Intuitive.
Chippers,

Yet another Merc tidbit... reasonably sure it's not from the company marketing department.



Esq
 


Ian

Founding Member
🤔
 
Hi FF,

Interesting point.

The terminology used is a bit fluid. Even on the Hailo website they refer to Hailo-8 as an accelerator on one page and a processor elsewhere. The datasheet does show runtime software for the CPU, but what part this software plays in the operation of Hailo-8 is not clear.

I remember in LdN's days, there was a point made about Akida being an NN processor, not just an accelerator, but that was when it included an ARM Cortex.

Even Akida 1000 needs the CPU to set up the configuration via MPU/CPU software, but this plays no part in the Akida 1000 runtime. I think there is some (minimal) CPU intervention in the TENNs operation.

The heat sinks are just aluminium fins and would qualify as passive cooling, i.e. no fan or pumped coolant.
Hi Diogenese
Not that I don't trust Hailo, but I don't trust Hailo, so I went to Renesas, and they say:

“Further amplifying that point, Takeshi Fuse, Vice President of Marketing, High Performance Computing Division of Renesas, noted: “The integration of the Renesas R-Car SoC with the Hailo-8 AI accelerator further enables unprecedented capabilities such as Bird’s-Eye-View 3D perception at an affordable cost for mass market vehicles. Together, we are bringing to life a new era in high performance, affordable automated driving that will benefit the Chinese automotive market specifically and the automotive industry at large.”

Renesas describe it as an AI accelerator.

Then going to R-Car:

“Fifth-Generation R-Car SoC Platform​

Until the fourth generation, the R-Car SoCs were designed for specific use cases, such as ADAS/Autonomous Driving that requires high AI performance, and gateway solutions with enhanced communication capabilities. Renesas’ fifth-generation R-Car SoC will incorporate chiplet technology to create a flexible platform that can be customized to meet various requirements for each use case. The new platform will offer multiple processor sets from entry-level to higher-end models, and can integrate a variety of IP such as AI accelerators and IP by partners and customers into a single package. This will give users the option to customize designs according to their needs.

Two New Arm-based MCU Platforms for Vehicle Control Applications​

As E/E architecture in vehicles continues to evolve, it becomes increasingly important for domain control units (DCUs) and zone control units to handle both high computing performance and real-time processing. Renesas addresses this challenge by developing an Arm-based 32-bit crossover R-Car MCU platform with built-in NVM (Non Volatile Memory) that can deliver higher performance than traditional MCUs currently offer. Moreover, to build upon the success achieved by the RH850 Family MCUs, Renesas is also extending its vehicle control portfolio with a new R-Car MCU series, which will be also powered by Arm. This means for the first time, automotive system developers will be able to take advantage of the software and vast eco-system of Arm by using these new MCUs to build powertrain, body control, chassis and instrument cluster systems. This expansion will allow Renesas to standardize IP between MCUs and SoCs, thus increasing software usability and preserving engineering investments for its customers.

Renesas plans to release new products subsequently from 2024 onward, following this roadmap.”

So putting these together I don’t think Hailo has let me down this time either and my lack of trust appears to remain well founded.😂🤣😂

My opinion only DYOR
Fact Finder
 

Tothemoon24

Top 20

IFS Direct Connect 2024 – February 21, 2024 (San Jose)​


Join BrainChip at IFS Direct Connect 2024, an exclusive all-day event held at the San Jose McEnery Convention Center. Tailored for alliance partners, customers, executives, media, and analysts, this event offers a unique platform for networking and insights.
At this gathering, Intel leadership, industry technologists, and partners, including BrainChip, will showcase their strategies for success. We’ll delve into how these collaborations support the design process and product achievements of our customers. BrainChip will spotlight Akida’s neuromorphic advantage, emphasizing the scalability of our IP within Intel’s Open System Foundry. Discover firsthand how Akida’s power-efficient and innovative technologies are shaping the future of intelligent computing.
 
Has anyone found any updates on the VVDN Edge Box? I presume it should be available for sale now?
 

Diogenese

Top 20


" We’ll delve into how these collaborations support the design process and product achievements of our customers."

... a peek behind the NDAs?
 

manny100

Regular
The big mystery is the whereabouts, or final destination, of the Renesas chips that have been taped out. I received a generic, nicely worded "no, we don't know" response.
I doubt we will know until the bloody financials. Renesas may even be subject to an NDA with the client buying the chips?
.....but someone has them.
 

GazDix

Regular
Palease!!! (hopefully this ends up going way beyond "analysis").

One of my biggest beefs is traffic lights! (might have mentioned this before 🙄..).
Especially, when there is light traffic..

Waiting at the "traffic" lights, when there is no "traffic" and it's safe to go irks me and is irrational beyond belief.

It's the year 2024 and we're still dealing with these antiquated systems, based on series timing and pressure pads that some drivers seem too "shy" to drive onto??..

If AKIDA can fix this one thing, I think it will command the Love and Respect of hundreds of millions of people.

But maybe, that's just my view 🤔..
Hi Dingo,

When I lived in a city called Qingdao in China many years back, all the city's traffic lights turned to orange (yellow) from 11pm - 5am.
I was often out on the piss in those days and took taxis during those hours.
Worked a treat.
No amazing tech needed!

But I know what you mean. Especially during quiet day times.
 

TheFunkMachine

seeds have the potential to become trees.
https://www.linkedin.com/posts/infi...4-pWuZ?utm_source=share&utm_medium=member_ios






With our aligned product and technology roadmaps, we are fast-tracking the time to market for the technologies that will power the vehicles of tomorrow

I kind of like the sound of this.. wild speculation of course. But I do firmly believe Brainchip has a major part to play in processing power in future cars.
 

TheFunkMachine

seeds have the potential to become trees.
Hi All

The following are the known Brainchip Aerospace and Defence partnerships and relationships:

NASA
VORAGO
ARM
GLOBAL FOUNDRIES
INTEL FOUNDRY SERVICES
INTELLISENSE
INFORMATION SYSTEMS LABS
US AIR FORCE RESEARCH LAB
QUANTUM VENTURA
US HOMELAND SECURITY
US DEPT OF ENERGY
ANT61
EDGX
EUROPEAN SPACE AGENCY
MICROCHIP
SIFIVE
PROPHESEE

My opinion only DYOR
Fact Finder
Don’t forget BRE

https://circuitcellar.com/newslette...-processors-in-high-performance-applications/
 



Tothemoon24

Top 20



Webinars​

Assembly Automation Solutions

Event-Based Vision: Bringing More Performance and Efficiency to Improve Machine Vision Applications​

February 01, 2024 | 12 PM - 1 PM ET
ABOUT THIS WEBINAR
The growing complexity and challenging operating conditions in industrial machine vision require innovative approaches to capturing and processing the necessary visual information that achieves the objectives of the system. These systems include high-speed inspection cameras that must deliver fast motion capture, robotic grinding or welding systems that need high dynamic scene capture, and smart Edge IoT cameras that must have both speed and accuracy as well as robust object tracking capabilities.
But for the most part, such systems have relied on an increasingly inadequate and incomplete vision method: frame-based vision. This method struggles to address many important challenges, such as capturing data in a high-speed, continuous fashion in dynamic scenes; working effectively in challenging lighting conditions; and functioning in operating environments where compute and power resources are limited.
Most important is a lack of completeness: Frame-based vision was never meant to address the challenges introduced by today’s industrial use cases. In frame-based image capture, an entire image (i.e. the light intensity at each pixel) is recorded at a pre-assigned interval, known as the frame rate. While this works well in representing the ‘real world’ when displayed on a screen, recording the entire image at every time increment oversamples all the parts of the image that have not changed.
Event-Based Vision introduces a new approach that, like our eyes and brains, uses independent receptors collecting all the essential information, and nothing else. With 10-1000x less data generated, >120dB dynamic range and microsecond time resolution (over 10k images per second equivalent), event-based vision opens vast new potential in areas such as industrial automation, robotics, security and surveillance, mobile, IoT and AR/VR.
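The oversampling point above can be put in rough numbers. A minimal back-of-the-envelope sketch, with all figures hypothetical (the real data rates depend entirely on the scene and sensor):

```python
# Rough data-rate comparison between frame-based and event-based capture.
# All parameters below are illustrative assumptions, not Prophesee specs.

def frame_based_bytes_per_s(width, height, fps, bytes_per_pixel=1):
    """Frame-based: every pixel is read out every frame, changed or not."""
    return width * height * fps * bytes_per_pixel

def event_based_bytes_per_s(width, height, change_fraction,
                            event_rate_hz, bytes_per_event=8):
    """Event-based: only pixels that see a brightness change emit events."""
    active_pixels = width * height * change_fraction
    return int(active_pixels * event_rate_hz * bytes_per_event)

# 720p mono sensor at 60 fps vs. a scene where 1% of pixels are active,
# each emitting 100 events/s.
frames = frame_based_bytes_per_s(1280, 720, 60)
events = event_based_bytes_per_s(1280, 720, 0.01, 100)
print(frames / events)  # ratio grows as the scene gets more static
```

With a mostly static scene (lower `change_fraction`), the ratio climbs quickly toward the 10-1000x range quoted above.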
Over the past several years, event cameras that leverage neuromorphic techniques have gained a strong foothold in machine vision applications in industrial automation, robotics, automotive and other areas. These are all applications where better performance in dynamic scenes, capturing fast-moving subjects and operating in low-light conditions are critical. In addition to benefits in terms of power consumption, data efficiency and dynamic range, event-based vision addresses a fundamental limitation of traditional camera techniques: how light is captured.
Among the advantages of event cameras are:
  • Blur-free images as there is no exposure time
  • High-speed data capture: 10K fps resolution equivalent
  • High resolution of up to 1280x720px
  • High dynamic range (>120dB)
  • Shutter-free operation: no global or rolling shutter needed
Use cases discussed include:
  • Object tracking that leverages the low data rate and sparse information provided by event-based sensors to track objects with low compute power. Well suited for pick and place, robot guidance and trajectory monitoring;
  • Fluid monitoring that uses event-based optical flows to perform fluid monitoring in real time and analyze unwanted dynamics due to residue build-up, spot contaminants or unwanted air or gas bubbles. Well-suited for continuous liquid flow monitoring in food and beverage, oil and gas, and biological processes;
  • Vibration monitoring that enables the monitoring of vibration frequencies continuously, remotely, with pixel precision by tracking the temporal evolution of every pixel in a scene. For each event, the pixel coordinates, the polarity of the change and the exact timestamp are recorded, thus providing a global, continuous understanding of vibration patterns. Used for predictive maintenance tasks such as motion monitoring, vibration monitoring and frequency analysis;
  • Particle and object sizes monitoring: Event cameras can better control, count and measure the size of objects moving at very high speed in a channel or a conveyor. Implemented in systems for high-speed counting, batch homogeneity and gauging.
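The vibration-monitoring idea above can be sketched in a few lines. This is a toy illustration, not Prophesee's actual pipeline: it assumes an oscillating edge crosses a given pixel twice per period and that each crossing produces one event, so the dominant frequency falls out of the mean inter-event interval at that pixel.

```python
# Toy sketch: recover a vibration frequency from one pixel's event timestamps.
# Assumption: an edge oscillating at f Hz sweeps the pixel twice per period,
# one event per crossing. Names and numbers are illustrative only.

def estimate_frequency(timestamps_us):
    """Estimate oscillation frequency (Hz) from microsecond event timestamps.

    Two crossings per period => period = 2 * mean inter-event interval.
    """
    if len(timestamps_us) < 2:
        raise ValueError("need at least two events")
    intervals = [b - a for a, b in zip(timestamps_us, timestamps_us[1:])]
    mean_interval_s = (sum(intervals) / len(intervals)) / 1e6
    return 1.0 / (2.0 * mean_interval_s)

# Simulate a 50 Hz vibration: one crossing every 10 ms (10_000 us).
events = [i * 10_000 for i in range(100)]
print(round(estimate_frequency(events)))  # 50
```

Real event streams also carry per-event polarity, which would let you distinguish the two crossing directions rather than assuming them, as the webinar description notes.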
Viewers will be able to learn about development tools, algorithms and open-source resources that can accelerate the understanding, experimentation and implementation of embedded event-based vision capabilities in machine vision systems.
Prophesee is the inventor of the world’s most advanced neuromorphic vision systems. Composed of patented Metavision® sensors and algorithms, its event-based vision technology enables machines to process visual information more efficiently, thoroughly and in challenging conditions.
Key Takeaways:
  • Introduction to the concept of event-based vision and how it compares to traditional frame-based methods
  • Understanding of key benefits of event-based vision in terms of power, performance, dynamic range, and integration with other data acquisition technologies
  • Overview of development tools and methods that can be used to integrate event-based vision into MV systems
  • Examples of common use cases that leverage the advantages of event-based vision in industrial applications
 

Tothemoon24

Top 20



Socionext showed a very impressive low power, 60GHz radar for child presence detection which included the footwell. They are also using Radar for security monitoring such as theft detection and prevention. They were also showing their gesture controlled touchless display, which was interesting to see, with touchless applications having a major role in increasing user safety. Socionext have joined us at InCabin before and will be back for InCabin Europe, so we are excited to be seeing more from them in Barcelona!
 
 

cosors

👀

David Steenari​

On-Board Payload Data Processing Engineer at European Space Agency (ESA) · Luleå University of Technology · Noordwijk-Binnen, Zuid-Holland, Netherlands · More than 500 connections

Info​

Working at ESA on on-board data handling; high-performance on-board processing; on-board AI/ML; modular data handling systems and architectures. Lead organisation of OBDP2021 and OBDP2019.

Regards
Fact Finder
@JoMo68 the circles around Brainchip and Talga are getting tighter. What do my eyes spot? Luleå. Talga works with that university 😅

To the others: Luleå has perhaps 45,000 inhabitants. The European Space Centre is located near Kiruna in the far north. There is also a very large data centre in Luleå. And of course -> Talga 😊
 

Esq.111

Fascinatingly Intuitive.

Morning Pom down under,

Good to watch again. Probably nothing, but towards the end I note that Todd and the interviewer are both wearing what I presume are name tags around their necks... with SONY emblazoned on the straps.

Maybe they both just came back from the SONY stand and forgot to take their tags off....... or .....

Regards,
Esq.
 