BRN Discussion Ongoing

Labsy

Regular
Maaaaassssssive price sensitive announcement pushing sp to $3 +++++ pleeeeeeease..... pleeeeeeease oh infinity and beyond powers of the universe ....... Pleeease.....
Perhaps Zuckerberg, Elon or Bezos advocating the use of neuromorphic in their new tech ...............
Let's goooo!!!!!!! 🚀🚀🚀🚀🚀🚀👌🙏🙏🙏🙏

Edit: This week.......pleeeeeeease .....cmoooooon!!!! Yeaaaaaaah!! Woooo!!!
 

toasty

Regular

View attachment 64961
Not a lot of point if they can't communicate with it..............
 

IloveLamp

Top 20
Not a lot of point if they can't communicate with it..............

Yes because you would know better than them.......
 
Yes because you would know better than them View attachment 64962
Did the Optimus Satellite, have Beacon on board, is the big question?..

ANT61 are encouraging organisations to use it, so it must be ready for implementation?

20240617_150050.jpg


You would think that they would've taken the opportunity with the Optimus Satellite as a first use case, and it wouldn't look good if they couldn't use it to establish a connection..

Or maybe it's all part of their plan..

"Hey, we just established contact with Optimus, using Beacon. Good thing we had that particular piece of hardware onboard!"..
 

Kachoo

Regular
Did the Optimus Satellite, have Beacon on board, is the big question?..

ANT61 are encouraging organisations to use it, so it must be ready for implementation?

View attachment 64963

You would think that they would've taken the opportunity with the Optimus Satellite as a first use case, and it wouldn't look good if they couldn't use it to establish a connection..

Or maybe it's all part of their plan..

"Hey, we just established contact with Optimus, using Beacon. Good thing we had that particular piece of hardware onboard!"..
No, Optimus did not have the Beacon. It would have been silly to have the Beacon on board but not use it. I suspect the design was finalized well before the Beacon came to be, hence no Beacon.
 
No, Optimus did not have the Beacon. It would have been silly to have the Beacon on board but not use it. I suspect the design was finalized well before the Beacon came to be, hence no Beacon.
You know that as fact Kachoo?
 

Tothemoon24

Top 20


Navigating the Challenges and Complexities in Automotive Cameras​

Published on
June 11, 2024


Introduction​

The automotive industry is undergoing a technological revolution, with advanced driver assistance systems (ADAS) and self-driving vehicles becoming increasingly prevalent. Central to these innovations are automotive cameras, which serve as the “eyes” of modern vehicles. They play a critical role in enhancing safety, providing real-time data for navigation, and enabling autonomous functionalities.
Many envision self-driving cars as vehicles that effortlessly chauffeur them to their destinations, allowing them to use travel time for work, relaxation, or entertainment, much like riding a bus without any exhaustion. However, despite their advanced capabilities, automotive cameras face numerous challenges that impact their performance and reliability.
To make this vision a reality, the sensor systems in these vehicles must function reliably regardless of the time of day, weather conditions, lighting, and road conditions.
This article examines the challenges posed by the requirements for ADAS and self-driving vehicles and explores how VVDN Technologies addresses these challenges with their expertise in camera technology.

Types of Automotive Cameras​

  • Rear-View Cameras: Installed at the back of the vehicle, these assist drivers in reversing and parking by providing a clear view of the area behind the vehicle.
  • Surround-View Cameras: These systems combine images from multiple cameras placed around the vehicle to create a 360-degree view, enabling park assistance and maneuvering in tight spaces.
  • Forward-Facing Cameras: Positioned at the front, these cameras are crucial for ADAS functionalities like lane-keeping assistance, traffic sign recognition, and collision avoidance.
  • Driver Monitoring Cameras: These cameras monitor the driver’s attention and alertness, detecting signs of drowsiness or distraction to enhance safety.
  • eMirror Cameras: These cameras replace traditional side mirrors with advanced digital displays, offering improved visibility and reduced blind spots, even in challenging weather conditions.
  • Night Vision Cameras: Utilizing infrared technology, these cameras improve visibility in low-light conditions, helping detect pedestrians, animals, and other obstacles not visible with standard headlights.

Challenges in Automotive Cameras​

  • Contamination: Camera lenses can be obstructed by dirt, water, or other contaminants. Solutions include protective covers that open only when necessary or positioning cameras behind windshield wipers.
  • Lane Detection: Recognizing lane markings is challenging due to similar-looking structures, regional differences, varying colors, and weather conditions.
  • Light Assistance: Distinguishing moving vehicles from static objects like streetlights and reflectors is complex, especially when dealing with partially defective lights or motorcycles.
  • High Dynamic Range: Cameras struggle with visibility in extreme lighting conditions, such as direct sunlight, tunnel exits, and oncoming headlights.
  • Flicker: Modern LED light sources flicker at different frequencies, complicating the continuous image analysis required for vehicle cameras.
  • Stray light: High contrasts can cause unwanted reflections and light scattering in the lens or camera housing, creating visibility issues.
  • Environmental Factors: Cameras must operate reliably in all temperatures (-40 to 100°C) and weather conditions (rain, fog, snow). Image noise increases with temperature, affecting performance.
  • Regional differences: Traffic signs and road markings vary by region, requiring cameras to recognize and adapt to these differences.
  • An infinite number of objects: The variety of objects and their changing perspectives pose a significant challenge for accurate classification and detection.
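The high-dynamic-range and flicker items above are usually tackled at the capture stage. As a rough illustration of one common idea, multi-exposure fusion, the sketch below falls back to a short exposure wherever the long one clips. The weighting rule and saturation threshold are invented for the example; this is not any vendor's actual ISP pipeline.

```python
import numpy as np

def fuse_exposures(short_exp, long_exp, t_short, t_long, sat=0.95):
    """Fuse a short and a long exposure of the same scene into one
    high-dynamic-range radiance estimate. Pixels that saturate in the
    long exposure fall back to the short one; everything else is
    averaged after normalizing by exposure time."""
    short_r = short_exp / t_short            # radiance from short exposure
    long_r = long_exp / t_long               # radiance from long exposure
    w_long = (long_exp < sat).astype(float)  # trust long exposure only if unsaturated
    return (w_long * long_r + short_r) / (w_long + 1.0)

# Toy scene: one very bright pixel (e.g. oncoming headlight) that clips
# in the long exposure but is still measurable in the short one.
scene = np.array([0.1, 0.5, 12.0])   # true radiance
t_s, t_l = 0.01, 0.1
short = np.clip(scene * t_s, 0, 1.0)
long_ = np.clip(scene * t_l, 0, 1.0)
hdr = fuse_exposures(short, long_, t_s, t_l)
```

The bright pixel is recovered from the short exposure alone, while the dim pixels benefit from averaging both estimates.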

Limitations of Automotive Cameras​

Cameras have inherent limitations in three specific areas:
  • Limited field of view.
  • Struggles with accurate depth perception (>20m).
  • Reduced visibility in fog, rain, and low-light conditions.
To combat these limitations, vehicles with high automation incorporate additional sensors:
  • LiDAR: Offers precise 3D mapping and distance measurement, enhancing object detection and spatial awareness.
  • RADAR: Complements cameras by providing robust detection of objects and accurate distance measurements, especially in adverse weather conditions.
  • Ultrasonic Sensors: Used for close-range detection, particularly useful for parking assistance systems.
  • Thermal Imaging Cameras: Used to detect living and heated objects, enhancing safety by identifying pedestrians and animals in low-visibility conditions.
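The complementary strengths listed above are combined through sensor fusion. A minimal sketch of one classic approach, inverse-variance weighting of two independent range estimates, shows how a radar's tighter measurement dominates beyond the camera's reliable depth range. The numbers are invented for the example.

```python
def fuse_range(cam_range, cam_var, radar_range, radar_var):
    """Inverse-variance weighted fusion of two independent range
    estimates; the less noisy sensor dominates the result, and the
    fused variance is lower than either input's."""
    w_cam = 1.0 / cam_var
    w_radar = 1.0 / radar_var
    fused = (w_cam * cam_range + w_radar * radar_range) / (w_cam + w_radar)
    fused_var = 1.0 / (w_cam + w_radar)
    return fused, fused_var

# Beyond ~20 m the camera's depth estimate is noisy; radar stays tight.
fused, var = fuse_range(cam_range=24.0, cam_var=4.0,
                        radar_range=22.0, radar_var=0.25)
```

Production systems use richer formulations (Kalman filters over full object states), but the weighting principle is the same.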

Solutions to Overcome Challenges​

  • Advanced Algorithms: Implementing sophisticated image processing algorithms, including machine learning and AI, to improve object detection and recognition under various conditions.
  • Sensor Fusion: Integrating data from multiple sensors (cameras, radar, LiDAR) to create a more accurate and comprehensive understanding of the vehicle’s environment.
  • Robust Design: Designing camera systems with protective housings, self-cleaning mechanisms, and better optical design to prevent lens obstruction and enhance image quality.
  • Enhanced Calibration Techniques: Developing automated and dynamic calibration systems that adjust camera alignment in real-time, maintaining accuracy even after impacts or vibrations.
  • Data Integration: Combining data from multiple cameras and sensors to create a comprehensive understanding of the vehicle’s surroundings.
In addition to these solutions, several other parameters need to be optimized to ensure high-quality camera output before installation in vehicles. These include:
  • Measurement of OECF (Opto-Electronic Conversion Function)
  • Noise
  • Resolution, including spherical aberrations
  • White Balance
  • Edge Darkening in Intensity and Color
  • Chromatic Aberration
  • Stray Light
  • Color Reproduction
  • Defective Pixels and Inclusions on the Sensor
  • Flicker
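Several of the parameters above (noise, defective pixels) are typically measured on flat-field captures before a module ships. The sketch below is a toy version of that kind of check, using simulated data and an arbitrary 6-sigma outlier rule rather than any standardized test procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated flat-field capture: a uniform grey target plus sensor noise,
# with two artificially defective pixels (values are illustrative).
flat = rng.normal(loc=0.50, scale=0.02, size=(64, 64))
flat[10, 10] = 1.0   # hot pixel, stuck near full scale
flat[30, 40] = 0.0   # dead pixel, stuck at zero

mean, std = flat.mean(), flat.std()
snr = mean / std                            # signal-to-noise ratio of the capture
defective = np.abs(flat - mean) > 6 * std   # simple 6-sigma outlier test
n_defective = int(defective.sum())
```

Real qualification (e.g. against the EMVA 1288 standard) averages many frames and separates temporal from spatial noise, but the flat-field principle is the same.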

VVDN Expertise in Camera Technology​

VVDN Technologies, with over a decade of experience in designing and developing cameras for various industries, including automotive, offers comprehensive in-house capabilities covering software design, hardware design, mechanical design, testing, validation, and manufacturing. Equipped with a world-class ISP tuning lab, VVDN performs rigorous objective and subjective testing to guarantee top-quality performance. Specializing in designing and integrating AI/ML models, VVDN has expertise in sensor fusion algorithms, combining data from LiDAR, RADAR, thermal, IR, UWB, and ultrasonic sensors.
By combining deep industry experience with comprehensive in-house capabilities and cutting-edge technology, VVDN Technologies delivers advanced automotive camera solutions that meet the highest standards of performance and reliability.
To learn more about our offerings and discuss how we can collaborate to meet your camera requirements, please contact us at info@vvdntech.com.
 

rgupta

Regular
Did the Optimus Satellite, have Beacon on board, is the big question?..

ANT61 are encouraging organisations to use it, so it must be ready for implementation?

View attachment 64963

You would think that they would've taken the opportunity with the Optimus Satellite as a first use case, and it wouldn't look good if they couldn't use it to establish a connection..

Or maybe it's all part of their plan..

"Hey, we just established contact with Optimus, using Beacon. Good thing we had that particular piece of hardware onboard!"..
No, it's not there; ANT61 confirmed that.
 
Maaaaassssssive price sensitive announcement pushing sp to $3 +++++ pleeeeeeease..... pleeeeeeease oh infinity and beyond powers of the universe ....... Pleeease.....
Perhaps Zuckerberg, Elon or Bezos advocating the use of neuromorphic in their new tech ...............
Let's goooo!!!!!!! 🚀🚀🚀🚀🚀🚀👌🙏🙏🙏🙏

Edit: This week.......pleeeeeeease .....cmoooooon!!!! Yeaaaaaaah!! Woooo!!!
Dreamer
 

Kachoo

Regular
You know that as fact Kachoo?
It was confirmed to me that they did not have the Beacon installed. It would have been nice to trial.
 

Oops if posted ……

 

Taproot

Regular
You know that as fact Kachoo?

Read the comments section of today's ANT61 post.
The Beacon wasn't ready for Optimus.
Unfortunately, it's a little ironic that the company that sent our Brain into space is the same company that has just successfully tested and proved up fail-proof satellite communication technology.
But don't stress, she'll be right. We just need to wait patiently for Labsy's blockbuster announcement, which should come any day now.
 
Looks like plenty of funding going on for Australian space companies:


"Adelaide-based satellite tech company Myriota has confirmed a $1.5m investment from the Australian Space Agency will be used to develop an off-world communications system.

The grant is part of a funding round under the Australian Space Agency's Moon to Mars Initiative Demonstrator Mission program, first published in March."

"This project is another example of how Australian innovation can contribute to global space missions, while ultimately enhancing the critical technologies that can improve lives here on Earth,” says Enrico Palermo, head of the Australian Space Agency."

"the University of Western Australia ($4.4m) also received funding for a range of initiatives as part of the Moon to Mars program."
 

The Pope

Regular
Use the report button and stop arguing with the peanuts people ffs. Wasn't that the whole point of tsex?

Otherwise you might as well go and indulge them on the crapper
Yes because you would know better than them.......
View attachment 64962
 

Diogenese

Top 20
Regarding time-frames to market, I grabbed this from a post almost two years ago. In this video of Arm's Rene Haas being interviewed by Bloomberg's Emily Chang, Mr Haas says that product timelines take 3-4 years. See specifically the 3 minute 40 second mark of the following video. I think Mr Haas knows what he is talking about.

Although he states "3 to 4 years", I personally think 4 years is likely more pertinent to BrainChip's situation, so you are in the ballpark, The Pope.

That would make Renesas or MegaChips, for example, producing something with Akida technology in it probably around 2026, I speculate. I hope I am wrong and that the revenue numbers Sean Hehir wants us all to watch start to increase exponentially before then.
As far as anything from Intel, and when.......who knows?


Thanks, ... dippY
Hi dippy,

The time-to-market will vary with the application.

For example, SynSense is already being used in toys, but Mercedes has indicated that AI in silicon is still some way off.

Even with automotive, I would think that something like the primary ADAS functions will require longer qualification times than something like "Hey Mercedes!". I would think that driver alertness would be considered more mission-critical as a legislated safety requirement in Europe. That said, Akida 2 has been available in software for a couple of years, and testing software may not be as difficult as testing hardware.

The good news is that we have been working with Valeo for 5 years, but the first 3 would have been with Akida 1. So the clock has been reset to 2022 for Akida 2, but I'm hoping there will be a degree of carry-over from Akida 1.

Similarly, we have been working with MB for several years, and Akida 2 simulation software has been available for 2 years.

It is necessary to develop model databases to run NNs, and, as is becoming clearer, we seem to be working closely with Edge Impulse on this, using ML to automate the adaptation of user data to NN models on-chip, bypassing the conventional CNN route of reformatting the whole model in the cloud.

I suspect that Mercedes with its software defined vehicle is further advanced with the use of Akida simulation software in "Hey Mercedes!" and other in-cabin applications, and, as Magnus Ostberg advised, I'm "staying tuned!" I think the lure of TeNNs will prove persuasive.

Indeed, we also know that Valeo's SCALA 3 lidar comes with image processing software.

There are, of course, many applications which are not safety-critical which may adopt Akida silicon earlier than Automotive.
 

BrainShit

Regular
I wanna see a deal with TATA....

 

charles2

Regular
Same general area as GMAC....big bucks I'd guess if one can get it right

 

SiDEvans

Regular
Use the report button and stop arguing with the peanuts people ffs. Wasn't that the whole point of tsex?

Otherwise you might as well go and indulge them on the crapper
Well said! This site is going the same way now. 90% of the posts are just bickering about nothing.
 

Tothemoon24

Top 20

Analyzing Human Behavior with Edge AI​


Yasir Mahmood
17 Jun, 2024

Credit: Here.com


Integration of AI human behavioral software with Akida neuromorphic computing can drive machines to understand human behavior.​

Artificial Intelligence
- Machine Learning
- Neuromorphic Computing
The significance of analyzing human behavior for cutting-edge applications, such as autonomous vehicles, healthcare, and retail, cannot be overstated. In the past, companies relied on centralized cloud servers, which proved costly and introduced latency issues for crucial tasks.
To address this, businesses have shifted their focus to integrating distributed computing architecture within their infrastructure. Edge artificial intelligence is gaining momentum due to its capability to process data near its source, thereby significantly reducing latency and minimizing the need to transmit sensitive information over the internet. This, in turn, reduces the risk of data breaches and enhances privacy.
Real-time analysis of human behavior has significantly improved decision-making processes. Monitoring driver behavior can greatly enhance road safety. This understanding of human behavior enables companies to develop more intuitive and user-friendly interfaces.
In response to this need, BrainChip and NVISO Group Limited have joined forces to develop an advanced AI-powered system for real-time human behavioral analysis. The system integrates BrainChip's Akida IP and processors, which are based on neuromorphic computing, with NVISO Group's AI software, specializing in analyzing a wide range of human behaviors.
This article will discuss the core technologies that enable BrainChip’s human behavioral analysis solution to operate efficiently.

Brain-Inspired Architecture​

Neuromorphic systems are engineered to mimic the neural networks of the human brain, with neurons and synapses as their fundamental units. In contrast to the traditional von Neumann architecture, which separates processing and memory units, neuromorphic systems combine these functions.
BrainChip drew inspiration from this concept to develop its own IP and processor chips based on neuromorphic computing. Thanks to their massive parallelism, these hardware systems can handle many operations at once, similar to the brain's ability to process multiple tasks simultaneously.
For human behavioral analysis, the BrainChip Akida IP and processors are capable of processing data in real-time, allowing for immediate analysis and response. This is particularly valuable for applications that require quick decision-making, such as driver monitoring systems (DMS) and interactive consumer electronics.
The AI-enabled human behavioral analysis solution uses BrainChip's highly energy-efficient hardware, making it well suited for edge deployments. It is designed for integration within resource-constrained environments, and its efficiency is achieved through event-based processing.
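Event-based processing, simplified to its core, means computing only where the input changed. The frame-differencing sketch below illustrates that principle; real neuromorphic sensors and Akida's internal event format work differently, but it shows why a mostly static scene costs almost nothing.

```python
import numpy as np

def to_events(prev_frame, frame, threshold=0.1):
    """Turn a dense frame into sparse 'events': only the pixels whose
    value changed by more than `threshold` since the last frame."""
    diff = frame - prev_frame
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    return list(zip(ys.tolist(), xs.tolist(), diff[ys, xs].tolist()))

# Static background, one small moving object: most pixels are unchanged,
# so the event list is tiny compared with the full frame.
prev = np.zeros((32, 32))
cur = prev.copy()
cur[5, 5:8] = 1.0           # the object moved into these three pixels

events = to_events(prev, cur)
dense_ops = cur.size        # a frame-based network touches every pixel
sparse_ops = len(events)    # an event-based one touches only the changes
```

Here only 3 of 1024 pixel positions generate work, which is the source of the energy savings the article describes.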

Why neuromorphic computing?​

In edge-based applications, generating substantial data can pose significant challenges for cloud processing and global connectivity.
For its hardware systems, BrainChip has introduced a novel neural network architecture known as Temporal Event-based Neural Networks (TENNs). These lightweight networks process temporal data more efficiently than traditional recurrent neural networks (RNNs) such as Long Short-Term Memory (LSTM) or Gated Recurrent Units (GRUs).

Akida neural processor architecture [Image Credit: BrainChip]
The second-generation Akida neural processor platform has highly efficient 3-dimensional (3D) convolution functions, enabling it to handle compute-intensive tasks on devices with limited memory and battery resources. This enhancement makes it ideal for ADAS applications.
The Akida processors support on-chip learning, allowing the system to adapt and learn from new data without needing to connect to the cloud, crucial for analyzing human behavior. The hardware can be integrated into various devices, making them flexible for deployment across multiple sectors, including retail, healthcare, and automotive.
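On-chip learning of this one-shot variety can be pictured with a tiny incremental classifier: each class keeps a running mean of its feature vectors, so a new class or example is absorbed with a single update and no cloud retraining pass. This nearest-mean scheme is an illustrative stand-in, not Akida's actual learning rule, and the labels are invented for the example.

```python
import numpy as np

class NearestMeanClassifier:
    """Tiny incremental classifier: each class is the running mean of
    its feature vectors, updated one example at a time."""
    def __init__(self):
        self.means, self.counts = {}, {}

    def learn(self, label, feature):
        # A previously unseen label becomes a new class immediately.
        if label not in self.means:
            self.means[label] = feature.astype(float).copy()
            self.counts[label] = 1
        else:
            self.counts[label] += 1
            self.means[label] += (feature - self.means[label]) / self.counts[label]

    def predict(self, feature):
        # Assign the query to the closest class mean.
        return min(self.means,
                   key=lambda l: np.linalg.norm(self.means[l] - feature))

clf = NearestMeanClassifier()
clf.learn("alert", np.array([1.0, 0.0]))    # hypothetical driver states
clf.learn("drowsy", np.array([0.0, 1.0]))
pred = clf.predict(np.array([0.9, 0.2]))
```

The appeal for edge devices is that both `learn` and `predict` are cheap, local operations with no dependence on connectivity.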

The Role of AI Software​

The jointly developed human behavior analysis solution uses AI software from NVISO Group that can detect and analyze facial expressions, allowing for the determination of emotions. This system monitors users' real-time state by observing head and body pose, eye tracking, and gaze.
The software processes acquired images through preprocessing to enhance data quality and relevance. It leverages various neural networks and machine learning models to extract essential features from the preprocessed images.
By accurately analyzing and responding to human behaviors, the system enhances the user experience, enabling more seamless and intuitive interactions with intelligent devices. Its continuous learning and adaptation to user behaviors ensure personalized and context-aware responses.
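The preprocess-then-extract flow described above can be sketched in a few lines. Everything here is a toy stand-in (block averages instead of trained networks, function names invented for the example), just to make the pipeline shape concrete.

```python
import numpy as np

def preprocess(image):
    """Contrast-normalize a grayscale face crop to zero mean and unit
    variance, a common stand-in for the preprocessing step."""
    return (image - image.mean()) / (image.std() + 1e-8)

def block_features(image, grid=4):
    """Toy feature extractor: mean intensity over a grid x grid tiling.
    A real system would run trained neural networks here."""
    h, w = image.shape
    th, tw = h // grid, w // grid
    crop = image[: th * grid, : tw * grid]
    return crop.reshape(grid, th, grid, tw).mean(axis=(1, 3)).ravel()

# A synthetic 16x16 "face crop" with a bright patch in one corner.
img = np.zeros((16, 16))
img[:4, :4] = 1.0
feats = block_features(preprocess(img))
```

Downstream models would then map such feature vectors to expressions, gaze, or alertness states.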
NVISO has created an evaluation kit for its human behavior AI SDK, which runs on the BrainChip Akida neuromorphic processing platform. “In our goal to drive machines to understand people and their behaviors, we have partnered with BrainChip to develop a high-performance system that enables efficient and effective human interaction with intelligent systems,” says Virpi Pennanen, CEO of NVISO Group[2].

Conclusion​

The integration of AI human behavioral software with Akida neuromorphic computing has led to the development of a system that empowers companies to build intelligent systems. With the increasing presence of autonomous machines, these systems can interact with humans by identifying and interpreting movement using the capabilities of artificial intelligence.
This co-developed human behavioral analysis solution was showcased at the CES 2024 and IFS Direct Connect 2024[1,2].

References​

[1] BrainChip. BrainChip Demonstrates Human Behavior Detection at IFS Direct Connect 2024. February 15, 2024. https://brainchip.com/brainchip-demonstrates-human-behavior-detection-at-ifs-direct-connect-2024/. Accessed on May 28, 2024.
[2] BrainChip. BrainChip and NVISO Group Demonstrate AI-Enabled Human Behavioral Analysis at CES 2024. January 5, 2024. https://brainchip.com/brainchip-and...nabled-human-behavioral-analysis-at-ces-2024/. Accessed on May 28, 2024.
 