BRN Discussion Ongoing

dippY22

Regular
Well, 11 December 2022 isn't exactly "just joined", but I take your point that it may well be 2025 if this three-year time frame is an absolute given, even with Intel's knowledge and development in the edge AI space.

Also, taking into account the time frame before BRN joined IFS, we will be getting up there timeframe-wise on this three-year development period before products are produced. I'm even thinking that, given Intel's involvement in edge AI as one of the big boys, the three-year timeframe could be compressed. As always, DYOR.

https://brainchip.com/brainchip-joins-intel-foundry-services-to-advance-neuromorphic-ai-at-the-edge/#:~:text=%E2%80%93%20December%2011%2C%202022%20%E2%80%93BrainChip,innovation%20on%20Intel's%20foundry%20manufacturing
Regarding time-frames to market, I grabbed this from a post almost two years ago. In this video of Arm's Rene Haas being interviewed by Bloomberg's Emily Chang, Mr Haas says that product timelines take 3-4 years. See specifically the 3 minute 40 second mark of the following video. I think Mr Haas knows what he is talking about.

Although he states "3 to 4 years", I personally would think 4 years is likely more pertinent to BrainChip's situation, so you are in the ballpark, The Pope.

That would put Renesas or MegaChips, for example, producing something with Akida technology in it around 2026, I speculate. I hope I am wrong and that the revenue numbers Sean Hehir wants us all to watch start to increase exponentially before then.
As far as anything from Intel, and when.......who knows?


Thanks, ... dippY
 


Labsy

Regular
Pleeease universe.......... Pleeeeeeease a masssssive announcement this week.... Pleeeeeeease ......
... Pleeeeeeease

Let's go chippers!!! 👌🙏🙏🙏🚀🚀🚀🚀😅
Just putting it out into the infinity, the nether, whatever you wanna call it... (If it works, you're all welcome in advance)
 



Draed

Regular
Every time I get excited about a BRN announcement..... it turns out to be a "notification regarding unquoted securities".... 😫😵‍💫😵
 

7für7

Regular
Maybe Labsy should be more specific… like "a massive price-sensitive announcement regarding huge gains"

@Labsy, try it again 🤷🏻‍♂️
 
Bit of a big banana, if he was the main guy developing TENNs..

In the recent TENNs presentation, the presenter sounded like he was saying there was a "main" person involved, who hated the name TENNs.

It sounds a little like Rudy is fishing there, in the last two paragraphs?..
 


Labsy

Regular
Maaaaassssssive price sensitive announcement pushing sp to $3 +++++ pleeeeeeease..... pleeeeeeease oh infinity and beyond powers of the universe ....... Pleeease.....
Perhaps Zuckerberg, Elon or Bezos advocating the use of neuromorphic in their new tech ...............
Let's goooo!!!!!!! 🚀🚀🚀🚀🚀🚀👌🙏🙏🙏🙏

Edit: This week.......pleeeeeeease .....cmoooooon!!!! Yeaaaaaaah!! Woooo!!!
 

toasty

Regular

Not a lot of point if they can't communicate with it..............
 

IloveLamp

Top 20

Yes because you would know better than them.......
 
Did the Optimus satellite have the Beacon on board? That's the big question..

ANT61 are encouraging organisations to use it, so it must be ready for implementation?



You would think they would've taken the opportunity to use the Optimus satellite as a first use case, and it's not a great look if they couldn't use it to establish a connection..

Or maybe it's all part of their plan..

"Hey, we just established contact with Optimus, using Beacon. Good thing we had that particular piece of hardware onboard!"..
 

Kachoo

Regular
No, Optimus did not have the Beacon. How silly would it be to have the Beacon but not use it? I suspect the design was finalised well before the Beacon came to be, hence no Beacon.
 
You know that as fact, Kachoo?
 

Tothemoon24

Top 20


Navigating the Challenges and Complexities in Automotive Cameras

Published on June 11, 2024


Introduction

The automotive industry is undergoing a technological revolution, with advanced driver assistance systems (ADAS) and self-driving vehicles becoming increasingly prevalent. Central to these innovations are automotive cameras, which serve as the “eyes” of modern vehicles. They play a critical role in enhancing safety, providing real-time data for navigation, and enabling autonomous functionalities.
Many envision self-driving cars as vehicles that effortlessly chauffeur them to their destinations, allowing them to use travel time for work, relaxation, or entertainment, much like riding a bus without any exhaustion. However, despite their advanced capabilities, automotive cameras face numerous challenges that impact their performance and reliability.
To make this vision a reality, the sensor systems in these vehicles must function reliably regardless of the time of day, weather conditions, lighting, and road conditions.
This article examines the challenges posed by the requirements for ADAS and self-driving vehicles and explores how VVDN Technologies addresses these challenges with their expertise in camera technology.

Types of Automotive Cameras

  • Rear-View Cameras: Installed at the back of the vehicle, these assist drivers in reversing and parking by providing a clear view of the area behind the vehicle.
  • Surround-View Cameras: These systems combine images from multiple cameras placed around the vehicle to create a 360-degree view, enabling park assistance and maneuvering in tight spaces.
  • Forward-Facing Cameras: Positioned at the front, these cameras are crucial for ADAS functionalities like lane-keeping assistance, traffic sign recognition, and collision avoidance.
  • Driver Monitoring Cameras: These cameras monitor the driver’s attention and alertness, detecting signs of drowsiness or distraction to enhance safety.
  • eMirror Cameras: These cameras replace traditional side mirrors with advanced digital displays, offering improved visibility and reduced blind spots, even in challenging weather conditions.
  • Night Vision Cameras: Utilizing infrared technology, these cameras improve visibility in low-light conditions, helping detect pedestrians, animals, and other obstacles not visible with standard headlights.

Challenges in Automotive Cameras

  • Contamination: Camera lenses can be obstructed by dirt, water, or other contaminants. Solutions include protective covers that open only when necessary or positioning cameras behind windshield wipers.
  • Lane Detection: Recognizing lane markings is challenging due to similar-looking structures, regional differences, varying colors, and weather conditions.
  • Light Assistance: Distinguishing moving vehicles from static objects like streetlights and reflectors is complex, especially when dealing with partially defective lights or motorcycles.
  • High Dynamic Range: Cameras struggle with visibility in extreme lighting conditions, such as direct sunlight, tunnel exits, and oncoming headlights.
  • Flicker: Modern LED light sources flicker at different frequencies, complicating the continuous image analysis required for vehicle cameras.
  • Stray light: High contrasts can cause unwanted reflections and light scattering in the lens or camera housing, creating visibility issues.
  • Environmental Factors: Cameras must operate reliably in all temperatures (-40 to 100°C) and weather conditions (rain, fog, snow). Image noise increases with temperature, affecting performance.
  • Regional differences: Traffic signs and road markings vary by region, requiring cameras to recognize and adapt to these differences.
  • An infinite number of objects: The variety of objects and their changing perspectives pose a significant challenge for accurate classification and detection.

Limitations of Automotive Cameras

Cameras have inherent limitations in three specific areas:
  • Limited field of view.
  • Struggles with accurate depth perception beyond roughly 20 m.
  • Reduced visibility in fog, rain, and low-light conditions.
To combat these limitations, vehicles with high automation incorporate additional sensors:
  • LiDAR: Offers precise 3D mapping and distance measurement, enhancing object detection and spatial awareness.
  • RADAR: Complements cameras by providing robust detection of objects and accurate distance measurements, especially in adverse weather conditions.
  • Ultrasonic Sensors: Used for close-range detection, particularly useful for parking assistance systems.
  • Thermal Imaging Cameras: Used to detect living and heated objects, enhancing safety by identifying pedestrians and animals in low-visibility conditions.

Solutions to Overcome Challenges

  • Advanced Algorithms: Implementing sophisticated image processing algorithms, including machine learning and AI, to improve object detection and recognition under various conditions.
  • Sensor Fusion: Integrating data from multiple sensors (cameras, radar, LiDAR) to create a more accurate and comprehensive understanding of the vehicle’s environment (see the minimal sketch after this list).
  • Robust Design: Designing camera systems with protective housings, self-cleaning mechanisms, and better optical design to prevent lens obstruction and enhance image quality.
  • Enhanced Calibration Techniques: Developing automated and dynamic calibration systems that adjust camera alignment in real-time, maintaining accuracy even after impacts or vibrations.
  • Data Integration: Combining data from multiple cameras and sensors to create a comprehensive understanding of the vehicle’s surroundings.
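To make the sensor-fusion point a bit more concrete, here is a minimal sketch (not VVDN's implementation; the sensors, noise figures, and function names are illustrative assumptions) of the simplest form of fusion: combining a camera's distance estimate with a radar's distance estimate by inverse-variance weighting, so the sensor that is more confident in the current conditions dominates the fused result.

```python
import numpy as np

def fuse_estimates(measurements):
    """Fuse independent distance estimates by inverse-variance weighting.

    measurements: list of (estimate, variance) pairs, one per sensor.
    Returns (fused_estimate, fused_variance).
    """
    estimates = np.array([m[0] for m in measurements], dtype=float)
    variances = np.array([m[1] for m in measurements], dtype=float)
    weights = 1.0 / variances                 # lower variance -> higher weight
    fused = np.sum(weights * estimates) / np.sum(weights)
    fused_variance = 1.0 / np.sum(weights)    # fused result is tighter than either input
    return fused, fused_variance

# Illustrative numbers only: camera depth estimates get noisy beyond ~20 m,
# while radar range stays comparatively tight even in rain or fog.
camera = (21.4, 4.0)    # (distance in metres, variance in m^2)
radar = (19.8, 0.25)
distance, variance = fuse_estimates([camera, radar])
print(f"fused distance: {distance:.2f} m (variance {variance:.3f} m^2)")
```

In a production ADAS stack this would be a full Bayesian or Kalman-style filter running per tracked object and per state dimension, but the principle is the same: each sensor contributes in proportion to how much it can be trusted at that moment.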
In addition to these solutions, several other parameters need to be optimized to ensure high-quality camera output before installation in vehicles. These include:
  • Measurement of OECF (Opto-Electronic Conversion Function)
  • Noise
  • Resolution, including spherical aberrations
  • White Balance
  • Edge Darkening in Intensity and Color
  • Chromatic Aberration
  • Stray Light
  • Color Reproduction
  • Defective Pixels and Inclusions on the Sensor
  • Flicker

VVDN Expertise in Camera Technology

VVDN Technologies, with over a decade of experience in designing and developing cameras for various industries, including automotive, offers comprehensive in-house capabilities covering software design, hardware design, mechanical design, testing, validation, and manufacturing. Equipped with a world-class ISP tuning lab, VVDN performs rigorous objective and subjective testing to guarantee top-quality performance. Specializing in designing and integrating AI/ML models, VVDN has expertise in sensor fusion algorithms, combining data from LiDAR, RADAR, thermal, IR, UWB, and ultrasonic sensors.
By combining deep industry experience with comprehensive in-house capabilities and cutting-edge technology, VVDN Technologies delivers advanced automotive camera solutions that meet the highest standards of performance and reliability.
To learn more about our offerings and discuss how we can collaborate to meet your camera requirements, please contact us at info@vvdntech.com
 