BRN Discussion Ongoing

Bravo

If ARM was an arm, BRN would be its biceps💪!
TSMC - Volkswagen - General Motors...




Extract Only
He disclosed that Volkswagen's CEO recently met with top executives from TSMC, GlobalFoundries, and Qualcomm to discuss semiconductor production capabilities and technologies. He said Volkswagen's top executives are deeply involved in overall semiconductor supply chains.


In the speech, Hellenthal did not mention Volkswagen’s own chip development. However, analysts say that Volkswagen might design semiconductors and outsource their production to TSMC in the future.


TSMC is also pursuing cooperation with global automakers other than Volkswagen. At the end of November last year, General Motors (GM) announced that it would cooperate with TSMC to develop semiconductors for vehicles.

 
  • Like
  • Fire
  • Love
Reactions: 30 users

Filobeddo

Guest
OH MY GOSHKINS!!!! What are the 3 families of chips I wonder? What chips can multi-task I wonder? 🧠 🍟



General Motors is restructuring the way it manages its semiconductor chips to ensure it has enough for its vehicles.
In an interview with The Associated Press, CEO Mary Barra said GM will, by 2025, move to three families of chips that the company will buy and control itself.

She also said these chips will be able to do multiple tasks, eliminating the need for dozens of chips in every vehicle.

And Bosch is pumping 3 billion euros into its own chip production over the next 3 years 👌




 
  • Like
  • Fire
  • Love
Reactions: 24 users
noun: costumer
  1. a person or company that makes or supplies theatrical or fancy-dress costumes.
    "a theatrical costumier"
 
  • Haha
Reactions: 1 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Haha
  • Like
  • Fire
Reactions: 18 users

Newk R

Regular
There seems to be a continuous stream of around 400k sells below $1.20. Very annoying!😠
 
  • Like
  • Haha
Reactions: 9 users

Filobeddo

Guest
There seems to be a continuous stream of around 400k sells below $1.20. Very annoying!😠

Hopefully soon to be just roadkill 😉
 
  • Like
  • Fire
  • Haha
Reactions: 15 users

Labsy

Regular
There seems to be a continuous stream of around 400k sells below $1.20. Very annoying!😠
I reckon it's just the day traders punting on a natural consolidation towards the end of Friday's trade and trying to time the peak. I think we are having a great run. Next week will be even better! ;)
 
  • Like
  • Fire
Reactions: 13 users

mrgds

Regular
closing price will be 1.295
:eek: ..................... "NOOOOOO , that can't be,
Our "FUZZY WUZZY FRIEND" says it's $1.25 .................

( happy if you're wrong today "fuzzy") :cool:

AKIDA ( not so fuzzy ) BALLISTA
 
  • Like
  • Fire
  • Haha
Reactions: 6 users

Diogenese

Top 20

Neuromorphic Sensing: Coming Soon to Consumer Products

By Sally Ward-Foxton, 07.20.2022





What does “neuromorphic” mean today?
“You will get 10 different answers from 10 different people,” laughed Luca Verre, CEO of Prophesee. “As companies take the step from ‘this is what we believe’ to ‘how can we make this a reality,’ what neuromorphic means will change.”

Most companies doing neuromorphic sensing and computing have a similar vision in mind, he said, but implementations and strategies will be different based on varying product, market, and investment constraints.
“The reason why… all these companies are working [on neuromorphic technologies] is because there is a fundamental belief that the biological model has superior characteristics compared to the conventional,” he said. “People make different assumptions on product, on system integration, on business opportunities, and they make different implementations… But fundamentally, the belief is the same.”
Luca Verre, CEO of Prophesee (Source: Prophesee)
Verre’s vision is that neuromorphic technologies can bring technology closer to human beings, which ultimately makes for a more immersive experience and allows technologies such as autonomous driving and augmented reality to be adopted faster.

“When people understand the technology behind it is closer to the way we work, and fundamentally natural, this is an incredible source of reassurance,” he said.

WHICH MARKETS FIRST?

Prophesee is already several years into its mission to commercialize the event–based camera using its proprietary dynamic vision sensor technology. The company has collaborated with camera leader Sony to make a compact, high–resolution event–based camera module, the IMX 636. In this module, the photodiode layer is stacked directly on top of the CMOS layer using Sony’s 3D die stacking process.
According to Verre, the sector closest to commercial adoption of this technology is industrial machine vision.
“Industrial is a leading segment today because historically we pushed our third–generation camera into this segment, which was a bigger sensor and more tuned for this type of application,” he said. “Industrial has historically been a very active machine vision segment, in fact, it is probably one of the segments that adopted the CCD and CMOS technologies at the very beginning… definitely a key market.”
Prophesee’s and Sony’s jointly developed IMX 636 event-based sensor (Source: Prophesee)
The second key market for the IMX 636 is consumer technologies, driven by the shrink in size enabled by Sony’s die–stacking process. Consumer applications include IoT cameras, surveillance cameras, action cameras, drones, consumer robots, and even smartphones. In many cases, the event–based camera is used alongside a full–frame camera, detecting motion so that image processing can be applied to capture better quality images, even when the subject is moving.
“The reason is very simple: event–based cameras are great to understand motion,” he said. “This is what they are meant for. Frame–based cameras are more suited to understanding static information. The combination of the dynamic information from an event–based camera and static information from a frame–based camera is complementary if you want to capture a picture or video in a scene where there’s something moving.”
Event data can be combined with full–frame images to correct any blur on the frame, especially for action cameras and surveillance cameras.
“We clearly see some traction in this area, which of course is very promising because the volume typically associated with this application can be quite substantial compared to industrial vision,” he said.
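As a rough illustration of that deblur use case, here is a toy Python sketch (a hypothetical helper of my own, not Prophesee's algorithm): events arriving during a frame's exposure are binned per pixel into a motion mask, so a deblurring pass can be restricted to the pixels that actually moved.

    import numpy as np

    def motion_mask(events, shape, min_events=3):
        # Hypothetical helper: count events per pixel over one frame's
        # exposure. `events` is an iterable of (x, y, t, polarity) tuples;
        # `shape` is the (height, width) of the companion full frame.
        counts = np.zeros(shape, dtype=np.int32)
        for x, y, _t, _p in events:
            counts[y, x] += 1
        # Pixels that fired repeatedly were moving during the exposure;
        # a deblur step would only touch this boolean mask.
        return counts >= min_events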
Prophesee is also working with a customer on automotive driver monitoring solutions, where Verre said event–based cameras bring advantages in terms of low light performance, sensitivity, and fast detection. Applications here include eye blinking detection, eye tracking or face tracking, and micro–expression detection.

APPROACH TO COMMERCIALIZATION

Prophesee’s EVK4 evaluation kit (Source: Prophesee)
Prophesee has been working hard on driving commercialization of event–based cameras. The company recently released a new evaluation kit (EVK4) for the IMX 636. This kit is designed for industrial vision with a rugged housing but will work for all applications (Verre said several hundred of these kits have been sold). The company’s Metavision SDK for event–based vision has also recently been open–sourced in order to reduce friction in the adoption of event–based technology. The Metavision community has around 5,000 registered members today.
“The EVK is a great tool to further push and evangelize the technology, and it comes in a very typical form factor,” Verre said. “The SDK hides the perception of complexity that every engineer or researcher may have when testing or exploring a new technology… Think about engineers that have been working for a couple of decades on processing images that now see events… they don’t want to be stretched too much out of their comfort zone.”
New to the Metavision SDK is a simulator to convert full frames into events to help designers transition between the way they work today and the event domain. Noting a reluctance of some designers to move away from full frames, Verre said the simulator is intended to show them there’s nothing magic about events.
“[Events are] just a way of capturing information from the scene that contains much more temporal precision compared to images, and is actually much more relevant, because typically you get only what is changing,” he said.
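The core of such a frame-to-event conversion is simple enough to sketch. Below is the generic DVS pixel model in Python (an illustration under that generic model, not code from the Metavision simulator): a pixel emits an event whenever its log intensity changes by more than a contrast threshold.

    import numpy as np

    def frames_to_events(prev_frame, curr_frame, t, threshold=0.2):
        # Generic DVS-style model: compare log intensities of two
        # consecutive frames (float arrays) and emit an event wherever
        # the change exceeds the contrast threshold.
        eps = 1e-6                                # guard against log(0)
        dlog = np.log(curr_frame + eps) - np.log(prev_frame + eps)
        ys, xs = np.nonzero(np.abs(dlog) >= threshold)
        pols = np.where(dlog[ys, xs] > 0, 1, -1)  # +1 brighter, -1 darker
        return [(int(x), int(y), t, int(p)) for x, y, p in zip(xs, ys, pols)]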
The simulator can also reconstruct full image frames from event data, which he says people find reassuring.
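Going the other way is equally mechanical. Here is a sketch of the textbook reconstruction (again an illustration, not necessarily what the SDK does): integrate each event back onto a reference frame in log space.

    import numpy as np

    def frame_from_events(ref_frame, events, threshold=0.2):
        # Textbook reconstruction: each event nudges its pixel's log
        # intensity by +/- the contrast threshold used at capture time.
        eps = 1e-6
        log_img = np.log(ref_frame.astype(np.float64) + eps)
        for x, y, _t, polarity in events:
            log_img[y, x] += polarity * threshold
        return np.exp(log_img) - eps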
“The majority of customers don’t pose this challenge any longer because they understand that they need to see from a different perspective, similar to when they use technology like time of flight or ultrasound,” he said. “The challenge is when their perception is that this is another image sensor… for this category of customer, we made this tool that can show them the way to transition stepwise to this new sensing modality… it is a mindset shift that may take some time, but it will come.”
Applications realized in the Prophesee developer community include restoring some sight for the blind, detecting and classifying contaminants in medical samples, particle tracking in research, robotic touch sensors, and tracking space debris.

HARDWARE ROADMAP

In terms of roadmap, Prophesee plans to continue development of both hardware and software, alongside new evaluation kits, development kits, and reference designs. This may include system reference designs which combine Prophesee sensors with specially developed processors. For example, Prophesee partner iCatch has developed an AI vision processor SoC that interfaces natively with the IMX 636 and features an on–chip event decoder. Japanese AI core provider DMP is also working with Prophesee on an FPGA–based system, and there are more partnerships in the works, said Verre.
The Prophesee and Sony IMX 636 is a fourth-generation product. Prophesee said future generations will reduce pixel pitch and ease integration with conventional computing platforms (Source: Prophesee)
“We see that there is growing interest from ecosystem partners at the SoC level, but also the software level, that are interested in building new solutions based on Prophesee technology,” he said. “This type of asset is important for the community, because it is another step towards the full solution — they can get the sensor, camera, computing platform, and software to develop an entire solution.”
Where does event–based sensor hardware go from here? Verre cited two key directions the technology will move in. The first is further reduction of pixel size (pixel pitch) and overall reduction of the sensor to make it suitable for compact consumer applications such as wearables. The second is facilitating the integration of event–based sensing with conventional SoC platforms.
Working with computing companies will be critically important to ensure next–generation sensors natively embed the capability to interface with the computing platform, which simplifies the task at the system level. The result will be smarter sensors, with added intelligence at the sensor level.
“We think events make sense, so let’s do more pre-processing inside the sensor itself, because it’s where you can make the least compromise,” Verre said. “The closer you get to the acquisition of the information, the better off you are in terms of efficiency and low latency. You also avoid the need to encode and transmit the data. So this is something that we are pursuing.”
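What might that in-sensor pre-processing look like? One common first stage in event pipelines, sketched here purely as an illustration (not a disclosed Prophesee feature), is a spatio-temporal noise filter that drops events with no recent activity in their neighbourhood.

    import numpy as np

    def denoise_events(events, shape, window_us=10_000):
        # Classic event noise filter: keep an event only if some pixel in
        # its 8-neighbourhood (or itself) fired within the last `window_us`
        # microseconds. `events` must be sorted by timestamp.
        last_seen = np.full(shape, -np.inf)
        h, w = shape
        kept = []
        for x, y, t, p in events:
            y0, y1 = max(0, y - 1), min(h, y + 2)
            x0, x1 = max(0, x - 1), min(w, x + 2)
            if (t - last_seen[y0:y1, x0:x1]).min() <= window_us:
                kept.append((x, y, t, p))
            last_seen[y, x] = t
        return kept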
As foundries continue to make progress in the 3D stacking process, stacking in two or even three layers using the most advanced CMOS processes can help bring more intelligence down to the pixel level.
How much intelligence in the pixel is the right amount?
Verre said it’s a compromise between increasing the cost of silicon and having sufficient intelligence to make sure the interface with conventional computing platforms is good enough.
“Sensors don’t typically use advanced process nodes, 28nm or 22nm at most,” he said. “Mainstream SoCs use 12nm, 7nm, 5nm, and below, so they’re on technology nodes that can compress the digital component extremely well. The size versus cost equation means at a certain point it’s more efficient, more economical [to put the intelligence] in the SoC.”
There is also a certain synergy to combining event–based sensors with neuromorphic computing architectures.
“The ultimate goal of neuromorphic technology is to have both the sensing and processing neuromorphic or event–based, but we are not yet there in terms of maturity of this type of solution,” he said. “We are very active in this area to prepare for the future — we are working with Intel, SynSense, and other partners in this area — but in the short term, the mainstream market is occupied by conventional SoC platforms.”
Prophesee’s approach here is pragmatic. Verre said the company’s aim is to try to minimize any compromises to deliver benefits that are superior to conventional solutions.
“Ultimately we believe that events should naturally stream asynchronously to a compute architecture that is also asynchronous in order to benefit fully in terms of latency and power,” he said. “But we need to be pragmatic and stage this evolution, and really capitalize on the existing platforms out there and work with key partners in this space that are willing to invest in software–hardware developments and to optimize certain solutions for certain markets.”

The company has collaborated with camera leader Sony to make a compact, high–resolution event–based camera module, the IMX 636. In this module, the photodiode layer is stacked directly on top of the CMOS layer using Sony’s 3D die stacking process.

“The ultimate goal of neuromorphic technology is to have both the sensing and processing neuromorphic or event–based, but we are not yet there in terms of maturity of this type of solution,” he said. “We are very active in this area to prepare for the future — we are working with Intel, SynSense, and other partners in this area — but in the short term, the mainstream market is occupied by conventional SoC platforms.”

So can the Akida IP be adapted for manufacture using the Sony 3D die-stacking process?
 
  • Like
  • Thinking
  • Love
Reactions: 26 users

alwaysgreen

Top 20
See below
$1.2950 CLOSE OUT

A couple of killer whales are lurking in the depths :ROFLMAO::ROFLMAO::ROFLMAO::ROFLMAO:
If we hit $1.29 today I'll do a nudie run down my street and post it on here.

I hope you are right...
 
  • Like
  • Haha
  • Fire
Reactions: 27 users

Filobeddo

Guest
  • Haha
  • Like
Reactions: 9 users

MDhere

Regular
Seriously how good are these 2 pages!

AKIDA RULES 😃
(attached images: 20220721_115731.jpg, 20220721_115747.jpg)
 
  • Like
  • Love
Reactions: 54 users

Esq.111

Fascinatingly Intuitive.
Chippers,

Next step up incoming

Esq.
 
  • Like
  • Wow
  • Thinking
Reactions: 22 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
See below

If we hit $1.29 today I'll do a nudie run down my street and post it on here.

I hope you are right...

I hope you live on a bush block.
 
  • Haha
  • Like
  • Fire
Reactions: 21 users

TopCat

Regular
This could be our link to Untether AI and General Motors


Robert Beachler, Vice President of Product
Mr. Beachler joins Untether AI in the role of Vice President of Product. A Silicon Valley veteran and proven senior executive with industry leaders such as Altera, Xilinx, and BrainChip, he brings to the company a wealth of experience in the development and marketing of FPGAs, software tools, vision processors and artificial intelligence acceleration devices.
 
  • Like
  • Wow
  • Love
Reactions: 30 users

Filobeddo

Guest
  • Haha
  • Like
Reactions: 4 users

mrgds

Regular
I hope you live on a bush block.
I'm hoping the video will be from the front, so as to see "his/her bush block"...............:D
 
  • Haha
  • Like
  • Thinking
Reactions: 15 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Haha
  • Like
Reactions: 9 users
