BRN Discussion Ongoing

Diogenese

Top 20

Neuromorphic Sensing: Coming Soon to Consumer Products

By Sally Ward-Foxton | 07.20.2022





What does “neuromorphic” mean today?
“You will get 10 different answers from 10 different people,” laughed Luca Verre, CEO of Prophesee. “As companies take the step from ‘this is what we believe’ to ‘how can we make this a reality,’ what neuromorphic means will change.”

Most companies doing neuromorphic sensing and computing have a similar vision in mind, he said, but implementations and strategies will be different based on varying product, market, and investment constraints.
“The reason why… all these companies are working [on neuromorphic technologies] is because there is a fundamental belief that the biological model has superior characteristics compared to the conventional,” he said. “People make different assumptions on product, on system integration, on business opportunities, and they make different implementations… But fundamentally, the belief is the same.”
Prophesee CEO Luca Verre (Source: Prophesee)
Verre’s vision is that neuromorphic technologies can bring technology closer to human beings, which ultimately makes for a more immersive experience and allows technologies such as autonomous driving and augmented reality to be adopted faster.

“When people understand the technology behind it is closer to the way we work, and fundamentally natural, this is an incredible source of reassurance,” he said.

WHICH MARKETS FIRST?

Prophesee is already several years into its mission to commercialize the event-based camera using its proprietary dynamic vision sensor technology. The company has collaborated with camera leader Sony to make a compact, high-resolution event-based camera module, the IMX 636. In this module, the photodiode layer is stacked directly on top of the CMOS layer using Sony's 3D die stacking process.
According to Verre, the sector closest to commercial adoption of this technology is industrial machine vision.
"Industrial is a leading segment today because historically we pushed our third-generation camera into this segment, which was a bigger sensor and more tuned for this type of application," he said. "Industrial has historically been a very active machine vision segment; in fact, it is probably one of the segments that adopted the CCD and CMOS technologies at the very beginning… definitely a key market."
Prophesee's and Sony's jointly developed IMX 636 event-based sensor (Source: Prophesee)
The second key market for the IMX 636 is consumer technologies, driven by the shrink in size enabled by Sony's die-stacking process. Consumer applications include IoT cameras, surveillance cameras, action cameras, drones, consumer robots, and even smartphones. In many cases, the event-based camera is used alongside a full-frame camera, detecting motion so that image processing can be applied to capture better-quality images, even when the subject is moving.
"The reason is very simple: event-based cameras are great to understand motion," he said. "This is what they are meant for. Frame-based cameras are more suited to understanding static information. The combination of the dynamic information from an event-based camera and static information from a frame-based camera is complementary if you want to capture a picture or video in a scene where there's something moving."
Event data can be combined with full-frame images to correct any blur on the frame, especially for action cameras and surveillance cameras.
"We clearly see some traction in this area, which of course is very promising because the volume typically associated with this application can be quite substantial compared to industrial vision," he said.
Prophesee is also working with a customer on automotive driver monitoring solutions, where Verre said event-based cameras bring advantages in terms of low-light performance, sensitivity, and fast detection. Applications here include eye-blink detection, face tracking, and micro-expression detection.

APPROACH TO COMMERCIALIZATION

Prophesee's EVK4 evaluation kit (Source: Prophesee)
Prophesee has been working hard on driving commercialization of event-based cameras. The company recently released a new evaluation kit (EVK4) for the IMX 636. This kit is designed for industrial vision with a rugged housing but will work for all applications (Verre said several hundred of these kits have been sold). The company's Metavision SDK for event-based vision has also recently been open-sourced in order to reduce friction in the adoption of event-based technology. The Metavision community has around 5,000 registered members today.
"The EVK is a great tool to further push and evangelize the technology, and it comes in a very typical form factor," Verre said. "The SDK hides the perception of complexity that every engineer or researcher may have when testing or exploring a new technology… Think about engineers that have been working for a couple of decades on processing images that now see events… they don't want to be stretched too much out of their comfort zone."
New to the Metavision SDK is a simulator to convert full frames into events to help designers transition between the way they work today and the event domain. Noting a reluctance of some designers to move away from full frames, Verre said the simulator is intended to show them there’s nothing magic about events.
“[Events are] just a way of capturing information from the scene that contains much more temporal precision compared to images, and is actually much more relevant, because typically you get only what is changing,” he said.
The simulator can also reconstruct full-frame images from event data, which he says people find reassuring.
“The majority of customers don’t pose this challenge any longer because they understand that they need to see from a different perspective, similar to when they use technology like time of flight or ultrasound,” he said. “The challenge is when their perception is that this is another image sensor… for this category of customer, we made this tool that can show them the way to transition stepwise to this new sensing modality… it is a mindset shift that may take some time, but it will come.”
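The frame-to-event conversion the simulator performs can be sketched as a threshold on per-pixel log-intensity change between consecutive frames. This is a toy model for illustration, not Prophesee's Metavision implementation; the function name, threshold value, and event format here are assumptions:

```python
import numpy as np

def frames_to_events(prev_frame, next_frame, threshold=0.2, eps=1e-6):
    """Emit (y, x, polarity) events where the log-intensity change between
    two frames exceeds a contrast threshold.

    A minimal sketch of the idea: real event-camera simulators also
    interpolate in time and track a per-pixel reference level, which
    this version deliberately omits.
    """
    log_prev = np.log(prev_frame.astype(np.float64) + eps)
    log_next = np.log(next_frame.astype(np.float64) + eps)
    diff = log_next - log_prev
    ys, xs = np.nonzero(np.abs(diff) > threshold)          # changed pixels only
    polarities = np.sign(diff[ys, xs]).astype(np.int8)     # +1 brighter, -1 darker
    return list(zip(ys.tolist(), xs.tolist(), polarities.tolist()))

# Static background with one pixel brightening: only the change produces an event,
# which is the "you get only what is changing" property Verre describes.
a = np.full((4, 4), 50, dtype=np.uint8)
b = a.copy()
b[1, 2] = 200
events = frames_to_events(a, b)
```

Note that a completely static scene produces no events at all, which is the key data-rate advantage over full frames.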
Applications realized in the Prophesee developer community include restoring some sight for the blind, detecting and classifying contaminants in medical samples, particle tracking in research, robotic touch sensors, and tracking space debris.

HARDWARE ROADMAP

In terms of roadmap, Prophesee plans to continue development of both hardware and software, alongside new evaluation kits, development kits, and reference designs. This may include system reference designs that combine Prophesee sensors with specially developed processors. For example, Prophesee partner iCatch has developed an AI vision processor SoC that interfaces natively with the IMX 636 and features an on-chip event decoder. Japanese AI core provider DMP is also working with Prophesee on an FPGA-based system, and there are more partnerships in the works, said Verre.
The Prophesee and Sony IMX 636 is a fourth-generation product. Prophesee said future generations will reduce pixel pitch and ease integration with conventional computing platforms (Source: Prophesee)
“We see that there is growing interest from ecosystem partners at the SoC level, but also the software level, that are interested in building new solutions based on Prophesee technology,” he said. “This type of asset is important for the community, because it is another step towards the full solution — they can get the sensor, camera, computing platform, and software to develop an entire solution.”
Where does event-based sensor hardware go from here? Verre cited two key directions the technology will move in. The first is further reduction of pixel size (pixel pitch) and overall reduction of the sensor to make it suitable for compact consumer applications such as wearables. The second is facilitating the integration of event-based sensing with conventional SoC platforms.
Working with computing companies will be critically important to ensure next–generation sensors natively embed the capability to interface with the computing platform, which simplifies the task at the system level. The result will be smarter sensors, with added intelligence at the sensor level.
“We think events make sense, so let’s do more pre-processing inside the sensor itself, because it’s where you can make the least compromise,” Verre said. “The closer you get to the acquisition of the information, the better off you are in terms of efficiency and low latency. You also avoid the need to encode and transmit the data. So this is something that we are pursuing.”
As foundries continue to make progress in the 3D stacking process, stacking in two or even three layers using the most advanced CMOS processes can help bring more intelligence down to the pixel level.
How much intelligence in the pixel is the right amount?
Verre said it’s a compromise between increasing the cost of silicon and having sufficient intelligence to make sure the interface with conventional computing platforms is good enough.
“Sensors don’t typically use advanced process nodes, 28nm or 22nm at most,” he said. “Mainstream SoCs use 12nm, 7nm, 5nm, and below, so they’re on technology nodes that can compress the digital component extremely well. The size versus cost equation means at a certain point it’s more efficient, more economical [to put the intelligence] in the SoC.”
There is also a certain synergy in combining event-based sensors with neuromorphic computing architectures.
"The ultimate goal of neuromorphic technology is to have both the sensing and processing neuromorphic or event-based, but we are not yet there in terms of maturity of this type of solution," he said. "We are very active in this area to prepare for the future — we are working with Intel, SynSense, and other partners in this area — but in the short term, the mainstream market is occupied by conventional SoC platforms."
Prophesee's approach here is pragmatic. Verre said the company's aim is to try to minimize any compromises to deliver benefits that are superior to conventional solutions.
"Ultimately we believe that events should naturally stream asynchronously to a compute architecture that is also asynchronous in order to benefit fully in terms of latency and power," he said. "But we need to be pragmatic and stage this evolution, and really capitalize on the existing platforms out there and work with key partners in this space that are willing to invest in software-hardware developments and to optimize certain solutions for certain markets."

The company has collaborated with camera leader Sony to make a compact, high-resolution event-based camera module, the IMX 636. In this module, the photodiode layer is stacked directly on top of the CMOS layer using Sony's 3D die stacking process.

So can the Akida IP be adapted for manufacture using the Sony 3D die stacking process?
 
Reactions: 8 users

buena suerte :-)

BOB Bank of Brainchip
Reactions: 7 users
Chippers,

Next step up incoming

Esq.
Certainly get the feeling this is going to be a tough tug of war to get to $1.20, let alone past it.

The last couple of days, we didn't see the usual traders' lunch break. BRN drove straight through walls, even through the lunchtime lull period.

Let's see what happens here. We are at a tipping point, so to speak. This $1.20 level has been the toughest hurdle in this current run on the SP. If it breaks through $1.20, the Tech could be on the money, as there will be a stampede to the next hurdle at $1.30.

Fascinating watching this unfold.
 
Reactions: 28 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
I'm hoping the video will be from the front, so as to see "his/her bush block"...............:D




 
Reactions: 8 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
See below

If we hit $1.29 today I'll do a nudie run down my street and post it on here.

I hope you are right...


Perfect timing for a HUGE announcement to drop unexpectedly out of the blue.



 
Reactions: 18 users

Wags

Regular
The company has collaborated with camera leader Sony to make a compact, high-resolution event-based camera module, the IMX 636. In this module, the photodiode layer is stacked directly on top of the CMOS layer using Sony's 3D die stacking process.

So can the Akida IP be adapted for manufacture using the Sony 3D die stacking process?
Gee, I wouldn't have a clue.
I would refer this to that Dodgey Knees fella.
Of all the known unknowns, he's the smartest I know. :cool:
 
Reactions: 13 users

mrgds

Regular
Reactions: 7 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Gee, I wouldn't have a clue.
I would refer this to that Dodgey Knees fella.
Of all the known unknowns, he's the smartest I know. :cool:

Got a bit excited, then stopped getting excited, then read some of it and then got excited again. I have absolutely no idea what I'm looking at 🥴 though, and I don't even know if it's relevant, so this one's for Dodgy Knees to decipher @Diogenese

 
Reactions: 11 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Reactions: 13 users
D

Deleted member 118

Guest
Reactions: 6 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
What am I missing here?




RFTA and I both stumbled on the same thing (albeit different info) .. about something called "Akita Elpida", in an attempt to establish whether Akida IP could be adapted for manufacture using the Sony 3d die stacking process.
 
Reactions: 6 users

Newk R

Regular
See below

If we hit $1.29 today I'll do a nudie run down my street and post it on here.

I hope you are right...
Please finish at $1.285.....😂
 
Reactions: 8 users
D

Deleted member 118

Guest
RFTA and I both stumbled on the same thing .. Akita Elpida (albeit different info).. in terms of whether Akida IP could be adapted for manufacture using the Sony 3d die stacking process.

 
Reactions: 5 users

MDhere

Regular
Reactions: 4 users
Stress much :ROFLMAO:

 
Reactions: 22 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Which Brissie street may that be :ROFLMAO:


What's your address @alwaysgreen because @MDhere really doesn't want to miss the show. She promises she won't take too many photos of you, nothing incriminating, only a few tasteful images to share around privately here on TSE.
 
Reactions: 15 users