BRN Discussion Ongoing

Just playing devil's advocate... so what if Sean H doesn't reach the expectations he mooted at the previous AGM? Then IMO he will most likely say he was unable to reach company goals because of continuing global influences. So what happens to him after that? IMO most likely nothing, and things will just run their course regardless.
How can we say they have not already been achieved?

Income was not given as a performance indicator by which to judge his and the Board's performance.

My opinion only DYOR
FF

AKIDA BALLISTA
 

cassip

Regular
Hello from Germany,

At the moment freezing rain in the south, streets slippery as hell due to the low temperatures.

Question: could BRN price action be slippery terrain for shorters?

Have a good day all and take care
Cassip
 

Straw

Guest
Oh I really hope so.
I guess it depends on how ethical the instos are (open to correction).

The best we can do in Southern NSW is temps in the low 20s (centigrade), though that is a lot cooler than the 35C which would be more normal.
It has been very consistently nice but breezy. The only third-hand/anecdotal reference I have from anyone for a similar season is 1978, courtesy of an old farmer. It has rarely been above 25C and we are in our first month of summer.
 

robsmark

Regular
Hmmm… I think it's a given that revenue is the kind of performance we are all expecting. Three years into commercialisation, and a year into his role as CEO, Sean and the board know very well what shareholders expect, and a few partnerships wouldn't cut the mustard in my opinion.

That being said, I remain optimistic and excited to see what he is to be judged by. I think 2023 could be a fantastic year for this company.
 

Violin1

Regular
Hasn't helped my painting before Christmas!
 

TopCat

Regular
On the phone and on the road at the mo, so no chance to check the links and presentation PDFs etc., but I found this on the Sony EU site, posted earlier this month by the looks, with a bunch of links on the page to strategy, R&D, AI/AR etc.

Don't know if any snippets of interesting info there.


R&D Strategy Briefing and Sony Technology Exchange Fair (STEF)
Today, Sony Group Corporation (“Sony”) held an R&D strategy briefing in conjunction with the opening of “Sony Technology Exchange Fair 2022” (“STEF 2022”), a cross-group technology exchange event.

Sony’s Purpose is to “fill the world with emotion, through the power of creativity and technology”. At the R&D strategy briefing, Sony set forth its basic approach to R&D to realize this purpose over the long term and to create technologies that will support Sony’s business.

“STEF 2022” introduced a selection of the latest technologies and related efforts of Sony Group’s diverse businesses and R&D organizations under the theme of “Technology that inspires emotion”. STEF is an annual technology exchange event within the Sony Group, which was initiated in 1973 with the aim of creating new value through the exchange of ideas and exposure to a wide range of technologies developed by each business and R&D organization. To commemorate the 50th anniversary of the event, Sony made a portion of the technology exhibition public for the first time.


Also, this one below from earlier this year to do with SSS that came from their Aitrios links (Sony AI).

Several presentation videos on imaging and sensing from what I can see.

Sony Semiconductor Solutions Group released videos of the event

"The 1st Sense the Wonder Day" to share thoughts with stakeholders.

February 16, 2022

On January 25th, Sony Semiconductor Solutions Corporation (SSS) held "Sense the Wonder Day," an event to share with a wide range of stakeholders, including employees, the concept behind the company's new corporate slogan, "Sense the Wonder."

"Sense the Wonder" is not only express the existence of the SSS Group, but also conveys the SSS Group's desire to "feel more curiosity" and "make the world more full of surprises and excitement.”

On the day of the event, more than 8,500 people, including not only SSS Group and Sony Group employees, but also partner companies, students, and members of the press, participated in the online live distribution. SSS President and CEO Terushi Shimizu, shared his thoughts on the new corporate slogan, and introduced the people and technologies that support the fields of imaging and sensing.

Thanks FMF, I just watched a YouTube presentation from STEF. It was the opening presentation from Hiroaki Kitano, Executive Vice President and CTO. Towards the end of the presentation he mentions their goal to create an R&D ecosystem, to bring together different companies and technologies to advance our futures and sustainability. Would be great if Brainchip were invited to join their ecosystem 🤔🤞

“Hiroaki Kitano took the stage at the R&D Strategy Briefing and shared his thoughts on Sony Group’s R&D Mission, which is to “Push our civilization forward and make this planet sustainable”. He then defined sensing, AI, digital virtual spaces as key technical domains in R&D that will play a central role in expanding Sony’s business in the future, and explained his aims to transform Sony into an AI and data-driven company through the collaboration between technology in these three domains. To achieve this goal, he also mentioned he will strengthen Sony’s R&D team, which maximizes the potential of each individual member, while at the same time endeavor to make Sony’s R&D team the “the engine of innovation” that constantly innovate the company.”
 

Straw

Guest
Ya, and all my new seedlings, which should be powering away, are just sitting there with an exclamation bubble above them saying 'What the...?'
 
I know Fact Finder always says 1% of this or that, but surely BRN can achieve 5% of all that 🤞
All the talk on here sounds like we've got a far greater share than that.
 

Diogenese

Top 20
Brainchip agrees!
Yes. I think the BrainChip/SiFive partnership started a FOMO snowball picking up ARM and Intel with a bit of help from MegaChips.

The Intel Foundry Services IP Alliance:
https://www.intel.com/content/www/us/en/foundry/ifs-accelerator/ip-alliance.html

We could soon see products including Brainchip/ARM SNNs made by Intel Foundry Services (IFS) competing with products including Brainchip/SiFive SNNs made by IFS, and both competing with products including Brainchip/Intel SNNs made by IFS.

Just as Brainchip covered the RISC-IV and RISC-V bases with ARM and SiFive, Intel has now spread its bets to cover the red and the black.
 

Learning

Learning to the Top 🕵‍♂️
So from your perspective Dio,

Should BrainChip also position itself in Intel's Pathfinder Ecosystem, to spread BrainChip's wings wider?


Learning
 

RobjHunt

Regular
I totally agree DB.
How the hell making a profit from a falling share price is legal beggars belief, because of the obvious: manipulation.
The problem with shorters is that over the years they have perfected the art of manipulation, and unless there is an exceptional announcement, they will continue to bend us over.................
But what can we poor old retailers do? Don't let the pricks get any of your shares.
All good things come to those who wait.🥳
Baron..
Correct!

Pantene.
 

Diogenese

Top 20
Supersleuthing Learning,

SiFive and Intel have been collaborating over "a multi-year history", so this would be a solid launch pad to incorporate Akida.

SiFive

“Intel and SiFive, the founder and leader in RISC-V computing, share a multi-year history of collaboration, and we are pleased that Intel selected the SiFive® PerformanceTM P550 core both as the heart of the Horse Creek development platform and for use with Intel® Pathfinder FPGA-based development tools,” said Phil Dworsky, SiFive Global Head of Strategic Alliances. “Intel® Pathfinder for RISC-V* gives software developers a head start in preparation for the highly anticipated Horse Creek boards, which are on track for delivery this year. SiFive is excited to work with Intel on this project, to engage with mutual customers, and together to fuel innovation in the fast-growing RISC-V ecosystem.”

https://pathfinder.intel.com/news/i...new-capabilities-for-pre-silicon-development/

In fact, they could make an immediate start with the Akida simulator (nee ADE) in MetaTF.

SANTA CLARA, C.A. — August 30, 2022 — Intel® Pathfinder for RISC-V* is launching today to transform the way SOC architects and system software developers define new products. It allows for a variety of RISC-V cores and other IP to be instantiated on FPGA and simulator platforms, with the ability to run industry leading operating systems and tool chains within a unified IDE.

SiFive is still very much the new kid on the block, so having Intel's endorsement will supercharge its market penetration, and AI/one-shot on-chip ML is all the rage.

I think we are looking at event horizon spaghettifying acceleration.
 

goodvibes

Regular
Pat Gelsinger, Intel: Our real-time deepfake detection platform uses FakeCatcher technology to analyze biometric signs like 'blood flow' in video pixels—a world-first and a prime example of @Intel's work in responsible AI.

 

Deadpool

hyper-efficient Ai
A 3-way, Dodgy. I like it, I like it a lot.

Whatever way you look at it, it's just a win-win-win for BRN.
 

Learning

Learning to the Top 🕵‍♂️
Thanks Dio,

I really think BrainChip is a suitable candidate to join Intel's Pathfinder Ecosystem, as both of BrainChip's partners, Renesas and SiFive, are within the Pathfinder Ecosystem.

As you say, "AI/one-shot on chip ML is all the rage" ❤️


Learning,
Learning everyday.
 
Hey @Diogenese

Have you ever done a bit of a dive into iCatch Tech?

The other partner / collaborator of Prophesee who released their info around April this year and we did ours around June.

I was looking into them and one of their news items about the IMX636 which is Sony's sensor, then coupled with Metavision and iCatch V57 SoC.

I found some info sheets, as the AI details were a bit light elsewhere. The Vi37 references a CNN if I read right, but the V57 and others just reference an NPU IP.

Appears they're not using low-bit weights, rather 8-16-bit(?)

But Akida can run higher bit widths too if wanted, yeah?

Just musing about any possible way they could use our IP, via CNN2SNN, on their chip, which stacks with the Sony IMX636 sensor and uses the Metavision SDK, or whether we would have to come through the Metavision side?

News link below and site with chip products.

News


Products


Couple snips from news.

Now iCatch has collaborated with Prophesee on Event-based Metavision sensing projects that integrated iCatch’s V57 AI vision processor with the new Sony Semiconductor Solutions ("SSS") stacked Event-based Vision Sensor IMX636, realized in collaboration between SSS and Prophesee.

iCatch built the OpenMVCam development platform for all algorithm partners and ODM customers to design a variety of AI products, systems and applications on many market segments like surveillance, smart healthcare, in-cabin monitor system, smart home, industrial automation, smart city and so on.

SSS’s Event-based Vision Sensor can be the input source for iCatch AI vision processor, based on iCatch built-in NPU acceleration engine and proprietary CV processing hardware engine. It can integrate Prophesee Metavision Intelligence SDK and other 3rd party’s machine vision algorithms to support end customers’ AI vision applications such as DMS and OMS in automotive in-cabin, patient/elder fall detection in home healthcare and hospitals, intruder detection in home surveillance, anomaly detection in industrial automation, gesture control in smart home appliance, eye tracking in AR/VR and so on.
 
Was just skimming through some recent EI vids on YT and this one uploaded 18hrs ago.

They are doing some modelling on Texas Instruments hardware, and it wasn't about using BRN; however, I spotted MetaTF Model now in beta mode in Impulse Studio... woo hoo


IMG_20221214_220813.jpg


Spotted around the 32min mark.

 

Diogenese

Top 20
Hi Fmf,

I didn't find any NN-related patents for iCatch.

The V57 is an image signal processor which includes an NN core:

1671028373723.png




A group of CV engines, comprising an optical flow processing (OPF) accelerator, a matrix arithmetic engine (MAE) with DSP functions and a high-performance NPU engine, can manipulate image and video data in real time to support edge computing and intelligent processing, such as pedestrian detection, vehicle detection, distance extraction, behavior recognition, gesture recognition, or even action recognition, and much more. The embedded USB3.2 Gen1 device interface supports ultra-high-speed and high-resolution video data transfer.



2.3. Image process acceleration engine

• Matrix operation engines
• Scaling up/down engine
• De-warping engine for lens distortion correction (LDC) and multi-view affine transformation
• Motion detection engine
• HW accelerated optical flow engine for DVS sensor applications
• Pre/Post processing for NPU acceleration



2.5. Neural network accelerator

• High performance 1.2 TOPS NPU engine
• Supports weight/bias quantization using UINT8, INT8, INT16, Float16, BFloat16 and post-training quantization
• MAE engine – pre/post DSP accelerator
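For context on the quantization formats listed above: post-training quantization maps trained float weights onto integers via a per-tensor scale. A minimal symmetric INT8 sketch of the general technique (my own illustration, not iCatch's NPU pipeline):

```python
# Minimal symmetric post-training quantization sketch: float weights are
# mapped to INT8 via a single per-tensor scale, then dequantized for use.
# Illustrates the general technique only, not iCatch's actual NPU pipeline.

def quantize_int8(weights):
    scale = max(abs(w) for w in weights) / 127.0 or 1.0   # avoid div-by-zero
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [qi * scale for qi in q]

w = [0.02, -0.51, 0.254, 0.0]
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(w, w_hat))
assert max_err <= scale / 2   # error is bounded by half a quantization step
```

The same scheme extends to INT16 (scale over 32767) or asymmetric UINT8 with a zero point; lower bit widths trade accuracy for memory and power, which is where Akida's low-bit weights sit.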



It does not include SNN.

It would be a case of either/or as far as the iCatch NN is concerned.

They would have been working with Prophesee for some time:


https://www.prophesee.ai/2022/04/19/icatch-prophesee-collaboration-ai-vision-processor/

iCatch and Prophesee collaborated on development of AI vision processor natively compatible with Prophesee Event-based Metavision® sensing technologies.​


iCatch Technology (iCatch) has focused on image signal processing (ISP) technology and Camera SOC for over two decades and aggressively invested in research and development in more machine learning (ML) related technology and application.

Now iCatch has collaborated with Prophesee on Event-based Metavision sensing projects that integrated iCatch’s V57 AI vision processor with the new Sony Semiconductor Solutions (“SSS”) stacked Event-based Vision Sensor IMX636, realized in collaboration between SSS and Prophesee.
iCatch built the OpenMVCam development platform for all algorithm partners and ODM customers to design a variety of AI products, systems and applications on many market segments like surveillance, smart healthcare, in-cabin monitor system, smart home, industrial automation, smart city and so on.

iCatch also provides a high customized service and an excellent image quality system to become the eyes of all of machine vision equipment and smart devices in the future.

SSS’s Event-based Vision Sensor can be the input source for iCatch AI vision processor, based on iCatch built-in NPU acceleration engine and proprietary CV processing hardware engine. It can integrate Prophesee Metavision Intelligence SDK and other 3rd party’s machine vision algorithms to support end customers’ AI vision applications such as DMS and OMS in automotive in-cabin, patient/elder fall detection in home healthcare and hospitals, intruder detection in home surveillance, anomaly detection in industrial automation, gesture control in smart home appliance, eye tracking in AR/VR and so on.

So it looks like you have found the missing link in the Sony/Prophesee/? triangle.

But it was after this that Prophesee confessed its undying love for Akida. So, as I speculated when the Sony/Prophesee collaboration was announced, unfortunately they are probably contractually bound to iCatch for the first born.

So, as someone speculated above, was the Apple CEO inspecting iCatch's offspring, or was there a newer and better love child in the cradle?

You've got to ask yourself one question - would Sony or Prophesee be happy showing off an inferior solution?
 
Thanks D

Like you, I had a quick patent skim and couldn't find anything to do with AI per se either.

I did notice the CNN reference on the Vi37, but then just an NPU on the others, as I said, hence the question on CNN2SNN, if viable.

When you say missing link, I presume you mean the AI processing component (not us in that IMX) to create that package?

I'll try to dig around for the earliest Prophesee/iCatch connection mentions, maybe tomoz.

Begs the question then of how long they were working with iCatch, and how long with us, before either public news release, and was there any parallel testing/dev?
 
Just found these earlier references and a distribution partner. No mention of iCatch?

From


Linked to here



IMX636 and IMX637 event-based vision
29 November 2021

Macnica ATD Europe today announced it will offer the Event-based Vision Sensor ("EVS") from its long-term distribution partner, Sony Semiconductor Solutions Corporation ("Sony"). The two sensors, IMX636 and IMX637, were made possible through collaboration between Sony and Prophesee, another distribution partner of Macnica ATD Europe.

EVS realizes high-speed data output with low latency by limiting the output data to luminance changes from each pixel, combined with information on pixel position coordinates and time. Only the pixels that have detected a change in luminance for the object can output data, allowing the sensor to immediately detect the luminance changes with high-speed, low-latency, high-temporal-resolution while operating with low power consumption. It represents a whole new approach compared to the commonly used frame-based method, where the entire image is output at certain intervals determined by the frame rate.

Application fields where the two new models of EVS specifically exploit their advantages over frame-based sensors are for example sensing changes in sparks produced during welding and metal cutting; and sensing slight changes in vibration and detecting abnormalities for use in predictive maintenance.

Macnica ATD Europe as a prime distribution partner of both Sony and Prophesee covers the full cooperation on the EVS technology and offers technical support as well as the free loan of evaluation kits in different versions from first “hands on” evaluation to full performance evaluation (available in Q4). The cooperation also covers the Metavision® Intelligence Suite from Prophesee, an event signal processing software optimized for the sensors performance, which is available through Macnica ATD Europe. Combining Sony’s event-based vision sensors with this software will enable efficient application development and provide solutions for various use cases.
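The per-pixel output model described above can be sketched in a few lines. A toy frame-differencing version (real EVS pixels fire asynchronously in continuous time; this only illustrates the (x, y, t, polarity) data format):

```python
# Toy model of event-based vision sensor (EVS) output: compare two luminance
# frames and emit an event only for pixels whose change exceeds a threshold,
# as (x, y, t, polarity) tuples. This mimics the data format only -- a real
# EVS pixel fires asynchronously in continuous time, with no frames at all.

def frame_to_events(prev, curr, t, threshold=10):
    events = []
    for y, (row_p, row_c) in enumerate(zip(prev, curr)):
        for x, (lp, lc) in enumerate(zip(row_p, row_c)):
            delta = lc - lp
            if abs(delta) >= threshold:
                events.append((x, y, t, 1 if delta > 0 else -1))
    return events

prev = [[100, 100, 100],
        [100, 100, 100]]
curr = [[100, 140, 100],
        [ 80, 100, 100]]   # one pixel brighter, one darker
print(frame_to_events(prev, curr, t=0))
```

Only two of the six pixels produce any data here, which is exactly the latency and bandwidth advantage over frame-based output that the release describes.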

To here


Partners

 