Oh I really hope so.
Hello from Germany,
at the moment ice rain in the south, streets slippery as hell due to the low temperature.
Question: is it possible that BRN price action becomes slippery terrain for shorters?
Have a good day all and take care
Cassip
Hmmmm… I think it's a given that revenue is the kind of performance we are all expecting. Three years into commercialisation, and a year into his role as CEO - Sean and the board know very well what shareholders expect, and a few partnerships wouldn't cut the mustard in my opinion.
How can we say they have not already been achieved?
Income was not given as a performance indicator by which to judge his and the Board's performance.
My opinion only DYOR
FF
AKIDA BALLISTA
Hasn't helped my painting before Christmas!
Oh I really hope so.
I guess it depends how ethical the instos are (open to correction).
The best we can do in Southern NSW is temps in the low 20s (centigrade) though that is a lot cooler than 35C which would be more normal.
Been very consistently nice but breezy.
Thanks FMF, I just watched a YouTube presentation from STEF. It was the opening presentation from Hiroaki Kitano, Executive Vice President and CTO. Towards the end of the presentation he mentions their goal to create an R&D ecosystem, to bring together different companies and technologies to advance our futures and sustainability. Would be great if Brainchip were invited to join their ecosystem.
On ph and road at the mo so no chance to check the links and presentation PDFs etc but found this on the Sony EU side which was earlier this month by looks and a bunch of links on the page to strategy, R&D, AI / AR etc.
Don't know if any snippets of interesting info there.
R&D Strategy Briefing and Sony Technology Exchange Fair (STEF)
Today, Sony Group Corporation (“Sony”) held an R&D strategy briefing in conjunction with the opening of “Sony Technology Exchange Fair 2022” (“STEF 2022”), a cross-group technology exchange event.
Sony’s Purpose is to “fill the world with emotion, through the power of creativity and technology”. At the R&D strategy briefing, Sony set forth its basic approach to R&D to realize this purpose over the long term and to create technologies that will support Sony’s business.
“STEF 2022” introduced a selection of the latest technologies and related efforts of Sony Group’s diverse businesses and R&D organizations under the theme of “Technology that inspires emotion”. STEF is an annual technology exchange event within the Sony Group, which was initiated in 1973 with the aim of creating new value through the exchange of ideas and exposure to a wide range of technologies developed by each business and R&D organization. To commemorate the 50th anniversary of the event, Sony made a portion of the technology exhibition public for the first time.
Also, this one below from earlier this year to do with SSS that came from their Aitrios links (Sony AI).
Several presentation videos on imaging and sensing from what I can see.
Sony Semiconductor Solutions Group released videos of the event
"The 1st Sense the Wonder Day" to share thoughts with stakeholders.
February 16, 2022
On January 25th, Sony Semiconductor Solutions Corporation (SSS) held "Sense the Wonder Day," an event to share with a wide range of stakeholders, including employees, the concept behind the company's new corporate slogan, "Sense the Wonder."
"Sense the Wonder" not only expresses the existence of the SSS Group, but also conveys the SSS Group's desire to "feel more curiosity" and "make the world more full of surprises and excitement."
On the day of the event, more than 8,500 people, including not only SSS Group and Sony Group employees, but also partner companies, students, and members of the press, participated in the online live distribution. SSS President and CEO Terushi Shimizu shared his thoughts on the new corporate slogan and introduced the people and technologies that support the fields of imaging and sensing.
Ya and all my new seedlings which should be powering away are just sitting there with an exclamation bubble above them stating 'What the.......?'
Hasn't helped my painting before Christmas!
All the talk on here sounds like we've got a far greater share than that.
I know fact finder always says 1% of this or that but surely BRN can achieve 5% of all that!
Yes. I think the BrainChip/SiFive partnership started a FOMO snowball picking up ARM and Intel with a bit of help from MegaChips.
Brainchip agrees!
So from your perspective Dio,
Yes. I think the BrainChip/SiFive partnership started a FOMO snowball picking up ARM and Intel with a bit of help from MegaChips.
The Intel Foundry Services IP Alliance:
https://www.intel.com/content/www/us/en/foundry/ifs-accelerator/ip-alliance.html
We could soon see products including Brainchip/ARM SNNs made by Intel Foundry Services (IFS) competing with products including Brainchip/SiFive SNNs made by IFS, and both competing with products including Brainchip/Intel SNNs made by IFS.
Just as Brainchip covered the RISC-IV and RISC-V bases with ARM and SiFive, Intel has now spread its bets to cover the red and the black.
Correct!
I totally agree DB.
How the hell making a profit from a falling share price is legal beggars belief - because of the obvious: manipulation.
The problem with shorters is that over the years they have perfected the art of manipulation, and unless there is an exceptional announcement they will continue to bend us over...
But what can us poor old retailers do? Don't let the pricks get any of your shares.
All good things come to those who wait.
Baron..
Supersleuthing Learning,
So from your perspective Dio,
Should BrainChip also position itself in Intel's Pathfinder ecosystem, to spread Brainchip's wings wider?
Learning
A 3-way Dodgy, I like it, I like it a lot.
Yes. I think the BrainChip/SiFive partnership started a FOMO snowball picking up ARM and Intel with a bit of help from MegaChips.
Thanks Dio,
Supersleuthing Learning,
Intel and SiFive have been collaborating over "a multi-year history", so this would be a solid launch pad to incorporate Akida.
SiFive
“Intel and SiFive, the founder and leader in RISC-V computing, share a multi-year history of collaboration, and we are pleased that Intel selected the SiFive® Performance™ P550 core both as the heart of the Horse Creek development platform and for use with Intel® Pathfinder FPGA-based development tools,” said Phil Dworsky, SiFive Global Head of Strategic Alliances. “Intel® Pathfinder for RISC-V* gives software developers a head start in preparation for the highly anticipated Horse Creek boards, which are on track for delivery this year. SiFive is excited to work with Intel on this project, to engage with mutual customers, and together to fuel innovation in the fast-growing RISC-V ecosystem.”
https://pathfinder.intel.com/news/i...new-capabilities-for-pre-silicon-development/
In fact, they could make an immediate start with the Akida simulator (nee ADE) in MetaTF.
SANTA CLARA, C.A. — August 30, 2022 — Intel® Pathfinder for RISC-V* is launching today to transform the way SOC architects and system software developers define new products. It allows for a variety of RISC-V cores and other IP to be instantiated on FPGA and simulator platforms, with the ability to run industry leading operating systems and tool chains within a unified IDE.
SiFive is still very much the new kid on the block, so having Intel's endorsement will supercharge its market penetration, and AI/one-shot on-chip ML is all the rage.
I think we are looking at event horizon spaghettifying acceleration.
Hey @Diogenese
Hi Fmf,
Hey @Diogenese
Have you ever done a bit of a dive into iCatch Tech?
They're the other partner / collaborator of Prophesee - they released their info around April this year and we did ours around June.
I was looking into them and one of their news items about the IMX636 (Sony's sensor) coupled with Metavision and the iCatch V57 SoC.
I found some info sheets as the AI details were a bit light elsewhere. The Vi37 references CNN if I read right, but the V57 and others just reference an NPU IP.
Appears it's not using low-bit weights - 8-16 bit(?)
But Akida can run higher though if wanted yeah?
Just musing about any possible way they could use our IP as CNN2SNN on their chip which stacks with Sony IMX636 sensor and uses Metavision SDK or whether we would have to come through the Metavision side?
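On the CNN2SNN musing above: one common way CNN-to-SNN conversion is explained is rate coding, where an analog ReLU activation is replaced by a spike count over a time window, so trained CNN weights can be reused. A purely illustrative toy sketch (the function names are made up; this is not BrainChip's or iCatch's actual code):

```python
import random

def relu(x):
    """Standard CNN activation: clamp negatives to zero."""
    return max(0.0, x)

def rate_coded_spikes(activation, max_rate, timesteps, rng):
    """Toy rate coding: emit a spike each timestep with probability
    proportional to the (normalised) activation. The spike count over
    the window then approximates the analog activation."""
    p = min(activation / max_rate, 1.0)  # normalise to [0, 1]
    return sum(1 for _ in range(timesteps) if rng.random() < p)

rng = random.Random(42)
act = relu(0.6)      # analog CNN activation
T = 1000             # observation window in timesteps
spikes = rate_coded_spikes(act, max_rate=1.0, timesteps=T, rng=rng)
estimate = spikes / T  # decoded value converges on act as T grows
```

Longer windows trade latency for accuracy, which is one reason native SNN hardware like Akida pitches event-driven efficiency over this kind of brute-force rate coding.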
News link below and site with chip products.
News
iCatch Technology (www.icatchtek.com) - a fabless IC design company with expertise in image signal processing, machine vision, IC design and system integration.
Products
iCatch Technology (www.icatchtek.com)
Couple snips from news.
Now iCatch has collaborated with Prophesee on Event-based Metavision sensing projects that integrated iCatch’s V57 AI vision processor with the new Sony Semiconductor Solutions ("SSS") stacked Event-based Vision Sensor IMX636, realized in collaboration between SSS and Prophesee.
iCatch built the OpenMVCam development platform for all algorithm partners and ODM customers to design a variety of AI products, systems and applications on many market segments like surveillance, smart healthcare, in-cabin monitor system, smart home, industrial automation, smart city and so on.
SSS’s Event-based Vision Sensor can be the input source for iCatch AI vision processor, based on iCatch built-in NPU acceleration engine and proprietary CV processing hardware engine. It can integrate Prophesee Metavision Intelligence SDK and other 3rd party’s machine vision algorithms to support end customers’ AI vision applications such as DMS and OMS in automotive in-cabin, patient/elder fall detection in home healthcare and hospitals, intruder detection in home surveillance, anomaly detection in industrial automation, gesture control in smart home appliance, eye tracking in AR/VR and so on.
Thanks D
Hi Fmf,
I didn't find any NN related patents for icatch.
The V57 is an Image Signal processor which includes an NN core:
A group of CV engines, comprising an optical flow processing (OPF) accelerator, a matrix arithmetic engine (MAE) with DSP functions and a high-performance NPU engine, can manipulate image and video data in real time to support edge computing and intelligent processing, such as pedestrian detection, vehicle detection, distance extraction, behavior recognition, gesture recognition, or even action recognition, and much more. The embedded USB 3.2 Gen1 device interface supports ultra high-speed and high-resolution video data transfer.
2.3. Image process acceleration engine
Matrix operation engines
Scaling up/down engine
De-warping engine for lens distortion correction (LDC) and multi-view affine transformation
Motion detection engine
HW accelerated optical flow engine for DVS sensor applications
Pre/Post processing for NPU acceleration
2.5. Neural network accelerator
High performance 1.2 TOPS NPU engine
Supports weight/bias quantization using UINT8, INT8, INT16, Float16, BFloat16, and post-training quantization
MAE engine – pre/post DSP accelerator
It does not include SNN.
It would be a case of either/or as far as the iCatch NN is concerned.
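For anyone wondering what the "post-training quantization" in that spec sheet involves: here is a toy sketch of symmetric per-tensor INT8 quantization, round-tripping float weights through 8-bit integers plus a scale factor. Purely illustrative - the helper names are invented and this is not iCatch's implementation:

```python
def quantize_int8(weights):
    """Toy symmetric post-training quantization to INT8:
    scale by the max absolute weight, round, clamp to [-127, 127]."""
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    q = [max(-127, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from INT8 values + scale."""
    return [v * scale for v in q]

w = [0.42, -1.27, 0.05, 0.9]      # pretend these are trained weights
q, s = quantize_int8(w)           # 8-bit integers plus one float scale
w_hat = dequantize(q, s)          # reconstruction, error <= scale/2 per weight
```

The worst-case error per weight is half a quantization step, which is why 8-bit weights usually cost little accuracy - and, as noted above, it's all conventional integer arithmetic, not an SNN.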
They would have been working with Prophesee for some time
https://www.prophesee.ai/2022/04/19/icatch-prophesee-collaboration-ai-vision-processor/
iCatch and Prophesee collaborated on development of AI vision processor natively compatible with Prophesee Event-based Metavision® sensing technologies.
iCatch Technology (iCatch) has focused on image signal processing (ISP) technology and Camera SOC for over two decades and aggressively invested in research and development in more machine learning (ML) related technology and application.
Now iCatch has collaborated with Prophesee on Event-based Metavision sensing projects that integrated iCatch’s V57 AI vision processor with the new Sony Semiconductor Solutions (“SSS”) stacked Event-based Vision Sensor IMX636, realized in collaboration between SSS and Prophesee.
iCatch built the OpenMVCam development platform for all algorithm partners and ODM customers to design a variety of AI products, systems and applications on many market segments like surveillance, smart healthcare, in-cabin monitor system, smart home, industrial automation, smart city and so on.
iCatch also provides a high customized service and an excellent image quality system to become the eyes of all of machine vision equipment and smart devices in the future.
SSS’s Event-based Vision Sensor can be the input source for iCatch AI vision processor, based on iCatch built-in NPU acceleration engine and proprietary CV processing hardware engine. It can integrate Prophesee Metavision Intelligence SDK and other 3rd party’s machine vision algorithms to support end customers’ AI vision applications such as DMS and OMS in automotive in-cabin, patient/elder fall detection in home healthcare and hospitals, intruder detection in home surveillance, anomaly detection in industrial automation, gesture control in smart home appliance, eye tracking in AR/VR and so on.
So it looks like you have found the missing link in the Sony/Prophesee/? triangle.
But it was after this that Prophesee confessed its undying love for Akida. So, as I speculated when the Sony/Prophesee collaboration was announced, unfortunately they are probably contractually bound to iCatch for the first born.
So, as someone speculated above, was the Apple CEO inspecting iCatch's offspring, or was there a newer and better love child in the cradle?
You've got to ask yourself one question - would Sony or Prophesee be happy showing off an inferior solution?
Just found these earlier references and distribution partner. No mention of iCatch?