BRN Discussion Ongoing

ItsKane

Regular
G’day FF, how is your $2-ish share price looking by Christmas? My $4.25 is looking very unrealistic 😬
This time last year $1 felt so far away, yet it was so close.
 
  • Like
  • Love
  • Fire
Reactions: 18 users

TopCat

Regular
Came across this while looking into Qualcomm’s Ride platform. I was trying to find a link between Qualcomm, Prophesee and autonomous vehicles. Not sure of the date though. View attachment 18165
Could the new vision application come from Prophesee 🤔 and could the new ADAS SoC use Akida 🤔? Could that be the same hardware foundation mentioned?

Qualcomm is announcing a new generation of its Snapdragon ADAS SoC and an entirely new family of SoCs aimed just at automotive vision applications.


The Snapdragon Ride Vision is a new class of SoC. The Vision SoC will use the same technology as the ADAS SoC but is optimized for vision applications.

But most important is that the new Vision SoCs will be software-compatible with the ADAS SoC and Accelerator because they use the same hardware foundation.
 
  • Like
  • Thinking
Reactions: 8 users
G’day FF, how is your $2 ish share price looking by Christmas? my $4.25 is looking very unrealistic 😬
Ask me at Christmas. 🤣😂🤣 Puto had not decided to conquer Europe starting with the Ukraine when I made my prediction.

His present idea that he will nuke the Ukraine, Berlin and London as punishment if the West does not let him win could probably reduce the chances of $2.75.

All that before we get to ‘what will I do today to destroy investor confidence and the pound’ Prime Minister Truss. Throw in the US Federal Reserve and its crush-inflation-at-any-cost policy and things become very uncertain.

Now, the right size of the next 4C lump, a couple of significant patent announcements and a few extra significant ecosystem partners, and who knows, we may all have a whole turkey for Christmas.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 58 users

JK200SX

Regular
Ali Muhammad​

Tampere, Pirkanmaa, Finland​

1,604 followers · 500+ connections​

EUROPEAN DYNAMICS
Tampere University of Technology

About​

I am passionate about modern robotics and the opportunities it can bring us. I am interested in understanding how digitalization and robotics together can be most beneficial for future societies. I formulate international industry and academia projects on these topics with multi-disciplinary and multi-cultural teams to make robotics and technology part of the solution for the grand challenges faced by our generation.

Top name, this bloke.
I bet he floats like a bee and stings like a butterfly :)

(sorry, couldn't help myself)
 
  • Haha
  • Like
Reactions: 14 users

equanimous

Norse clairvoyant shapeshifter goddess
Came across this while looking into Qualcomm’s Ride platform. Was trying to find a link between Qualcomm and Prophesee and autonomous vehicles. Not sure of the date though. View attachment 18165
Possibilities...

Location: Markham, CAN

Time type: Full time

Posted: 8 days ago

Job requisition ID: 3037856


Company:

Qualcomm Canada ULC

Job Area:

Engineering Group, Engineering Group > ASICS Engineering

General Summary:

Qualcomm is the world's leading developer of next-generation always-connected Edge-AI processing technology and is committed to building a world-class organization that will lead the industry. Be part of the team developing next-generation Audio and Always-On / Edge AI subsystems and platforms. The ASIC Edge AI Systems Architect is responsible for system architecture definition activities supporting a sophisticated multimedia Low Power Audio/Edge AI subsystem across a broad range of mobile, XR, compute, and automotive products.

Location: Markham, CAN

Time type: Full time

Posted: 9 days ago

Job requisition ID: 3040917


Company:

Qualcomm Canada ULC

Job Area:

Engineering Group, Engineering Group > ASICS Engineering

General Summary:
Be a part of the Audio team responsible for the development of Next Generation Audio and Artificial Intelligence (AI) processing that is key to delivering the latest voice and audio features expected in the mobile, compute, automotive and IoT markets. The Audio and AI subsystem is responsible for high-fidelity audio across an array of interfaces, as well as smart voice assistant HW accelerators that are context-aware based on a suite of sensors. With a dedicated DSP, our industry-leading power-efficient solutions allow for continual operation between all power states with minimal impact on battery life.

An ideal candidate is an embedded Firmware Engineer interested in helping to validate and develop world-class low-power audio and AI solutions. You will work within a multi-disciplinary, multi-site team of architects, designers and verification engineers, and will be responsible for silicon validation and reference firmware development for next-gen audio IP cores within the Audio and AI subsystem.


 
  • Like
  • Thinking
Reactions: 6 users

equanimous

Norse clairvoyant shapeshifter goddess
Ask me at Christmas. 🤣😂🤣 Puto had not decided to conquer Europe starting with the Ukraine when I made my prediction.

His present idea that he will nuke the Ukraine, Berlin and London as punishment if the West does not let him win could probably reduce the chances of $2.75.

All that before we get to ‘what will I do today to destroy investor confidence and the pound’ Prime Minister Truss. Throw in the US Federal Reserve and crush inflation at any cost policy and things become very uncertain.

Now the correct size of the next 4C lump, a couple of significant patent announcements and a few extra significant ecosystems partners and who knows we may all have a whole turkey for Christmas.

My opinion only DYOR
FF

AKIDA BALLISTA
Brainchip has the ability to surprise us at Xmas time. I told Santa I was a good boy this year
 
  • Like
  • Haha
  • Love
Reactions: 19 users

HopalongPetrovski

I'm Spartacus!
Brainchip has the ability to surprise us at xmas time. I told santa I was a good boy this year
Last time I was in Santa's lap he gave me a hot tip! 🤣

 
  • Haha
  • Like
  • Sad
Reactions: 13 users

equanimous

Norse clairvoyant shapeshifter goddess
This is an interesting person to follow

Kashu Yamazaki

University of Arkansas
Verified email at email.uark.edu - Homepage
Computer Vision · Robotics


TITLE · CITED BY · YEAR

Deep reinforcement learning in computer vision: a comprehensive survey
N Le, VS Rathour, K Yamazaki, K Luu, M Savvides
Artificial Intelligence Review, 1-87
Cited by 25 · 2021

Offset Curves Loss for Imbalanced Problem in Medical Segmentation
N Le, T Le, K Yamazaki, TD Bui, K Luu, M Savvides
International Conference on Pattern Recognition (ICPR 2020)
Cited by 10 · 2020

A Multi-task Contextual Atrous Residual Network for Brain Tumor Detection & Segmentation
N Le, K Yamazaki, D Truong, KG Quach, M Savvides
International Conference on Pattern Recognition (ICPR 2020)
Cited by 10 · 2020

Narrow band active contour attention model for medical segmentation
N Le, T Bui, VK Vo-Ho, K Yamazaki, K Luu
Diagnostics 11 (8), 1393
Cited by 6 · 2021

Development of a Soft Robot Based Photodynamic Therapy for Pancreatic Cancer
Y Li, Y Liu, K Yamazaki, M Bai, Y Chen
IEEE/ASME Transactions on Mechatronics 26 (6), 2977-2985
Cited by 6 · 2021

Agent-Environment Network for Temporal Action Proposal Generation
VK Vo-Ho, N Le, K Yamazaki, A Sugimoto, MT Tran
ICASSP 2021-2021 IEEE International Conference on Acoustics, Speech and …
Cited by 5 · 2021

Spiking neural networks and their applications: A Review
K Yamazaki, VK Vo-Ho, D Bulsara, N Le
Brain Sciences 12 (7), 863
Cited by 4 · 2022

ABN: agent-aware boundary networks for temporal action proposal generation
K Vo, K Yamazaki, S Truong, MT Tran, A Sugimoto, N Le
IEEE Access 9, 126431-126445
Cited by 4 · 2021

Invertible Residual Network with Regularization for Effective Medical Image Segmentation
K Yamazaki, VS Rathour, T Le
SPIE Medical Imaging
Cited by 4 · 2021

Minimally invasive intraperitoneal photodynamic therapy using a new soft robot system
Y Liu, K Yamazaki, D Zhang, Y Li, M Su, Q Xie, Y Chen, M Bai
Optical Methods for Tumor Treatment and Detection: Mechanisms and Techniques …
Cited by 2 · 2020

AEI: Actors-Environment Interaction with Adaptive Attention for Temporal Action Proposals Generation
K Vo, H Joo, K Yamazaki, S Truong, K Kitani, MT Tran, N Le
The British Machine Vision Conference (BMVC)
Cited by 1 · 2021

Gearbox Development for an Emergency Brake System of the Wind Turbine
E Sirotkin, K Yamazaki, A Miroshnichenko
IOP Conference Series: Earth and Environmental Science 459 (2), 022010
Cited by 1 · 2020

Neural architecture search for medical image applications
VK Vo-Ho, K Yamazaki, H Hoang, MT Tran, N Le
Meta-Learning with Medical Imaging and Health Informatics Applications, 369-384
2023

VLCap: Vision-Language with Contrastive Learning for Coherent Video Paragraph Captioning
K Yamazaki, S Truong, K Vo, M Kidd, C Rainwater, K Luu, N Le
IEEE International Conference on Image Processing (ICIP 2022)
2022

Meta-Learning of NAS for Few-shot Learning in Medical Image Applications
VK Vo-Ho, K Yamazaki, H Hoang, MT Tran, N Le
arXiv preprint arXiv:2203.08951
2022

Invertible residual network with regularization for effective volumetric segmentation
K Yamazaki, VS Rathour, THN Le
Medical Imaging 2021: Image Processing 11596, 269-275
2022

Towards Sensorimotor Coupling of a Spiking Neural Network and Deep Reinforcement Learning for Robotics Application
K Yamazaki
2020
 
  • Like
Reactions: 4 users

TheFunkMachine

seeds have the potential to become trees.
I read someone else suggesting this could potentially be SynSense and not Akida. Have we explored that further? Does the SynSense tech boast 100x power efficiency over GPUs?

If this is not Akida, then it is cause for concern, especially as we know Mercedes has boasted 5-10x power savings on their “Hey Mercedes” applications.

In my opinion this has to be Akida, as I haven’t heard of anyone else boasting these kinds of results, but I don’t want to ignore potential competitors if there are any.
I retract my concerns. As FF pointed out, Akida has been stated to be 1000x more efficient than GPUs, so the statement made by Mercedes fits perfectly when it says it is 5-10x more power efficient than other systems. (Not GPUs, but other AI chips.)

It’s great to be a shareholder!

And how great is this announcement about the Prophesee Brainchip partnership. Another great partner that will eventually lead to great products and greater revenue ;)

I don’t know what people’s opinions are on when royalty revenue starts flowing, but I believe it was stated somewhere recently in an interview (not sure when it was posted) that Akida will be in products roughly 12-18 months from the interview. It is important to keep this in mind so as not to get frustrated. In my opinion it will be well worth the wait. The SP will go up and down, but don’t underestimate revenue from the likes of IP licence fees in the next few quarters before we start seeing royalties. 🎉

This is of course only possible if the world doesn’t blow up first. Which in that case, I can’t wait to meet you Jesus 😂
 
  • Like
  • Love
  • Haha
Reactions: 17 users

Violin1

Regular
Hi @Violin1
I just finished listening again, for the fourth time, trying to hear what you heard, and the only reference to two years related to their work on the Metavision platform.

However, this time I took the time to hear, through Luca Verre's heavy accent, what he was saying on one issue, and what he said really needs to be taken notice of: when he and his partner started out building their vision sensor, they knew they were building a "house of straw".

They knew that the human eye was only one part of the equation: it took in the light and converted it, but it was the brain that then processed and acted upon what the eye had seen. So too with their sensor: it was only half of the equation until they found Brainchip.

They had, until this point, been using optimised old-school technology to process what they were collecting with their sensor, and in many cases were not achieving optimum performance, but AKIDA resolved that issue and completed their vision.

Now, harking back to SynSense: @Diogenese has pointed out probably half a dozen times here that, to be quite blunt, they are not fit to wipe AKIDA's feet (my interpretation of his science). But as a result of the podcast, if you take the list of existing Prophesee partners, we now know as a fact that both Sony and Renault have been pulling up short of being able to fully exploit Prophesee's technology without AKIDA.

If you add in Nviso and the claims it has made about what it can achieve using AKIDA to process its applications, it becomes blindingly obvious that long-term visionary investors have got hold of the comet's tail, and while it might currently be pretty stable, in the foreseeable future it will dramatically increase speed due to the gravitational forces as it rounds the Sun and catapults back across the universe. So hold on very tight.

Next year's AGM is getting closer with each passing day.

My opinion only DYOR
FF

AKIDA BALLISTA
Thanks for that @Fact Finder
I've had a couple more listens too. I think it was late-night excitement that confused Luca's discussion for me. His descriptions of the earlier work, and of Akida completing the whole, gave me the impression of an earlier collaboration. I also had a very quick re-look at Jack Shelby's comment about Brainchip and Akida, which came a couple of weeks before the partnership announcement - so clearly it is the last few months, not earlier as I hypothesised. Still, as you describe, now that it has started it will likely sweep right across Prophesee's offerings.

Every day puts in place another building block, and yesterday's podcast was a big one. I still think 2023 is where we'll see real change, but listening to Luca just solidified the foundations of my investment structure!

Bbbbbbb....ballista
 
  • Like
  • Fire
  • Haha
Reactions: 19 users

Sam

Nothing changes if nothing changes
Ask me at Christmas. 🤣😂🤣 Puto had not decided to conquer Europe starting with the Ukraine when I made my prediction.

His present idea that he will nuke the Ukraine, Berlin and London as punishment if the West does not let him win could probably reduce the chances of $2.75.

All that before we get to ‘what will I do today to destroy investor confidence and the pound’ Prime Minister Truss. Throw in the US Federal Reserve and crush inflation at any cost policy and things become very uncertain.

Now the correct size of the next 4C lump, a couple of significant patent announcements and a few extra significant ecosystems partners and who knows we may all have a whole turkey for Christmas.

My opinion only DYOR
FF

AKIDA BALLISTA
February, March I was trying to work out how many Mercedes’ I was going to cram into my shed 😂 Come on now, we were all getting very excited, thinking to ourselves “is this it? Is it about to go to that life-changing figure?” But alas, here we are, waiting to celebrate Brainchip hitting that dollar mark again… Yes, the war and the ever-threatening recession do not help the cause. I’m excited to read the next quarterly though; a good bit of perspective, and possibly time to take a breath.
 
  • Like
  • Love
Reactions: 8 users

Diogenese

Top 20
I am only posting this article because I know I can rely on everyone here to remain calm and not read too much into what Luca Verre says about the project with Sony back in October last year regarding putting processing into the sensor:

Image Sensors World

News and discussions about image sensors


Thursday, October 14, 2021

Prophesee CEO on Future Event-Driven Sensor Improvements​


IEEE Spectrum publishes an interview with Prophesee CEO Luca Verre. There is an interesting part about the company's next generation event-driven sensor:

"For the next generation, we are working along three axes. One axis is around the reduction of the pixel pitch. Together with Sony, we made great progress by shrinking the pixel pitch from the 15 micrometers of Generation 3 down to 4.86 micrometers with generation 4. But, of course, there is still some large room for improvement by using a more advanced technology node or by using the now-maturing stacking technology of double and triple stacks. [The sensor is a photodiode chip stacked onto a CMOS chip.] You have the photodiode process, which is 90 nanometers, and then the intelligent part, the CMOS part, was developed on 40 nanometers, which is not necessarily a very aggressive node. Going for more aggressive nodes like 28 or 22 nm, the pixel pitch will shrink very much.

The benefits are clear: It's a benefit in terms of cost; it's a benefit in terms of reducing the optical format for the camera module, which means also reduction of cost at the system level; plus it allows integration in devices that require tighter space constraints. And then of course, the other related benefit is the fact that with the equivalent silicon surface, you can put more pixels in, so the resolution increases. The event-based technology is not following necessarily the same race that we are still seeing in the conventional [color camera chips]; we are not shooting for tens of millions of pixels. It's not necessary for machine vision, unless you consider some very niche exotic applications.

The second axis is around the further integration of processing capability. There is an opportunity to embed more processing capabilities inside the sensor to make the sensor even smarter than it is today. Today it's a smart sensor in the sense that it's processing the changes [in a scene]. It's also formatting these changes to make them more compatible with the conventional [system-on-chip] platform. But you can even push this reasoning further and think of doing some of the local processing inside the sensor [that's now done in the SoC processor].

The third one is related to power consumption. The sensor, by design, is actually low-power, but if we want to reach an extreme level of low power, there's still a way of optimizing it. If you look at the IMX636 gen 4, power is not necessarily optimized. In fact, what is being optimized more is the throughput. It's the capability to actually react to many changes in the scene and be able to correctly timestamp them at extremely high time precision. So in extreme situations where the scenes change a lot, the sensor has a power consumption that is equivalent to a conventional image sensor, although the time precision is much higher. You can argue that in those situations you are running at the equivalent of 1000 frames per second or even beyond. So it's normal that you consume as much as a 10 or 100 frame-per-second sensor. [A lower power] sensor could be very appealing, especially for consumer devices or wearable devices where we know that there are functionalities related to eye tracking, attention monitoring, eye lock, that are becoming very relevant.
"

My opinion only, so DYOR
FF

AKIDA BALLISTA
Hi FF,

The SynSense/Prophesee partnership was announced in October 2021, so it's probable that Luca was talking about SynSense when he spoke about the second axis - "integration of the processing capability".

I really think it would not have taken them a week to realize that Akida was the missing link, and then they would have needed to do all the DD testing. Of course, sorting out the details of the partnership would have taken much longer, but I can't see that Prophesee would have been actively testing Akida last year.

In fact, our association with Benosman dates from April 2022, when PvdM made contact with him, although Benosman disavows any current contact with Prophesee. PvdM said:
"I hope they will connect those event-based cameras to one of our Akida chips" while expressing his wish to be the rocket man.

Posted by @thelittleshort :
@Fact Finder you know if this went anywhere?

Benosman is co-founder of grAI Matter Labs, but makes a point of disassociating himself from that and the other companies he has founded

A possible collaboration by former “rivals”?

There’s a link below to a YouTube video, “Why is Neuromorphic Event-based Engineering the future of AI?”, in which the BrainChip logo appears on a slide (5:55); however, we are not mentioned explicitly


[slide screenshots attached]
 
  • Like
  • Fire
  • Love
Reactions: 29 users

Deadpool

hyper-efficient Ai
I am only posting this article because I know I can rely on everyone here to remain calm and not read too much into what Luca Verre says about the project with Sony back in October last year regarding putting processing into the sensor:

Image Sensors World, Thursday, October 14, 2021: Prophesee CEO on Future Event-Driven Sensor Improvements (quoted in full above)

My opinion only, so DYOR
FF

AKIDA BALLISTA
@Fact Finder has done it again. If only Banjo Paterson were alive today, I’m sure he would have penned a poem immortalising the great champion of BRN that is Fact Finder. Well done, once again. May your persona, and especially your sleuthing abilities, be captured by the descendants of the true believers and remain in BRN folklore for eternity.

It's great to be a shareholder:cool:
 
  • Like
  • Haha
  • Love
Reactions: 25 users
Hi FF,

The SynSense/Prophesee partnership was announced in October 2021, so it's probable that Luca was talking about SynSense when he spoke about the second axis - "integration of the processing capability".

I really think it would not have taken them a week to realize that Akida was the missing link, and then they would have needed to do all the DD testing. Of course, sorting out the details of the partnership would have taken much longer, but I can't see that Prophesee would have been actively testing Akida last year.

In fact, our association with Benosman dates from April 2022, when PvdM made contact with him, although Benosman disavows any current contact with Prophesee. PvdM said:
"I hope they will connect those event-based cameras to one of our Akida chips" while expressing his wish to be the rocket man.

Sounds a plausible theory, but given the familiarity between Rob Telson and Luca Verre, it is also possible that, before Peter here in Australia contacted Ryad (the person who claims to have no contact or involvement with Prophesee), Rob Telson, as Vice President of Sales as he then was, contacted Prophesee after Anil read about SynSense and said something like “we could do that better than them”.

No opinion, just another possibility, as I don’t believe that SynSense would have fooled Luca Verre into believing they were a solution.

FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 12 users
On that note I thought it would be helpful to provide the following list of published Prophesee partners for @Diogenese:

1. Sony

2. Century Arks

3. IMAGO - SpaceX seems to be a customer

4. Datalogic

5. LUCID

6. CIS Corporation

7. FRAMOS Imaging

8. MV Technology

9. Renault

10. Xiaomi

11. Qualcomm

By extension these are all now part of the growing Brainchip Ecosystem.

My opinion only DYOR
FF

AKIDA BALLISTA
Hi @Diogenese

I have added a couple more.

Regards
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 13 users
Reflecting on the wonderful podcast yesterday with Luca, he referred to three practical uses for Prophesee’s technology: barcode scanning, automotive and smartphone cameras. Rob makes a specific point in the podcast about the smartphone usage and says that listeners need to know and understand that this is where Brainchip (with Prophesee) wants to go (min 19:45). Luca specifically mentions blur-less video achieved by combining event-based sensors with frame-based cameras.

We know Prophesee has a partnership with Sony and has been working with them in this area. I was reminded of an older promotional video put out by Sony demonstrating and promoting blur-less video from an event-based sensor combined with frame-based cameras. This video has been posted previously, but it caught my attention in light of the podcast. The entire promotional video is worth watching again as it looks like an AKIDA promotional video (I am slightly biased), but have a look from min 2:26.

 
  • Like
  • Love
  • Fire
Reactions: 33 users

Sirod69

bavarian girl ;-)
prophesee on LinkedIn:

It’s the last day of VISION at Messe Stuttgart. Come check out our new lineup of demonstrations and products featuring our partners MVTec Software GmbH, FRAMOS, Xperi Inc., LUCID Vision Labs, Inc., IMAGO Technologies GmbH and Century Arks.

Visit us at Booth 8B29 to experience first-hand how we reveal the invisible for demanding #machinevision applications with our #neuromorphic #Metavision® platform.

Hear from our expert Gareth Powell 🧑‍💻 at a special presentation “A World Between Frames” today at 12:20 PM at Booth C70.

Time to join our fast-growing 🌍 community of 5,000+ developers driving the paradigm shift with #eventcameras!

#VISIONSTR #machinevision #computervision #industrialautomation #eventcamera
 
  • Like
  • Fire
  • Love
Reactions: 27 users

Straw

Guest
It seems the bots have developed a stutter.
3:56pm onwards is just appalling.
Correction: the people licensed and encouraged by the ASX to undertake that activity are appalling. How is it that a market operator can be anything but publicly run and at least plausibly accountable?
 
Last edited by a moderator:
  • Like
  • Love
Reactions: 16 users

Baisyet

Regular
Hi @Lex555
One of the things that litigation taught me, and which applies to my approach to investing, is to ensure you have access to a reliable expert and know when to step back and accept their opinion.

Yesterday I spent some time looking at the Prophesee partners and decided that, based on Luca Verre and Ryad Benosman, AKIDA will be included as part of the full stack of Prophesee’s event-based vision sensing solutions, so at some point every single one will be using AKIDA.

The only question is when, not if, and it is entirely possible that, if the Brainchip and Prophesee initial engagement goes back far enough, a product like your camera could be the first commercial product.

This raises the problem of how we know ‘when’ the full stack is being implemented.

As both Prophesee and Brainchip are spike-based technologies, it is going to be near impossible for laypersons to know when this has occurred unless there is an explicit statement from Prophesee or Brainchip.

The only solution otherwise will be to rely upon an independent external expert to interpret specification sheets or patents.

Enter @Diogenese, retired engineer and resident TSEx expert.

Of course, our expert is going to be hamstrung to some extent because he will be totally reliant upon what the publicly available documents contain, so I suspect we will still be forced to look to earnings as an indicator on most occasions.

My opinion only DYOR
FF

AKIDA BALLISTA
From Luca

 
  • Like
  • Fire
  • Love
Reactions: 12 users