BRN Discussion Ongoing

BaconLover

Founding Member
Qualcomm mentioned 2023 launch but Prophesee has mentioned 2024.

Based on all the new phone models being released in February/March, most likely Akida/Prophesee will be in top-end models of Samsung, Oppo, Vivo etc. next year.

So if the new models will be hitting stores in February/March 2024 they must go into production a few months earlier towards the end of CY2023.

In order to use Akida IP for production there will have to be a licence agreement signed between now & say October/November this year. All depends on production commencement date.

My understanding was that they don't ''have'' to sign an agreement directly with BRN, if they use the likes of Megachips or Renesas and get our IP, in which case Brainchip would get the royalties from the commercial products manufactured/sold.
 

AusEire

Founding Member. It's ok to say No to Dot Joining
Hi @miaeffect

I’ve been following this tax grab with much disappointment.

I've gone without many things and saved for many decades to accumulate my wealth. I left home at 17 with $250 to my name and have worked hard for what I've got. For example, for the last 20 years I've contributed 15% of my wage to my super. And now I am taking an educated risk to grow it on the ASX in my SMSF.

I imagine this new tax regime would have a substantial effect on those with larger holdings of BRN as they cash some out to live.

$3M is too low a threshold in this day and age at which to raise the tax, and I wouldn't consider $3M overly wealthy in comparison to cost-of-living increases and house prices. If you have been smart with your money, that's not too ambitious over a 40-year savings plan.

My daughter for example although only 20 has started a modest plan which is quite achievable aimed at having $7M by the time she is 60; and I’m not sure that will be enough in 40 years time.

I myself would have thought somewhere between $5-10M, increasing with CPI, would be a more reasonable threshold before the tax is raised.


This will also affect those who do a little bit of trading in their super, whether it be to accumulate more or to make a bit of profit on the side. The rate will increase from 15% to 30%, so I see it as too big a risk to try and time/judge those thresholds.


I'm hoping it becomes a hot political issue and is changed to something more reasonable between now and 2025.

Just for the record; I don’t mind paying some tax but I do hate how much of it is wasted and mismanaged, but that’s another issue!


:)

P.s. I’m hoping by 2031 when I can cash some out I will be able to live off the dividends each year but it will obviously affect my bottom line!
I've had a few back and forths about this via twitter over the last few days.
Most people cannot see the bigger picture here. This is an attack on Super. Plain and simple.

If it's accepted it makes it acceptable to make further changes and dare I say it lower the cap!

The big bad rich man/woman is the reason the Gov has no money, and it's easy to play on the naive with this message.

Never forget what Chalmers called super. The Honeypot.
 

Steve10

Regular
My understanding was that they don't ''have'' to sign an agreement directly with BRN, if they use the likes of Megachips or Renesas and get our IP, in which case Brainchip would get the royalties from the commercial products manufactured/sold.

I don't think Qualcomm will be buying IP from anybody else apart from directly via BRN.

Qualcomm are ruthless & negotiate the lowest royalty. They have done the same with ARM.

Qualcomm acquired Nuvia who were on a separate licence agreement with ARM. Qualcomm tried to use their licence agreement with ARM to pay lower royalties on Nuvia chips & are now being sued by ARM.

It all started when Qualcomm announced and finally acquired processor design startup Nuvia in 2021. Nuvia was developing a new CPU architecture that it claims is superior to anything in the market. Qualcomm has publicly stated that it will use Nuvia designs and the team for its entire portfolio, including smartphones, tablets, PCs, Automotive, IoT and others.

Nuvia's designs run Arm's instruction set. It had an Instruction Set Architecture (ISA) license from Arm, with certain licensing fees. This license is also known as Architecture License Agreement (ALA) in legal documents. Since Qualcomm also has an ALA with Arm, with a different licensing fee structure, there is a difference of opinion between Qualcomm and Arm on which contract should apply to Nuvia's current designs and its evolutions.

According to the court documents, the discussions between Qualcomm and Arm broke down, and unexpectedly, Arm unilaterally cancelled Nuvia's ALA and asked it to destroy all its designs. It even demanded that Qualcomm not use Nuvia engineers for any CPU designs for three years. Arm officially filed the case against Qualcomm on August 31, 2022.

Qualcomm has both an ALA and a Technology License Agreement (TLA) with Arm. The former is required if you are using only Arm's instruction set, and the latter if you use cores designed by Arm. TLA fees are orders of magnitude higher than ALA fees. Qualcomm currently uses Arm cores and TLA licensing. According to Strategy Analytics analyst Sravan Kundojjala, it pays an estimated 20 – 30 cents per chip to Arm.

Since Qualcomm negotiated the contract years ago, its ALA rate is probably very low. So, if Qualcomm adopts Nuvia designs for its entire portfolio, it will only pay this lower ALA fee to Arm. For Arm, that puts all the revenue coming from Qualcomm at risk. That is problematic for Arm, especially when it is getting ready for its IPO.

With the Nuvia acquisition, Arm saw an opportunity to renegotiate Qualcomm's licensing contract. Moreover, Nuvia's ALA rate must be much higher than Qualcomm's. That is because of two reasons. First, Nuvia was a startup with little negotiation leverage. And second, it was designing higher-priced, low-volume chips, whereas Qualcomm primarily sells lower-priced, high-volume chips. So, it is in Arm's favor to insist Qualcomm pay Nuvia's rate. But Qualcomm disagrees, as it thinks its ALA covers Nuvia designs.

 

Evermont

Stealth Mode
Howdy All,

This is showing as having been published 3 hours ago. I don't think it's been posted, but if so, I'll delete.

💋


Neuromorphic vision sensors are coming to smartphones

By Dan O'Shea | Feb 28, 2023 03:45pm
Tags: Prophesee SA, Qualcomm Snapdragon, neuromorphic, image sensors


A new partnership teaming Prophesee and Qualcomm will optimize event-based neuromorphic vision sensors for use in mobile devices, resulting in better images from device cameras. (Prophesee).

Prophesee, a provider of event-based neuromorphic vision sensor technology, announced a partnership with Qualcomm Technologies at the massive Mobile World Congress event in Barcelona, Spain, this week, a collaboration that will see Prophesee’s Metavision sensors optimized for use with Qualcomm’s Snapdragon platform to bring neuromorphic-enabled image capture capabilities to mobile devices.

Event-based neuromorphic vision sensor technology could be a game changer for camera performance, with neuromorphic capabilities processing movement and moments closer to the way the human brain processes them–with less blurring and more clarity than a frame-based camera can manage. Specifically, the technology allows cameras to perform better while capturing fast movements and scenes in low lighting in their photos and videos. For the most part, the consumer marketplace is still waiting to get their hands on devices with these capabilities, but the new partnership means that wait is growing shorter.

Later this year, Prophesee plans to release a development kit to support the integration of the Metavision sensor technology for use with devices that contain next-generation Snapdragon platforms. After that, it will not be too much longer before consumers can experience the benefits of event-based neuromorphic vision sensor technology themselves.

Luca Verre, co-founder and CEO of Prophesee, told Fierce Electronics, “We expect phones with this feature/capability to be in the market by 2024. It will likely appear in ‘flagship’ models first.”

When the technology arrives, it is not expected to replace traditional frame-based sensors, but instead work in tandem with them, with much of the camera performance improvement coming through Prophesee’s event-based continuous and asynchronous pixel sensing approach which will help in the “de-blurring of images” and the highlighting of focal points that otherwise might become lost where low lighting intrudes on a captured moment.


That could make consumers much happier about the quality of the pictures they take on their mobile devices, although there is a good chance they may not even know they will have neuromorphic sensors to thank for the improvement, as they probably will not have to think about switching into a different photo capture mode to take advantage of neuromorphic sensing.

“It’s unlikely that smartphones would have a ‘neuromorphic mode,’ but instead would work seamlessly with the existing image capture capabilities - but, in theory, that could be something the OEM could consider,” Verre said. “Note that using an event based camera actually reduces the amount of data processed, capturing only things in a scene that move, which are often ‘invisible’ to traditional cameras, so it is largely an augmentation of traditional frame based cameras (and other sensors, such as lidar in a car), not a replacement, especially in consumer applications.”

These sensors already are used in other kinds of applications, including business and industrial use cases such as security cameras, surveillance, preventative maintenance, vibration monitoring, high speed counting, and others where event cameras can work “as a standalone machine vision option,” Verre said, adding, “There is vast potential in the idea of sensor fusion, combining event-based sensors with other types of sensors, like frame-based cameras.”

The Qualcomm partnership comes almost a year after Prophesee announced a partnership with Sony that revolved around enabling improved integration of event-based sensing technology into devices, and Verre said the migration of the technology to mobile phones likely will be smoother as a result of the earlier partnership. Working with Sony, a leading CMOS sensor provider for the mobile industry, helped make the sensors “more applicable for mobile (smaller size, lower power, etc.) with 3D stacking manufacturing techniques,” he said.

Prophesee also sees the technology as having potential in other mobile and wearable device applications, such as augmented reality headsets. Prophesee already is talking to OEMs about moving in that direction, and Verre said the company believes that “by enabling a paradigm shift in sensing modalities with this approach, there are countless applications that can benefit.”
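To make Verre's point about data reduction concrete, here is a minimal, purely illustrative Python sketch of the difference between frame-based and event-based capture: only pixels whose brightness changes beyond a threshold produce output. The frame size, threshold and function name are assumptions for illustration, not Prophesee's actual pipeline.

```python
# Illustrative only: event generation from the difference between two frames.
# Real event sensors do this per pixel in analogue hardware, asynchronously.
import numpy as np

def frame_to_events(prev_frame: np.ndarray, curr_frame: np.ndarray, threshold: float = 15.0):
    """Return (x, y, polarity) events for pixels whose brightness changed."""
    diff = curr_frame.astype(np.int16) - prev_frame.astype(np.int16)
    ys, xs = np.nonzero(np.abs(diff) >= threshold)
    return [(int(x), int(y), 1 if diff[y, x] > 0 else -1) for x, y in zip(xs, ys)]

# A mostly static scene produces very few events, which is why an event camera
# has far less data to process than a frame camera reading out every pixel.
prev = np.full((480, 640), 100, dtype=np.uint8)
curr = prev.copy()
curr[200:210, 300:310] += 40          # a small moving object brightens a patch
events = frame_to_events(prev, curr)
print(f"{len(events)} events vs {480 * 640} pixels per full frame")
```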


Nice pick-up @Bravo

Last paragraph below has synergies with one of the AGM slides last year. Can't wait to see where we go from here.

Prophesee also sees the technology as having potential in other mobile and wearable device applications, such as augmented reality headsets. Prophesee already is talking to OEMs about moving in that direction, and Verre said the company believes that “by enabling a paradigm shift in sensing modalities with this approach, there are countless applications that can benefit.”



buena suerte :-)

BOB Bank of Brainchip
I simply cannot believe that in today's age, there are still people who invest in stocks and base their actions on what anonymous individuals write in a forum. Not only because it is always emphasized that one should do their own research, but also because it is frightening to think that one is entrusting their hard-earned money to strangers without knowing them. A forum serves to exchange information in order to get a rough idea of the company and to catch up on anything missed. But it is by no means a platform for investment advice! So don't blame those who post positive or negative comments here, but blame yourselves because you allow yourselves to be influenced. Seek out an investment advisor and attend seminars if you don't know how the stock market works! Just my opinion... no trading recommendations.
And that is why zeeb0t has added this to the website!

No Advice - Opinions Only - All postings are opinion only as unlicensed individuals. There can be no representation of licensed advice and by using this website you understand that any post is general information and individual opinion only. You will not make any investment decision based on general information or opinion and will consider seeking advice from a licensed financial provider if you require it. Posters may be anonymous or impossible to identify, and you may not be able to recover losses or access ASIC-approved dispute resolution processes by relying on posts.

Parts of this website offer an opportunity for users to post and exchange opinions and information in certain areas of the website. The Stock Exchange Forum does not filter, edit, publish or review Comments prior to their presence on the website. Comments do not reflect the views and opinions of The Stock Exchange Forum, its agents and/or affiliates. Comments reflect the views and opinions of the person who post their views and opinions. To the extent permitted by applicable laws, The Stock Exchange Forum shall not be liable for the Comments or for any liability, damages or expenses caused and/or suffered because of any use of and/or posting of and/or appearance of the Comments on this website.
 

buena suerte :-)

BOB Bank of Brainchip
[Quoting @Bravo's post above: "Neuromorphic vision sensors are coming to smartphones" (Fierce Electronics, Feb 28, 2023)]

Very nice @Bravo obviously making up for lost time in "Hoarders paradise" :cool:
 

Slade

Top 20
[Quoting @Steve10's post above on the Arm v Qualcomm licensing dispute]

Great post and a good update re Arm and Qualcomm. It reminded me of what Renesas’s Sailesh Chittipeddi recently said in an interview. I think he was referring to the Akida inspired chip that they are producing when he said something like “we will see where the demand lies and decide if we take things in-house. We prefer to do everything in-house”. I have thought about this ‘in-house’ a lot and I think it is the preferred option of Qualcomm that you describe in your post. Mercedes has also made similar comments about their MB.OS stating that it gives them control. Thanks.
 

JB49

Regular
[Quoting @Bravo's post above: "Neuromorphic vision sensors are coming to smartphones" (Fierce Electronics, Feb 28, 2023)]

You'd have to imagine this is us. I had a look at the SynSense CEO's LinkedIn and she still likes posts from Prophesee and Luca Verre. But if SynSense met their needs, there's no way that Prophesee would have announced the partnership with Brainchip and spoken so highly of it in the podcast. The only thing that still gets me is that in Luca Verre's last tinyML podcast about a month ago, he stated they were using Intel from a research perspective, "and upcoming in an industry grade side with Synsense", which I took to mean that they had a product ready for market with SynSense, given the wording "industry grade".
 

BaconLover

Founding Member
Qualcomm are ruthless & negotiate the lowest royalty. They have done the same with ARM.
That may be the reason why Apple is trying to move away from Qualcomm and build their own.

I hope when the time comes the BRN sales team don't take a backward step, and apply the same ruthlessness.
If Qualcomm doesn't get Akida IP, it's their loss, so hopefully our sales team stand their ground.
 

Bravo

If ARM was an arm, BRN would be its biceps💪!


Mercedes-Benz previews new MB.OS operating system

Mercedes-Benz's new chip-to-cloud architecture is due to launch in 2024, first appearing in the third-generation CLA.

Greg Kable | 10:14, 23 February 2023




Mercedes-Benz has provided the first details of its proprietary MB.OS operating system that is planned to be launched on new electric and petrol and diesel-engined models based on the German car maker's upcoming Mercedes Modular Architecture platform from late next year.
The new chip-to-cloud architecture will first appear on the new third-generation CLA, providing what Mercedes-Benz CEO, Ola Källenius, describes as “exceptional software capabilities”, including access to new embedded Google-based maps and services.
“We made the decision to be the architects of our own operating system. By combining this in-house expertise with a selection of partners, we will create an outstanding customer experience, from driving assistance, navigation and entertainment to integrated charging.

"MB.OS will feature full upgradeability and constant improvements,” said Källenius at the unveiling of the new system at Mercedes-Benz’s Silicon Valley-based R&D centre in Sunnyvale, California.
[Image: Mercedes-Benz CEO Ola Källenius]
While Mercedes-Benz has conceived, designed and will produce the new architecture, its ability to integrate applications from third parties will allow customers to choose their own services, content and functions.
The partnership with Google will see Mercedes-Benz become the first car maker to develop its own branded navigation based on in-car data and navigation capabilities from the Google Maps Platform, says Källenius.

“This will give Mercedes-Benz access to Google’s real-time and predictive traffic information, automatic re-routing and more,” he says.
Additionally, Mercedes-Benz says the MB.OS will provide access to Place Details provided by Google. Included are details such as business hours, photos, ratings and reviews of over 200 million businesses and places worldwide.
As well as partnering with Google, Mercedes-Benz confirms it is pursuing further collaborations with leading software and hardware companies in a move it says will ensure efficient development and rapid scaling of MB.OS.


Among the key development aims of the new operating system is improved smartphone mirroring and region-specific content, including music, video, gaming and office applications.
Offering full integration with electric drive systems, MB.OS is also claimed to provide more accurate range and energy usage predictions for electric models.
Mercedes-Benz also says MB.OS will play a crucial role in the planned rollout of new Level 2 autonomous driving functions tailored to urban driving, as well as a new Level 3 autonomous driving function capable of operating at speeds to 130km/h - both featuring new machine learning capabilities.

Among the partners for the new autonomous driving functions is NVIDIA, which Mercedes-Benz says provides its software, data and artificial intelligence expertise as well as its Orin system-on-chip hardware. It collaborates with Luminar for new light detection and ranging (LiDAR) sensors.
A partnership with YouTube will allow drivers to view video content on the infotainment while using the Level 3 Drive Pilot autonomous driving system in countries where it is permitted.
[Image: The current-generation Mercedes-Benz CLA is due to be replaced in 2024]
Further collaborations have been forged with Antstream for in-car gaming as well as Webex and Zoom for in-car video conferencing, says Mercedes-Benz.

The new MB.OS operating system will be able to share data across a network with 5G over-the-air (OTA) functionality, providing scope for software upgrades and new features from selected partners to be offered to customers.
Mercedes-Benz says it expects MB.OS to provide substantial annual revenue streams “in the high single digit billion Euro figure” by the end of the decade.

 

Bravo

If ARM was an arm, BRN would be its biceps💪!
SOCIONEXT, BRAINCHIP & PROPHESEE THOUGHTS.

We know the following facts:

1. Socionext publicly stated back in 2020 that they looked forward to assisting Brainchip to commercialise the full range of AKIDA technology products;

2. Socionext have presented AKD1000 engineering samples at trade shows and as recently as CES 2023 were promoting AKIDA-based products for automotive;

3. Socionext have products across many areas but in particular cameras and vision sensors including for mobile phones:


4. Brainchip & Prophesee have a clear ongoing relationship which covers ground already occupied by Socionext, including camera/vision technology for mobile phones;

5. Neither Brainchip nor Prophesee runs an exclusive sales model, so both will sell their technology to all takers. As such, a Prophesee deal with Qualcomm does not exclude Prophesee dealing with other camera/vision sensor makers; in fact, they are already partnered with Sony, so why not Socionext at some point?

My opinion only DYOR
FF

AKIDA BALLISTA
 

Steve10

Regular
Interesting that 20-30 cents per chip royalty is paid by Qualcomm to ARM.

Using a 25c per chip royalty paid to BRN for IP and working backwards, BRN would need royalties on 80 million chips per year to break even.

Qualcomm sell 650-700 million SoC's per year for smartphones. Top end models will initially have Prophesee tech possibly accounting for 10-20% of volume.

Feasible for BRN to break even anytime between Q4 CY2023 & Q1 CY2024. $5M per quarter / 25 cents = 20M chips per quarter.

Samsung, Oppo, Sony, Xiaomi, Huawei, LG & Vivo launch products in February/March. They will need stock manufactured prior to the launch.

In 2022, about 1.35 billion mobile phones were sold, with Apple dominating yearly sales at over 253 million units sold, taking up 24.8% market share.

Apple will be a big one with 253M units per year x 25c royalty = $63.25M revenue potential.

Global smartphone sales reached 1.43 billion units during 2021, showing signs of recovery following a considerable drop in sales in 2020 attributable to the COVID-19 pandemic. In the fourth quarter of 2021, Samsung sold nearly 69 million smartphones to end users worldwide.

Samsung also big with 69M units per year x 25c royalty = $17.25M revenue potential.

Total smartphone market appears to be 1.3-1.4B units per year x 25c = $325-350M per year TAM revenue.

All depends on whether royalty will be 25c per unit or more/less & percentage market share.
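As a quick sanity check of the arithmetic above, here is a minimal Python sketch; the 25c royalty, the $5M/quarter break-even hurdle and the unit volumes are the assumptions stated in this post, not company guidance.

```python
# Back-of-the-envelope check of the royalty numbers above (all figures assumed).
royalty_per_chip = 0.25        # USD per unit, assumed
quarterly_costs = 5_000_000    # USD per quarter, assumed break-even hurdle

chips_per_quarter = quarterly_costs / royalty_per_chip   # 20,000,000
chips_per_year = chips_per_quarter * 4                    # 80,000,000
print(f"Break-even: {chips_per_quarter:,.0f} chips/quarter ({chips_per_year:,.0f}/year)")

# Revenue potential at the same assumed royalty rate
for name, annual_units in [("Apple", 253_000_000),
                           ("Samsung", 69_000_000),
                           ("Total smartphone market", 1_350_000_000)]:
    print(f"{name}: ${annual_units * royalty_per_chip / 1e6:,.2f}M per year")
```

On those assumptions the totals match the figures above: $63.25M for Apple, $17.25M for Samsung and roughly $337M across the whole market.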

SynSense with Speck will also be targeting this market.

iniVation & Prophesee are competitors. iniVation is using SynSense & Prophesee was using SynSense until BRN came along.

Qualcomm partnered with Prophesee & not iniVation.
 

chapman89

Founding Member
[Quoting @Steve10's royalty and break-even analysis above]
Hi @Steve10

As great as that would be, I just don’t think we will be in this run on phones with Prophesee/Qualcomm.

We all value @Diogenese's expertise in breaking down patents and whatnot for us, and he has said that, from memory, Qualcomm are using their own AI accelerator, "Hexagon"; now whether he is right or wrong, time will tell!
I hope he is very wrong, as I'm sure we all do, but Prophesee and/or Qualcomm haven't signed with us, and it's not going through Renesas or MegaChips, because I'm sure I read that Sony was going to be the chip maker.

My brain keeps reminding me of the podcast between Rob Telson and Luca and how Rob said words to the effect of "where Prophesee's technology ends up is where Brainchip's technology will end up".

And also how Luca said they could only tell half a story to their customers until they came across Brainchip, so if judged on that podcast alone it would seem that we are going to be inside mobile phones next year!

Time will tell but one thing is for certain, it’s a matter of when, not if!

My opinion only!
 

Bravo

If ARM was an arm, BRN would be its biceps💪!
Well Luminar was a bit left field.

I have had Luminar on my competitors list because we did not have any dots, but @Stable Genius caused me to revisit their patents.

Luminar have a large number of LiDaR related patents, just shy of 100.

This one caught my eye and initiated a little synaptic frisson.

US2018284234A1 Foveated Imaging in a Lidar System





To identify the most important areas in front of a vehicle for avoiding collisions, a lidar system obtains a foveated imaging model. The foveated imaging model is generated by detecting the direction at which drivers' are facing at various points in time for several scenarios based on road conditions or upcoming maneuvers. The lidar system identifies an upcoming maneuver for the vehicle or a road condition and applies the identified maneuver or road condition to the foveated imaging model to identify a region of a field of regard at which to increase the resolution. The lidar system then increases the resolution at the identified region by increasing the pulse rate for transmitting light pulses within the identified region, filtering pixels outside of the identified region, or in any other suitable manner.

"Goodness me!*" I hear you exclaim "What is foveated imaging?"

Foveated imaging - Wikipedia


Foveated imaging is a digital image processing technique in which the image resolution, or amount of detail, varies across the image according to one or more "fixation points". A fixation point indicates the highest resolution region of the image and corresponds to the center of the eye's retina, the fovea.

In LiDaR, foveated means concentrating more light spots on a region of interest.
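As a rough illustration of that idea only (this is not Luminar's implementation; the grid size and the 4x weighting are made-up numbers), foveation amounts to re-weighting a fixed scan budget toward a region of interest:

```python
# Illustrative sketch of foveated scanning: spend more of a fixed pulse budget
# inside a region of interest (ROI) and less elsewhere. All numbers are assumed.
import numpy as np

def foveated_density(width: int, height: int, roi: tuple, fovea_weight: float = 4.0):
    """Return a per-cell sampling-density map; roi is (x0, y0, x1, y1)."""
    density = np.ones((height, width))
    x0, y0, x1, y1 = roi
    density[y0:y1, x0:x1] = fovea_weight   # more lidar pulses land here
    return density / density.sum()         # normalise to a fixed total budget

# e.g. concentrate pulses toward the area an upcoming manoeuvre makes important
dmap = foveated_density(64, 48, roi=(40, 16, 60, 32))
print(f"Share of the scan budget inside the ROI: {dmap[16:32, 40:60].sum():.0%}")
```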

So why is this interesting?

In the fireside chat, PvdM mentioned that the eye has a central region which is more high definition and attuned to movement than the peripheral region which is lower definition but more sensitive to light variation (the very same fovea that Luminar's patent seeks to imitate)

So does this mean Luminar and BrainChip are an item? Well, no, but ...

* Archaic version of "WTF".

Product recall: - It has been pointed out to me by a poster that the reference to PvdM discussing high definition and lower definition peripheral regions does not exist in the fireside chat. I cannot recall the origin of the information, so the paragraph verballing Peter should be ignored, as my memory is unreliable.

All the best
DodgyNews

Here's one for you Dodgy Knees @Diogenese !

Research Article Posted Date: October 6th, 2022


[Screenshots of the research article attached]
 

Steve10

Regular
[Quoting @chapman89's reply to @Steve10 above]

It was mentioned by BRN that AKD1500 is for high volume applications. Smartphone annual unit sales are 1.3-1.4B.

Qualcomm annual SoC unit sales are 650-700M.
 

Diogenese

Top 20
[Quoting @chapman89's reply to @Steve10 above]

Make that time or the Snapdragon 8.2 product Brief for those who don't have the time.

[Screenshot of the Snapdragon product brief attached]
 

Steve10

Regular
Make that time or the Snapdragon 8.2 product Brief for those who don't have the time.


The specs will have to be updated to include the event-based camera. Maybe the Akida IP will go into the ISP or into the Hexagon AI processor?
 

skutza

Regular
Again I apologise in advance to the twin holders of Weebit. BUT WHERE ARE ALL THE ARTICLES ABOUT WBT BEING $1.35 Billion MC with ZERO revenue?

Can ASIC not see that the shorts on WBT are roughly 200 shares? Hence no manipulation of the SP, and no articles being constantly written about zero revenue etc.

This is beyond a friggin joke. And no I'm not bitter at all ROFL!!!

[Screenshot of WBT short position data attached]
 

Diogenese

Top 20
The specs will have to be updated to include the event-based camera. Maybe the Akida IP will go into the ISP or into the Hexagon AI processor?
Got a circuit diagram for that?
 