We know Quadric. They are part-owned by MegaChips, who have a deal with us. So if Quadric can start using Akida IP, this will be huge.

The OPPO article mentions production in H2 CY2023 for a launch next year, most likely February/March when they launch new models. And they will launch their own SoC.
So if any new smartphones next year are to have Akida IP, they will have to be manufactured in H2 CY2023.
The Vivo X90, launched on 3 February 2023, was the first phone to use the Snapdragon 8 Gen 2 SoC. A new Vivo model next year could be the first phone to have Prophesee tech. One to keep an eye on for specs.
OPPO to Launch its Own SoC in 2024: Here's What We Know
TECHNOLOGY
By Sidharth Joseph Last updated Feb 22, 2023
Chinese smartphone manufacturer OPPO is planning to introduce the company's own self-developed SoC in 2024. This in-house chipset will make the brand much more independent and will give it an extra advantage over its competitors in terms of performance and pricing.
Reports on the Chinese microblogging platform Weibo reveal that OPPO has already started work on its own smartphone chipset, which the company would release in 2024. The chipset is going to use a 4nm manufacturing process and is expected to be made by TSMC. It is also expected to be compatible with 5G smartphones.
It was from a MediaTek executive that we first received the information about OPPO's self-made SoC, and later an OPPO insider also revealed that the company had already started work. Reports also reveal that the company has invested about 1.4 billion in the research and development of its chipset.
OPPO's first custom-made chipset, the MariSilicon X, was released in 2021 and was a 6nm imaging chipset. The company also has a connectivity chipset called the MariSilicon Y. All this indicates that, going forward, the brand is trying to depend less on leading SoC manufacturers like MediaTek and Qualcomm and rely more on its own self-developed chipsets.
Leading smartphone manufacturers like Samsung, Apple and Google already have their own chipsets, and with OPPO following the same trend, it will definitely enable the company to achieve independence and come to the forefront of the smartphone market.
https://www.thetechoutlook.com/news/technology/oppo-to-launch-its-own-soc-in-2024-heres-what-we-know/#:~:text=Chinese%20smartphone%20manufacturer%2C%20OPPO%20is,and%20pricing%20with%20its%20competitors.
Debayan Roy (Gadgetsdata)
@Gadgetsdata
Oppo's self-developed smartphone SoC is expected to arrive next year, in 2024.
• It is a 4nm SoC and supports 5G.
• The design process of this SoC is complete and it is ready to be sent to manufacturing in the 2nd half of 2023.
• It might be manufactured by TSMC.
Via: Weibo
Another company to keep an eye on is MediaTek.
As the industry leader in developing powerful, highly integrated and efficient system-on-chip products, MediaTek is enabling the future of AI by creating an ecosystem of edge-AI hardware processing paired with comprehensive software tools across its product range, from smartphones to smart homes, wearables, IoT and connected cars.
MediaTek NeuroPilot
We're meeting the edge-AI challenge head-on with MediaTek NeuroPilot. Through the heterogeneous computing capabilities we embed into our system-on-chip products, such as CPUs, GPUs and APUs (AI processing units), we are providing high performance and power efficiency for AI features and applications. Developers can target these specific processing units within the system-on-chip, or they can let the MediaTek NeuroPilot SDK intelligently handle the processing allocation for them.
Artificial Intelligence | Edge-AI Technology
MediaTek leads the pack with its edge-AI technology solutions with respect to Artificial Intelligence. Try out some powerful, highly integrated SoC products. (www.mediatek.com)
Many companies are becoming active in edge AI. Hopefully, BRN's Akida will become a high-volume building block for the majority of devices in future.
Osram recently partnered with MegaChips' other AI partner, Quadric, and will be using Quadric's Chimera general-purpose neural processor. It appears this application may be better suited to Quadric than to Akida. Or they don't know about Akida yet.
Quadric, Ams Osram to develop smart sensing solutions for edge-based applications
Jan 11, 2023 | Abhishek Jadhav
Quadric, an edge AI chip provider, has partnered with Ams Osram, an optical solutions provider, to create a smart sensing solution for edge computing applications. The partnership will combine Ams Osram's Mira family of CMOS sensors and Quadric's Chimera general-purpose neural processors. Leveraging their respective strengths, Quadric says it has developed a low-power smart sensing module that combines image capturing and machine learning capabilities.
The Ams Osram Mira220 CMOS image sensor is designed for use in 2D and 3D consumer and industrial machine vision applications. With the intention of further enhancing the sensor and releasing higher-resolution versions, Quadric is integrating future models of the CMOS sensors with its Chimera processor lineup, which spans 1 to 16 TOPS of compute. Both companies showcased this sensing module at CES 2023.
The company says that the Mira CMOS image sensor family provides maximum resolution in a compact form factor while minimizing power consumption. By incorporating new modules within the Mira family, the company says it can provide a broad selection of resolutions that cater to different applications needing high performance and energy efficiency.
"The combination of Ams Osram's sensors and Quadric's processing into a single low-power module opens up vast new possibilities for the deployment of smart vision sensing," said Joost Seijnaeve, the vice president and general manager of CMOS Image Sensors for Ams Osram.
The Quadric Chimera general-purpose neural processor has a unified hardware and software architecture optimized for on-device AI computing. This architecture enables the processor to execute matrix and vector operations, as well as scalar code, within a single execution pipeline. The processor's design combines a neural processor, digital signal processor and real-time CPU into an individual programmable core. This means that it is exceptionally proficient in dealing with multiple tasks, the company says.
"Quadric is excited to be joining forces with Ams Osram," said Veer Kheterpal, the CEO at Quadric. "Empowering device makers with the capability for a fully-programmable smart sensing device at incredibly low power levels will open a vast new tranche of deployments of machine learning in edge devices."
In March of 2022, Quadric raised $21 million in a Series B funding round led by mobility supplier Denso's NSITEXE subsidiary. The company stated the funding would, among other things, accelerate advancements in its next-generation processor architecture.
Quadric, Ams Osram to develop smart sensing solutions for edge-based applications | Edge Industry Review
The partnership will combine Ams Osram's Mira family of CMOS sensors and Quadric's Chimera general-purpose neural processors. (www.edgeir.com)
Why would Quadric do that?

"So if Quadric can start using Akida IP, this will be huge."
Whilst a great article, it was published in January 2022.

From the 18-page Mercedes-Benz EQXX pdf which is part of @Pmel's post above:
"Neuromorphic computing – a car that thinks like you
Another key efficiency feature of the VISION EQXX that takes its cue from nature is the way it thinks. It uses an innovative form of information processing called neuromorphic computing. The hardware runs so-called spiking neural networks. Information is coded in discrete spikes and energy is only consumed when a spike occurs, which reduces energy consumption by orders of magnitude.
Working with California-based artificial intelligence experts BrainChip, Mercedes-Benz engineers developed systems based on BrainChip's Akida hardware and software. The example in the VISION EQXX is the 'Hey Mercedes' hot-word detection. Structured along neuromorphic principles, it is five to ten times more efficient than conventional voice control.
Although neuromorphic computing is still in its infancy, systems like these will be available on the market in just a few years. When applied on scale throughout a vehicle, they have the potential to radically reduce the energy needed to run the latest AI technologies."
Locked away in a pdf document and made available for ongoing public consumption. Brainchip gives all the appearance of being part of the furniture at Mercedes Benz.
My opinion only DYOR
FF
AKIDA BALLISTA
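The quoted passage claims energy is only consumed when a spike occurs. As a rough, hypothetical illustration of that idea (a toy NumPy sketch with made-up layer sizes and spike rate, not BrainChip's actual implementation), compare the work done by a conventional dense layer against an event-driven one that only processes inputs that actually spiked:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy layer: 256 inputs feeding 128 outputs (sizes are illustrative only).
n_in, n_out = 256, 128
weights = rng.normal(size=(n_in, n_out))

# Sparse activity: only ~5% of inputs "spike" in this timestep.
spikes = rng.random(n_in) < 0.05

# Conventional dense layer: every input costs a multiply-accumulate per output.
dense_ops = n_in * n_out

# Event-driven layer: work is done only for the inputs that spiked.
active = np.flatnonzero(spikes)
event_ops = active.size * n_out
output = weights[active].sum(axis=0)  # accumulate weight rows of spiking inputs only

print(f"dense MACs: {dense_ops}, event-driven accumulates: {event_ops}")
print(f"reduction: {dense_ops / max(event_ops, 1):.1f}x")
```

With only a few percent of inputs active, the event-driven path does over an order of magnitude less work, which is the general intuition behind the "orders of magnitude" energy claim for sparse spiking activity.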
I also liked this bit.. (the same EQXX passage quoted in full in the post above).
Your logic (and persistence) is starting to win me over to thinking there could be something in this.

Prophesee already have the Metavision sensor from Sony. The Metavision sensor will be installed in phones with Qualcomm's Snapdragon SoC.
Qualcomm's SoC will require a processor for the Prophesee Metavision sensor. It will either be an in-house Qualcomm processor, as has been mentioned here, and/or SynSense (unlikely) or Akida (likely).
Another way around it is for the foundry to have the licence agreement.
"Whilst a great article, it was published in January 2022."

And herein lies the point. On a public Mercedes-Benz social media site, on 3 March 2023, Mercedes-Benz says:
Some more to think about.

"Your logic (and persistence) is starting to win me over to thinking there could be something in this."
There is absolutely no comparison between Tesla autonomous/FSD and a Mercedes… Tesla are light years ahead; you can find a heap of comparisons on YouTube. One difference is Mercedes only works on pre-mapped roads, Tesla on any road. Tesla do have a way to go before FSD is the real deal, but they are light years ahead of all the competition with autonomous.
Will Mercedes mandate the use of Akida with Luminar?
Tesla one level BEHIND then!!!
Tesla admits its semi-autonomous driving tech is not the world's most advanced
Tesla has admitted its controversial "Full Self-Driving" autonomous driving technology is less capable than systems developed by its rival car-makers. (www.drive.com.au)
And that is why anyone selling this bad boy is crazy as F!

And herein lies the point. On a public Mercedes-Benz social media site, on 3 March 2023, Mercedes-Benz says:
"Did you know the aim for the VISION EQXX was not to build yet another show car? The actual mission was to develop a technology programme that would bring innovative solutions into series production faster than ever. Learn more: http://mb4.me/the-vision-eqxx"
When you click on the link it takes the public to the pdf containing these quotes.
You do not have to be Alex the Rocket Scientist, as Blind Freddie says, to understand that this means Mercedes-Benz is adopting this earlier article as a current and correct statement of the facts contained therein.
In other words: what we said then is still current and is the programme we are following to bring innovative solutions into series production.
It could not be clearer: BrainChip and Mercedes-Benz are still engaged together for the purpose of bringing neuromorphic computing solutions into series production.
My opinion only DYOR
FF
AKIDA BALLISTA