BRN Discussion Ongoing

7für7

Top 20
Germany: Three days in a row our SP has been GREEN!!!

Party Dancing GIF by Florida Georgia Line
Actually it’s almost 3% 😂😂😂😂
 
  • Like
Reactions: 2 users
Selling for tax purposes? You'd have to be crazy to sell now as there's still another month to go before the end of the financial year ... and I believe there is a strong possibility an announcement is imminent. What's the point of selling now only to watch the share take off before 30 June?
Yeah I hope so, sky’s the limit, onwards and upwards.
 
7für7 said:
Mate….I know people whose grandparents also run a business in which they specialized. That doesn't mean they (the grandchildren) automatically know everything. The media landscape is constantly changing. Camera work, which is meant to evoke a certain drama, is also evolving. Imagine if films were still shot the same way they were in your grandparents' time… I come from this field as well, so relax… my great-great-great-grandparents were already doing theater, and my ancestors invented drama and comedy…so!? 🤷🏻‍♂️🤦🏻‍♂️

Reply,

Respect is earnt, not given.

I have taken a few days off, as it’s in my blood to get very angry at disrespectful people like yourself, 7fur7.

As you started all this, I’ll finish it with this note:

I have spent my whole life in TV and film production. I’m 63 now and had my first job at 16, so you do the maths. You took my comment that it’s in my blood out of context and proceeded to be a smart arse, trying to insult me with your rambling comments above and accusing me of being the grandkid running off his grandparents’ name without knowing what I am talking about.

You don’t have a clue, but you chose the path of insult.
I have attached the standard video presentation formats professional producers use, and the reasons they are the standard requirements for engaging with an audience in a professional manner. The documentaries / non-fiction films you quickly cut and pasted, for no reason other than (IMO) to be a smart arse, and which you think prove your point, are just plain wrong and irrelevant to this format.

By the way, I have sent this on to the stockbroker video team so they take more care next time around when Sean is trying to work out which camera to look at. It was very unprofessional, and as a shareholder it’s relevant to me that this gets attention.

My apologies to everyone for this continued discussion; however, this is relevant and needs to be cleared up.

I won’t accept disrespect from someone who thinks he knows everything and tries to belittle people to make himself look good, IMO.

I have said my piece now and that’s all I have to say moving forward.

See below.
 

Attachments

  • C0E4D443-BBEB-433A-AD2B-0072991EE707.jpeg (350.3 KB)
  • Like
  • Fire
  • Love
Reactions: 8 users

HopalongPetrovski

I'm Spartacus!
  • Haha
Reactions: 2 users
7für7 said:
Mate….I know people whose grandparents also run a business in which they specialized. […]

Respect is earnt, not given. […]

I have said my piece now and that’s all I have to say moving forward.

See below.
For heaven’s sake smooth sailing, do what I did months ago and put 7fur7 on ignore.
 
  • Like
  • Haha
  • Fire
Reactions: 7 users

zeeb0t

Administrator
Staff member
Unsolicitation alert ... but if I can't ask the BRN fam for help, who can I ask?

I have launched a new product today on a special website dedicated to tech product launches. It may even interest you personally (feel free to visit the website and sign up to the free beta if it does!), but if all of you could wander on over to https://www.producthunt.com/posts/hellocaller-ai and vote for my launch, I would be most appreciative.

Season 3 Smiling GIF by The Simpsons
 
  • Like
  • Love
  • Thinking
Reactions: 32 users

7für7

Top 20
zeeb0t said:
Unsolicitation alert ... but if I can't ask the BRN fam for help, who can I ask? […]
The BRN community right now:

For zeeb0t

1716809502880.gif
 
  • Haha
  • Like
  • Love
Reactions: 9 users

Frangipani

Regular
Has anyone else stumbled upon this 3-year EU-funded research project called Nimble AI, kick-started in November 2022, that “aims to unlock the potential of neuromorphic vision”? I couldn’t find anything here on TSE with the help of the search function except a reference to US-based company Nimble Robotics, but they seem totally unrelated.

The 19 project partners include imec in Leuven (Belgium) as well as Paris-based GrAI Matter Labs, highly likely Brainchip’s most serious competitor, according to other posters.

An article about Nimble AI’s ambitious project was published today:

What do you make of the consortium’s claim that their 3D neuromorphic vision chip will have more than an edge over Akida once it is ready to hit the market? 🤔


NimbleAI: Ultra-Energy Efficient and Secure Neuromorphic Sensing and Processing at the Endpoint​

“Today only very light AI processing tasks are executed in ubiquitous IoT endpoint devices, where sensor data are generated and access to energy is usually constrained. However, this approach is not scalable and results in high penalties in terms of security, privacy, cost, energy consumption, and latency as data need to travel from endpoint devices to remote processing systems such as data centres. Inefficiencies are especially evident in energy consumption.
To keep pace with the exponentially growing amount of data (e.g. video) and allow more advanced, accurate, safe and timely interactions with the surrounding environment, next-generation endpoint devices will need to run AI algorithms (e.g. computer vision) and other compute-intense tasks with very low latency (i.e. units of ms or less) and energy envelopes (i.e. tens of mW or less).
NimbleAI will harness the latest advances in microelectronics and integrated circuit technology to create an integral neuromorphic sensing-processing solution to efficiently run accurate and diverse computer vision algorithms in resource- and area-constrained chips destined to endpoint devices. Biology will be a major source of inspiration in NimbleAI, especially with a focus to reproduce adaptivity and experience-induced plasticity that allow biological structures to continuously become more efficient in processing dynamic visual stimuli.
NimbleAI is expected to allow significant improvements compared to state-of-the-art (e.g. commercially available neuromorphic chips), and at least 100x improvement in energy efficiency and 50x shorter latency compared to state-of-the-practice (e.g. CPU/GPU/NPU/TPUs processing frame-based video). NimbleAI will also take a holistic approach for ensuring safety and security at different architecture levels, including silicon level.”


What I find a little odd, though, is that this claim regarding expected superiority over “state-of-the-art (e.g. commercially available neuromorphic chips)” doesn’t get any mention on the official Nimble AI website (https://www.nimbleai.eu/), in contrast to the expectation of “at least 100x improvement in energy efficiency and 50x shorter latency compared to state-of-the-practice (e.g. CPU/GPU/NPU/TPUs processing frame-based video).”
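
To put those quoted targets into rough numbers, here is a minimal back-of-the-envelope sketch in Python. The baseline power, latency and processing rate below are hypothetical placeholders I picked for illustration; only the 100x / 50x factors and the “tens of mW / units of ms” targets come from the project description.

# Back-of-the-envelope check of the quoted NimbleAI targets.
# The baseline numbers are hypothetical assumptions, not figures from the project.
baseline_power_w = 1.0        # assumed frame-based CPU/GPU pipeline power draw
baseline_latency_s = 0.050    # assumed 50 ms per frame-based inference
inferences_per_s = 30         # assumed processing rate (roughly video frame rate)

energy_per_inference_j = baseline_power_w * baseline_latency_s   # 50 mJ per inference
target_energy_j = energy_per_inference_j / 100                   # 100x efficiency -> 0.5 mJ
target_latency_s = baseline_latency_s / 50                       # 50x shorter -> 1 ms
target_avg_power_w = target_energy_j * inferences_per_s          # about 15 mW average

print(f"energy per inference: {target_energy_j * 1e3:.2f} mJ")
print(f"latency:              {target_latency_s * 1e3:.1f} ms")
print(f"average power:        {target_avg_power_w * 1e3:.0f} mW")

Under those assumed baseline figures, the claimed factors land in the “units of ms” and “tens of mW” ranges the consortium quotes, so the two sets of numbers are at least internally consistent.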


A few months back, I shared an article featuring Brainchip in a project called Nimble AI. This isn't a small PhD project; Nimble AI has 19 project partners across Europe with €10 million funding from both the EU and UK governments.

While keeping tabs on Nimble, I haven't found any further mention of Brainchip or any more media releases. However, I did notice the project coordinator of Nimble AI liking a Brainchip-related post on LinkedIn.

View attachment 59295

Have a good dig into their website. It's interesting stuff. https://www.nimbleai.eu/

For the tech heads, there's a scientific paper discussing how SNNs integrate and operate within the chip stack. While it doesn't explicitly mention Brainchip, it predates the article referencing Brainchip, suggesting that Brainchip might have been incorporated later on. DATE23_nimbleai.pdf

Take a look at the project partners and their respective roles. There are some heavyweight companies and contributors involved, hopefully providing exposure to Brainchip. https://www.nimbleai.eu/consortium/

Also worth noting: Xabier Iturbe has taken on a second, new role as coordinator of the Spanish Association of the Semiconductor Industry's newly formed working group for neuromorphic tech.


Found a slightly updated illustration and project description of the Nimble AI neuromorphic 3D vision prototype, inspired by the operation of an insect’s brain, that will use the AKD1500 as a neuromorphic processor to perform 3D perception inference. This will be benchmarked against a non-neuromorphic Edge AI processor by Hailo.

The EU-funded project, kicked off in November 2022, is about half-way through its three-year duration.



0D2AE138-4AA1-4154-BDC9-0258076E1FCA.jpeg





48629092-28C1-4280-A55B-B68F8E836F7D.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 45 users

Frangipani

Regular
Hi FJ-215,

The article you linked to refers to a different Fraunhofer Institute, Fraunhofer IPMS in Dresden, whereas the Fraunhofer Institute shown in the video is Fraunhofer HHI (Heinrich-Hertz-Institut) in Berlin. (There are 76 Fraunhofer Institutes in total.)

At the very end of the video, there is a reference to a research paper that I posted about a few weeks ago:

View attachment 63664



https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-417987

View attachment 63666
View attachment 63667



View attachment 63665

And thanks to the video we now know what neuromorphic hardware the researchers used, even though they didn’t reveal it in their paper! 😍

Forgot to mention:

That research paper’s future outlook…


AB45047B-AC01-4C69-872A-2E3E39737FE1.jpeg




… ties in nicely with this job ad the Fraunhofer Heinrich-Hertz-Institut had published in November, looking for “several student assistants to support research projects on neuromorphic signal processing in the field of (medical) sensory applications”:

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-399275

F1F41CCF-3B63-479C-B993-050BDB860F4D.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 32 users

goodvibes

Regular
Unsupervised Neuromorphic Motion Segmentation


Western Sydney University unveils a novel unsupervised event-based motion segmentation algorithm, employing the #Prophesee Gen4 HD event camera. Source code has been announced but not yet released.

𝐇𝐢𝐠𝐡𝐥𝐢𝐠𝐡𝐭𝐬:
✅Unsupervised segmentation of moving objects
✅Dynamic mask refinement, appearance from DINO
✅Ev-Airborne: HD data w/ ground truth annotations
✅Superior segmentation performance on major benchmarks

#artificialintelligence #machinelearning #ml #AI #deeplearning #computervision #AIwithPapers #metaverse

👉Discussion https://lnkd.in/dMgakzWm
👉Paper https://lnkd.in/dCxfDFFK
👉Project https://lnkd.in/d4dxcNMT
👉Repo (empty) https://lnkd.in/dbTBZArg
 
  • Like
  • Love
  • Thinking
Reactions: 12 users

manny100

Regular
But are GPUs "AI accelerators"?

Anyway, what would TSMC know about AI accelerators?

We are not alone:

https://www.synsense.ai/synsense-ad...ic-audio-processing-with-xyloaudio-3-tapeout/

SynSense advances ultra-low-power neuromorphic audio processing with Xylo™Audio 3 tapeout​

2023-07-07
By SynSense

SynSense, the world’s leading commercial supplier of ultra-low-power neuromorphic hardware and application solutions, has completed the tapeout of Xylo™Audio 3, their advanced ultra-low-power audio processing platform built on the neuromorphic inference core Xylo™. Xylo™Audio 3 is based on the TSMC 40nm CMOS LOGIC Low Power process, delivering real-time, ultra-low-power audio signal processing capabilities while reducing chip costs. This tapeout marks a milestone for the commercialization of SynSense’s neuromorphic audio processing technology.
SynSense now have a chip with learning capabilities.
How do they compare to AKIDA? Are they full-on competition?
Is this the reason we are flogging the AKIDA Gen 2 TENNS combination, because we have real AKIDA-only competition?
They appear to be analogue or CNN only.
Seems they have near-cloud capabilities but are not cloudless like AKIDA.
 
  • Like
  • Fire
Reactions: 9 users

manny100

Regular
But are GPUs "AI accelerators"? […]

SynSense advances ultra-low-power neuromorphic audio processing with Xylo™Audio 3 tapeout […]
The reason I asked the earlier question is mainly that, when it comes to public or workplace safety, any delays are unacceptable from an OH&S perspective, as they could lead to injuries. Near-cloud capabilities will not be good enough.
So anything from motor vehicle safety to industrial safety or health uses will require a real-time, uninterrupted service.
Can SynSense or other companies offer this?
 
  • Like
  • Fire
Reactions: 5 users
Not watched either



 
  • Like
  • Fire
  • Thinking
Reactions: 5 users
zeeb0t said:
Unsolicitation alert ... but if I can't ask the BRN fam for help, who can I ask? […]
Will it block her indoors calls?
 
  • Haha
  • Love
Reactions: 6 users

zeeb0t

Administrator
Staff member
  • Haha
Reactions: 5 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Here's a sample of the report "AI Chips for Edge Applications 2024-2034". The full report costs about $11,000. What a bargain! Who wants to "chip" in for a copy?

BTW, we are listed in the Hardware Start-Up and New Players diagram.🥳






Annual revenue generated by AI Chips for edge devices is set to exceed US$22 billion by 2034.
AI Chips for Edge Applications 2024-2034: Artificial Intelligence at the Edge

AI Chips for Edge Applications 2024-2034: Artificial Intelligence at the Edge​

Technology analyses and market forecasts for the global sale of AI chips for edge applications by geography, architecture, packaging, end-user, application, and industry vertical.​




The global AI chips market for edge devices will grow to US$22.0 billion by 2034, with the three largest industry verticals at that time being Consumer Electronics, Industrial, and Automotive. Artificial Intelligence (AI) is already displaying significant transformative potential across a number of different applications, from fraud detection in high-frequency trading to the use of generative AI (such as the likes of ChatGPT) as a significant time-saver for the preparation of written documentation, as well as a creative prompt. While the use of semiconductor chips with neural network architectures (these architectures being especially well-equipped in handling machine learning workloads, machine learning being an integral facet to functioning AI) is prevalent within data centers, it is at the edge where significant opportunity for adoption of AI lies. The benefits to end-users of providing a greater array of functionalities to edge devices, as well as - in certain applications - being able to fully outsource human-hours to intelligent systems, is significant. AI has already found its way into the flagship smartphones of the world's leading designers, and is set to be rolled out across a number of different devices, from automotive vehicles to smart appliances in the home.

Following a period of dedicated research by expert analysts, IDTechEx has published a report that offers unique insights into the global edge AI chip technology landscape and corresponding markets. The report contains a comprehensive analysis of 23 players involved with AI chip design for edge devices, as well as a detailed assessment of technology innovations and market dynamics. The market analysis and forecasts focus on total revenue (where this corresponds to the revenue that can be attributed to the specific neural network architecture included in sold chips/chipsets that is responsible for handling machine learning workloads), with granular forecasts that are segmented by geography (APAC, Europe, North America, and Rest of World), type of buyer (consumer and enterprise), chip architecture (GPU, CPU, ASIC, DSP, and FPGA), packaging type (System-on-Chip, Multi-Chip Module, and 2.5D+), application (language, computer vision, and predictive), and industry vertical (industrial, healthcare, automotive, retail, media & advertising, consumer electronics, and others).

The report presents an unbiased analysis of primary data gathered via our interviews with key players, and it builds on our expertise in the semiconductor, computing and electronics sectors.

This research delivers valuable insights for:
  • Companies that require AI-capable hardware.
  • Companies that design/manufacture AI chips and/or AI-capable embedded systems.
  • Companies that supply components used in AI-capable embedded systems.
  • Companies that invest in AI and/or semiconductor design, manufacture, and packaging.
  • Companies that develop devices that may require AI functionality.

87.png

Computing can be segmented with regards to the different environments, designated by where computation takes place within the network (i.e. within the cloud or at the edge of the network). This report covers the consumer edge and enterprise edge environments. Source: IDTechEx

Artificial Intelligence at the Edge
The differentiation between edge and cloud computing environments is not a trivial one, as each environment has its own requirements and capabilities. An edge computing environment is one in which computations are performed on a device - usually the same device on which the data is created - that is at the edge of the network (and, therefore, close to the user). This contrasts with cloud or data center computing, which is at the center of the network. Such edge devices include cars, cameras, laptops, mobile phones, autonomous vehicles, etc. In all of these instances, computation is carried out close to the user, at the edge of the network where the data is located. Given this definition of edge computing, edge AI is therefore the deployment of AI applications at the edge of the network, in the types of devices listed above. The benefits of running AI applications on edge devices include not having to send data back and forth between the cloud and the edge device to carry out the computation; as such, edge devices running AI algorithms can make decisions quickly without needing a connection to the internet or the cloud. Given that many edge devices run on a power cell, AI chips used for such edge devices need to have lower power consumption than within data centers, in order to be able to run effectively on these devices. This results in typically simpler algorithms being deployed, that don't require as much power.
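
As a toy illustration of that latency argument (not from the report; every number below is a made-up assumption I chose for the sketch):

# Toy latency budget: cloud round-trip inference vs on-device (edge) inference.
# All values here are illustrative assumptions, not measurements.
cloud_network_rtt_s = 0.080   # assumed mobile-network round trip to a data centre
cloud_inference_s = 0.010     # assumed inference time on data-centre hardware
edge_inference_s = 0.020      # assumed slower, lower-power on-device inference

cloud_total_s = cloud_network_rtt_s + cloud_inference_s   # needs connectivity, data leaves the device
edge_total_s = edge_inference_s                           # works offline, data stays local

print(f"cloud path: {cloud_total_s * 1e3:.0f} ms   edge path: {edge_total_s * 1e3:.0f} ms")

Even with the edge chip assumed slower per inference, the end-to-end response can be faster and keeps working without a network connection, which is the trade-off the paragraph above describes.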

Edge devices can be split into two categories depending on who they are intended for; consumer devices are sold directly to end-users, and so are developed with end-user requirements in mind. Enterprise devices, on the other hand, are purchased by businesses or institutions, who may have different requirements to the end-user. Both types of edge devices are considered in the report.

82.png

The consumer electronics, industrial, and automotive industry verticals are expected to generate the most revenue for AI chips at the edge by 2034. Source: IDTechEx

AI: A crucial technology for an Internet of Things
AI's capabilities in natural language processing (understanding of textual data, not just from a linguistic perspective but also a contextual one), speech recognition (being able to decipher a spoken language and convert it to text in the same language, or convert to another language), recommendation (being able to send personalized adverts/suggestions to consumers based on their interactions with service items), reinforcement learning (being able to make predictions based on observations/exploration, such as is used when training agents to play a game), object detection, and image classification (being able to distinguish objects from an environment, and decide on what that object is) are such that AI can be applied to a number of different devices across industry verticals and thoroughly transform the ways in which human users interact with these devices. This can range from additional functionality that enhances user experience (such as in smartphones, smart televisions, personal computers, and tablets), to functionality that is inherently crucial to the technology (such as is the case for autonomous vehicles and industrial robots, which would simply not be able to function in the desired manner without the inclusion of AI).

The Smart Home in particular is a growing avenue for AI (which primarily comprises consumer electronics products), given that artificial intelligence (allowing for automation and hands-free access) and Wi-Fi connectivity are two key technologies for realizing an Internet of Things (IoT), where appliances can communicate directly with one another. Smart televisions, mirrors, virtual reality headsets, sensors, kitchen appliances, cleaning appliances, and safety systems are all devices that can be brought into a state of interconnectivity through the deployment of artificial intelligence and Wi-Fi, where AI allows for hands-free access and voice command over smart home devices. The opportunity afforded by bringing AI into the home is reflected somewhat by the growth of the consumer electronics vertical over the forecast period, with it being the industry that generates the most revenue for edge AI chips in 2034.

8A.png

The Edge AI chip landscape. Source: IDTechEx

The growth of AI at the edge
While the forecast presented in this report does predict substantial growth of AI at the edge over the next ten years - where global revenue is in excess of US$22 billion by 2034 - this growth is anything but steady. This is due to the saturation and stop-start nature of certain markets that have already employed AI architectures in their incumbent chipsets, and where rigorous testing is necessary prior to high volume rollout, respectively. For example, the smartphone market has already begun to saturate; though premiumization of smartphones continues (where the percentage share of total smartphones sold given over to premium smartphones is, year-on-year, increasing), where AI revenue increases as more premium smartphones are sold given that these smartphones incorporate AI coprocessing in their chipsets, it is expected that this will itself begin to saturate over the next ten years.

In contrast to this, two notable jumps in revenue on the forecast presented in the report are from 2024 to 2025, and 2026 to 2027. The first of these jumps can be largely attributed to the most cutting-edge ADAS (Advanced Driver-Assistance Systems) finding their way into car manufacturers' 2025 production line. The second jump is due in part to increased adoption of ADAS systems, as well as the relative maturation of start-ups operating presently targeting embedded devices, especially for smart home appliances. These applications are discussed in greater detail in the report, with a particular focus on the smartphone and automotive markets.

89.png

Smartphone price as compared to the node process that incumbent chipsets have been manufactured in. This plot has been created from a survey - carried out specifically for this report - of 196 smartphones released since 2020, 91 of which incorporate neural network architectures to allow for AI acceleration. Source: IDTechEx

Market developments and roadmaps
IDTechEx's model of the edge AI chips market considers architectural trends, developments in packaging, the dispersion/concentration of funding and investments, historical financial data, individual industry vertical market saturation, and geographically-localized ecosystems to give an accurate representation of the evolving market value over the next ten years.

Our report answers important questions such as:
  • Which industry verticals will AI chips for edge devices be used most prominently in?
  • What opportunities are there for growth within the edge computing environments?
  • How has the adoption of AI within more mature markets been received, and what are the obstacles to adoption in more emergent applications?
  • How will each AI chip application and industry vertical grow in the short and long-term?
  • What are the trends associated with the design and manufacture of chips that incorporate neural network architectures?

Summary
This report provides critical market intelligence concerning AI hardware at the edge, particularly chips used for accelerating machine learning workloads. This includes:

Market forecasts and analysis
  • Market forecasts from 2024-2034, segmented in six different ways: by geography, architecture, packaging, end-user, application and industry vertical.
  • Analysis of market forecasts, including assumptions, methodologies, limitations, and explanations for the characteristics of each forecast.

A review of the technology behind AI chips
  • History and context for AI chip design and manufacture.
  • Overview of different architectures.
  • General capabilities of AI chips.
  • Review of semiconductor manufacture processes, from raw material to wafer to chip.
  • Review of the physics behind transistor technology.
  • Review of transistor technology development, and industry/company roadmaps in this area.
  • Analysis of the benchmarking used in the industry for AI chips.

Surveys and analysis of key edge AI applications
  • Analysis of the chipsets included in almost 200 smartphones released since 2020, along with pricing estimations and key trends.
  • Analysis of the chipsets included in almost 50 tablets released since 2020, along with pricing estimations and key trends.
  • Performance comparisons for automotive chipsets, along with key trends with regards performance, power consumption, and efficiency.

Full market characterization for each major edge AI chip product
  • Review of the edge AI chip landscape, including key players across edge applications.
  • Profiles of 23 of the most prominent companies designing AI chips for edge applications today, with a focus on their latest and in-development chip technologies.
  • Reviews of promising start-up companies developing AI chips for edge applications.

Report Metrics | Details
Historic Data: 2019 - 2022
CAGR: The global market for AI chips at the edge will reach US$22.0 billion by 2034. This represents a CAGR of 7.63% over the forecast period (2024 to 2034).
Forecast Period: 2024 - 2034
Forecast Units: USD$ Billions
Regions Covered: Worldwide, All Asia-Pacific, North America (USA + Canada), Europe
Segments Covered: Geography (North America, APAC, Europe, Rest of World), architecture (FPGA, CPU, GPU, DSP, ASIC), packaging (SoC, MCM, 2.5D+), end-user (consumer, enterprise), application (computer vision, language, predictive), and industry vertical (consumer electronics, industrial, automotive, healthcare, retail, media & advertising, other).
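
A quick sanity check of those two quoted figures (the code and the implied 2024 value are my own arithmetic, not from the report):

# Working backwards from the quoted 2034 market size and CAGR to the implied 2024 base.
final_value_bn = 22.0   # quoted 2034 market size, US$ billions
cagr = 0.0763           # quoted compound annual growth rate
years = 10              # forecast period 2024 -> 2034

implied_2024_bn = final_value_bn / (1 + cagr) ** years
print(f"implied 2024 market size: US${implied_2024_bn:.1f}bn")   # roughly US$10.5bn

So the quoted CAGR corresponds to a market of roughly US$10-11 billion in 2024 growing to US$22 billion by 2034.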

Analyst access from IDTechEx
All report purchases include up to 30 minutes telephone time with an expert analyst who will help you link key findings in the report to the business issues you're addressing. This needs to be used within three months of purchasing the report.




Table of Contents
1.EXECUTIVE SUMMARY
1.1.Edge AI
1.2.IDTechEx definition of Edge AI
1.3.Edge vs Cloud characteristics
1.4.Advantages and disadvantages of edge AI
1.5.Edge devices that employ AI chips
1.6.The edge AI chip landscape - overview
1.7.The edge AI chip landscape - key hardware players
1.8.The edge AI chip landscape - hardware start-ups
1.9.The AI chip landscape - other than hardware
1.10.Edge AI landscape - geographic split: China
1.11.Edge AI landscape - geographic split: North America
1.12.Edge AI landscape - geographic split: Rest of World
1.13.Inference at the edge
1.14.Deep learning: How an AI algorithm is implemented
1.15.AI chip capabilities
2.FORECASTS
2.1.Total revenue forecast
2.2.Methodology and analysis
2.3.Estimating annual revenue from smartphone chipsets
2.4.Smartphone chipset costs
2.5.Costs garnered by AI in smartphone chipsets
2.6.Revenue forecast by geography
2.7.Percentage shares of market by geography
2.8.Chip types: architecture
2.9.Forecast by chip type
2.10.Semiconductor packaging timeline
2.11.From 1D to 3D semiconductor packaging
2.12.2D packaging - System-on-Chip
2.13.2D packaging - Multi-Chip Modules
2.14.2.5D and 3D packaging - System-in-Package
2.15.3D packaging - System-on-Package
2.16.Forecast by packaging
2.17.Consumer vs Enterprise forecast
2.18.Forecast by application
2.19.Forecast by industry vertical
2.20.Forecast by industry vertical - full
3.TECHNOLOGY: FROM SEMICONDUCTOR WAFERS TO AI CHIPS
3.1.Wafer and chip manufacture processes
3.1.1.Raw material to wafer: process flow
3.1.2.Wafer to chip: process flow
3.1.3.Wafer to chip: process flow
3.1.4.The initial deposition stage
3.1.5.Thermal oxidation
3.1.6.Oxidation by vapor deposition
3.1.7.Photoresist coating
3.1.8.How a photoresist coating is applied
3.1.9.Lithography
3.1.10.Lithography: DUV
3.1.11.Lithography: Enabling higher resolution
3.1.12.Lithography: EUV
3.1.13.Etching
3.1.14.Deposition and ion implantation
3.1.15.Deposition of thin films
3.1.16.Silicon Vapor Phase Epitaxy
3.1.17.Atmospheric Pressure CVD
3.1.18.Low Pressure CVD and Plasma-Enhanced CVD
3.1.19.Atomic Layer Deposition
3.1.20.Molecular Beam Epitaxy
3.1.21.Evaporation and Sputtering
3.1.22.Ion Implantation: Generation
3.1.23.Ion Implantation: Penetration
3.1.24.Metallization
3.1.25.Wafer: The final form
3.1.26.Semiconductor supply chain players
3.2.Transistor technology
3.2.1.How transistors operate: p-n junctions
3.2.2.How transistors operate: electron shells
3.2.3.How transistors operate: valence electrons
3.2.4.How transistors work: back to p-n junctions
3.2.5.How transistors work: connecting a battery
3.2.6.How transistors work: PNP operation
3.2.7.How transistors work: PNP
3.2.8.How transistors switch
3.2.9.From p-n junctions to FETs
3.2.10.How FETs work
3.2.11.Moore's law
3.2.12.Gate length reductions
3.2.13.FinFET
3.2.14.GAAFET, MBCFET, RibbonFET
3.2.15.Process nodes
3.2.16.Device architecture roadmap
3.2.17.Evolution of transistor device architectures
3.2.18.Carbon nanotubes for transistors
3.2.19.CNTFET designs
3.2.20.Semiconductor foundry node roadmap
3.2.21.Roadmap for advanced nodes
4.EDGE INFERENCE AND KEY APPLICATIONS
4.1.Inference at the edge and benchmarking
4.1.1.Edge AI
4.1.2.Edge vs Cloud characteristics
4.1.3.Advantages and disadvantages of edge AI
4.1.4.Edge devices that employ AI chips
4.1.5.AI in smartphones and tablets
4.1.6.Recent history: Siri
4.1.7.Text-to-speech
4.1.8.AI in personal computers
4.1.9.AI chip basics
4.1.10.Parallel computing
4.1.11.Low-precision computing
4.1.12.AI in speakers
4.1.13.AI in smart appliances
4.1.14.AI in automotive vehicles
4.1.15.AI in sensors and structural health monitoring
4.1.16.AI in security cameras
4.1.17.AI in robotics
4.1.18.AI in wearables and hearables
4.1.19.The edge AI chip landscape
4.1.20.Inference at the edge
4.1.21.Deep learning: How an AI algorithm is implemented
4.1.22.AI chip capabilities
4.1.23.AI chip capabilities
4.1.24.MLPerf - Inference
4.1.25.MLPerf Edge
4.1.26.Inference: Edge, Nvidia vs Nvidia
4.1.27.MLPerf Mobile - Qualcomm HTP
4.1.28.The battle for domination: Qualcomm vs MediaTek
4.1.29.MLPerf Tiny
4.2.AI in smartphones
4.2.1.Mobile device competitive landscape
4.2.2.Samsung and Oppo chipsets
4.2.3.US restrictions on China
4.2.4.Smartphone chipset landscape 2022 - Present
4.2.5.MediaTek and Qualcomm 2020 - Present
4.2.6.AI processing in smartphones: 2020 - Present
4.2.7.Node concentrations 2020 - Present
4.2.8.Chipset concentrations 2020 - Present
4.2.9.Chipset designer concentrations 2020 - Present
4.2.10.Node concentrations for each chipset designer
4.2.11.AI-capable versus non AI-capable smartphones
4.2.12.Chipset volume: 2021 and 2022
4.3.AI in tablets
4.3.1.Tablet competitive landscape
4.3.2.Tablet chipset landscape 2020 - Present
4.3.3.AI processing in tablets: 2020 - Present
4.3.4.Node concentrations 2020 - Present
4.3.5.Chipset designer concentrations 2021 - Present
4.3.6.Node concentrations for each chipset designer
4.3.7.AI-capable versus non AI-capable tablets
4.4.AI in automotive
4.4.1.AI in automobiles: Competitive landscape
4.4.2.Levels of driving automation
4.4.3.Computational efficiencies
4.4.4.AI chips for automotive vehicles
4.4.5.Performance and node trends
4.4.6.Rising power consumption
5.SUPPLY CHAIN PLAYERS
5.1.Smartphone chipset case studies
5.1.1.MediaTek: Dimensity and APU
5.1.2.Qualcomm: MLPerf results - Inference Mobile and Inference Tiny
5.1.3.Qualcomm: Mobile AI
5.1.4.Apple: Neural Engine
5.1.5.Apple: The ANE's capabilities and shortcomings
5.1.6.Google: Pixel Neural Core and Pixel Tensor
5.1.7.Google: Edge TPU
5.1.8.Samsung: Exynos
5.1.9.Huawei: Kirin chipsets
5.1.10.Unisoc: T618 and T710
5.2.Automotive case studies
5.2.1.Nvidia: DRIVE AGX Orin and Thor
5.2.2.Qualcomm: Snapdragon Ride Flex
5.2.3.Ambarella: CV3-AD685 for automotive applications
5.2.4.Ambarella: CVflow architecture
5.2.5.Hailo
5.2.6.Blaize
5.2.7.Tesla: FSD
5.2.8.Horizon Robotics: Journey 5
5.2.9.Horizon Robotics: Journey 5 Architecture
5.2.10.Renesas: R-Car 4VH
5.2.11.Mobileye
5.2.12.Mobileye: EyeQ Ultra
5.2.13.Texas Instruments: TDA4VM
5.3.Embedded device case studies
5.3.1.Nvidia: Jetson AGX Orin
5.3.2.NXP Semiconductors: Introduction
5.3.3.NXP Semiconductors: MCX N
5.3.4.NXP Semiconductors: i.MX 95 and NPU
5.3.5.Intel: AI hardware portfolio
5.3.6.Intel: Core
5.3.7.Perceive
5.3.8.Perceive: Ergo 2 architecture
5.3.9.GreenWaves Technologies
5.3.10.GreenWaves Technologies: GAP9 architecture
5.3.11.AMD Xilinx: ACAP
5.3.12.AMD: Versal AI
5.3.13.NationalChip: GX series
5.3.14.NationalChip: GX8002 and gxNPU
5.3.15.Efinix: Quantum architecture
5.3.16.Efinix: Titanium and Trion FPGAs
6.APPENDICES
6.1.List of smartphones surveyed
6.1.1.Appendix: List of smartphones surveyed - Apple and Asus
6.1.2.Appendix: List of smartphones surveyed - Google and Honor
6.1.3.Appendix: List of smartphones surveyed - Huawei, HTC and Motorola
6.1.4.Appendix: List of smartphones surveyed - Nokia, OnePlus, Oppo
6.1.5.Appendix: List of smartphones surveyed - realme
6.1.6.Appendix: List of smartphones surveyed - Samsung and Sony
6.1.7.Appendix: List of smartphones surveyed - Tecno Mobile
6.1.8.Appendix: List of smartphones surveyed - Xiaomi
6.1.9.Appendix: List of smartphones surveyed - Vivo and ZTE
6.2.List of tablets surveyed
6.2.1.Appendix: List of tablets surveyed - Acer, Amazon and Apple
6.2.2.Appendix: List of tablets surveyed - Barnes & Noble, Google, Huawei, Lenovo
6.2.3.Appendix: List of tablets surveyed - Microsoft, OnePlus, Samsung, Xiaomi

Ordering Information​

AI Chips for Edge Applications 2024-2034: Artificial Intelligence at the Edge​

Electronic (1-5 users)
$7,000.00
Electronic (6-10 users)
$10,000.00
Electronic and 1 Hardcopy (1-5 users)
$7,975.00
Electronic and 1 Hardcopy (6-10 users)
$10,975.00
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 24 users

7für7

Top 20
Bravo said:
Here's a sample of the report "AI Chips for Edge Applications 2024-2034". The full report costs about $11,000. What a bargain! Who wants to "chip" in for a copy?

BTW, we are listed in the Hardware Start-Up and New Players diagram.🥳 […]
11 thousand dollars for a full report? Jesus Christ… better to invest it in BRN, what the…..😱👻
 
  • Haha
  • Like
Reactions: 5 users