BRN Discussion Ongoing

TopCat

Regular

iMedia / Tech

Application of SNN in Vehicle Field

2023-02-03 22:41 HKT

Automakers are using neuromorphic technologies to implement AI functions such as keyword recognition, driver attention monitoring, and passenger behavior monitoring.

Mimicking biological brain processes is tantalizing because it promises advanced functionality without a significant increase in power consumption, which is EV-friendly. Neuromorphic computing and perception are also expected to bring other advantages, such as extremely low latency, enabling real-time decision-making in some cases. The combination of low latency and high energy efficiency is very attractive.

Spiking Neural Networks


The truth is, there is still much we don't know about how the human brain works. However, cutting-edge research has shown that neurons communicate by sending each other electrical signals called spikes, and that the sequence and timing of the spikes (rather than their size) are the key factors. Mathematical models of how neurons respond to these spikes are still being studied. But many scientists agree that if multiple spikes arrive at a neuron at the same time (or in very rapid succession), the information those spikes represent is correlated, causing the neuron to fire a spike of its own.

This is in contrast to artificial neural networks based on deep learning (the mainstream AI today), where information travels through the network in a regular rhythm: the information entering each neuron is represented as a numerical value rather than encoded in time.

Making a spike-based artificial system is not easy. Besides the fact that we don't fully understand how neurons work, there is no consensus on the best way to train spiking neural networks. Backpropagation requires computing derivatives, which is not possible with all-or-nothing spikes. Some companies approximate the derivative of the spike in order to use backpropagation (SynSense, for example), and some use another technique called STDP (spike-timing-dependent plasticity), which is closer to how biological brains function but is not yet mature as a technique (BrainChip uses this method for one-shot learning at the edge). It is also possible to take a deep learning CNN, trained by backpropagation in the normal way, and convert it to run in the spike domain (another technique used by BrainChip).
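To make the surrogate-gradient workaround concrete, here is a minimal PyTorch sketch of the general technique (not any particular vendor's implementation; the fast-sigmoid surrogate and its slope are illustrative choices). The forward pass emits a hard binary spike, while the backward pass substitutes a smooth surrogate for the true derivative, which is zero almost everywhere, so that backpropagation can proceed.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Hard threshold forward, smooth surrogate gradient backward."""

    @staticmethod
    def forward(ctx, membrane_potential):
        ctx.save_for_backward(membrane_potential)
        # Forward: binary spike when the membrane potential crosses threshold 0
        return (membrane_potential > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (membrane_potential,) = ctx.saved_tensors
        # Backward: a fast-sigmoid surrogate stands in for the true derivative,
        # which is zero everywhere except at the threshold itself
        slope = 10.0  # steepness hyperparameter (illustrative)
        surrogate = 1.0 / (slope * membrane_potential.abs() + 1.0) ** 2
        return grad_output * surrogate

spike = SurrogateSpike.apply  # drop-in nonlinearity for an SNN layer
```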


SynSense Speck

SynSense is working with BMW to advance the integration of neuromorphic chips in smart cockpits and to explore related fields together. BMW will evaluate SynSense's Speck SoC, which combines SynSense's neuromorphic vision processor with Inivation's 128x128-pixel event camera. The module can be used to capture visual information in real time, identify and detect objects, and perform other vision-based detection and interaction functions.

Dylan Muir, vice president of global research operations at SynSense, said: "When BMW replaces RGB cameras with Speck modules for visual perception, they can replace not only sensors, but also a lot of GPU or CPU computing required to process standard RGB visual streams."


Using event-based cameras provides a higher dynamic range than standard cameras, which is beneficial for use in extreme lighting conditions inside and outside the vehicle.

BMW will explore the use of neuromorphic technology in cars, including monitoring driver attention and passenger behavior through the Speck module.

"In the coming months, we will explore more applications inside and outside the vehicle," Muir said.


SynSense's neuromorphic vision processors have a fully asynchronous digital architecture. Each neuron uses integer logic, with 8-bit synaptic weights, a 16-bit neuron state, a 16-bit threshold, and unit (1-bit) input and output spikes. Neurons use a simple integrate-and-fire model: incoming spikes are weighted by the neuron's synaptic weights and accumulated in the neuron state until the threshold is reached, at which point the neuron fires a 1-bit spike. Overall, the design is a balance between complexity and computational efficiency, Muir said.
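As a rough illustration of the integer integrate-and-fire behavior Muir describes, here is a minimal sketch using the bit widths from the article; the reset-to-zero after firing is an assumption, since the reset scheme isn't specified.

```python
import numpy as np

def integrate_and_fire(weights, input_events, threshold):
    """Integer integrate-and-fire neuron: 8-bit weights, 16-bit state/threshold.

    weights:      int8 array, one synaptic weight per input line
    input_events: sequence of input line indices (each a unit spike)
    threshold:    int16 firing threshold
    Returns the list of 1-bit output spikes and the final neuron state.
    """
    state = np.int16(0)
    out_spikes = []
    for line in input_events:
        state = np.int16(state + weights[line])  # accumulate weighted spike
        if state >= threshold:
            out_spikes.append(1)                 # fire a 1-bit spike
            state = np.int16(0)                  # assumed reset-to-zero
        else:
            out_spikes.append(0)
    return out_spikes, state

# Example: three input lines, spikes arriving on lines 0, 0, 2
w = np.array([60, -20, 80], dtype=np.int8)
print(integrate_and_fire(w, [0, 0, 2], np.int16(150)))  # -> ([0, 0, 1], 0)
```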

SynSense's electronic neurons are based on the integrate-and-fire model


SynSense's digital chips are designed to process event-based CNNs, with each layer processed by a different core. The cores run asynchronously and independently; the entire processing pipeline is event-driven.

"Our Speck modules run in real-time with low latency," Muir said. "We can manage effective inference rates above 20Hz at less than 5mW power consumption. This is much faster than using traditional low-power computing on standard RGB video streams. ."

While SynSense and BMW will initially explore neuromorphic use cases in smart cockpits, the technology has potential for other automotive applications as well.


"First, we'll explore non-safety-critical use cases, and we're planning future versions of Speck with higher resolution, as well as improvements to our DynapCNN vision processor, which will interface with high-resolution sensors," Muir said. We plan for these future technologies It will support advanced automotive applications such as autonomous driving, emergency braking, etc."

SynSense and Inivation's Speck module, an event-camera-based module containing sensors and processors


BrainChip Akida

Mercedes-Benz's EQXX concept car, which debuted at CES earlier this year, uses BrainChip's Akida neuromorphic processor for in-cabin keyword recognition. Billed as "the most efficient car Mercedes has ever made," the car uses neuromorphic technology that consumes less power than a deep-learning-based keyword-spotting system. That's crucial for a car with a range of 620 miles, 167 miles more than Mercedes' flagship electric car, the EQS.

Mercedes said at the time that BrainChip's solution was five to 10 times more efficient than traditional voice controls at recognizing the wake word "Hey Mercedes."



Mercedes said, “Although neuromorphic computing is still in its infancy, systems like these will be on the market within a few years. When applied at scale throughout a vehicle, they have the potential to radically reduce the power consumption needed to run the latest AI technologies.”

BrainChip's CMO Jerome Nadel said: "Mercedes is focused on big issues like battery management and transmission, but every milliwatt counts, and when you think about energy efficiency, even the most basic inference, like spotting a keyword, is important."


A typical car could have as many as 70 different sensors by 2022, Nadel said. For cockpit applications, these sensors can enable face detection, gaze assessment, emotion classification, and more.

He said: “From a system architecture perspective, we can take a 1:1 approach, where a sensor does some preprocessing and the data is then forwarded. The AI does inference near the sensor... instead of the full array of data from the sensors, it passes the inference metadata forward.”

The idea is to minimize the size and complexity of the packets sent to AI accelerators, while reducing latency and minimizing power consumption. Each vehicle will likely have 70 Akida chips or sensors with Akida technology, each of them "low-cost parts that you won't notice at all," Nadel said. He noted that attention needs to be paid to the BOM cost of all these sensors.
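To see the order-of-magnitude saving Nadel is describing, compare a raw 128x128 frame with a hypothetical inference-metadata packet; the field names and layout below are illustrative, not BrainChip's actual format.

```python
from dataclasses import dataclass
import struct

RAW_FRAME_BYTES = 128 * 128  # one 8-bit 128x128 frame: 16,384 bytes

@dataclass
class InferenceMetadata:
    """Hypothetical packet a sensor-side NPU forwards instead of the frame."""
    timestamp_us: int   # event timestamp
    class_id: int       # e.g. 3 = "driver gaze off road" (made-up label)
    confidence: float   # 0.0 .. 1.0

    def pack(self) -> bytes:
        # 8-byte timestamp + 2-byte class + 4-byte confidence = 14 bytes
        return struct.pack("<QHf", self.timestamp_us, self.class_id, self.confidence)

pkt = InferenceMetadata(1_675_400_000_000, 3, 0.92).pack()
print(len(pkt), "bytes vs", RAW_FRAME_BYTES, "bytes raw")  # 14 vs 16384
```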


BrainChip expects to have its neuromorphic processor next to every sensor on the vehicle

Going forward, Nadel said, neuromorphic processing will also be used in ADAS and autonomous driving systems. This has the potential to reduce the need for other types of power-hungry AI accelerators.


"If every sensor could have Akida configured on one or two nodes, it would do adequate inference, and the data passed would be an order of magnitude less, because that would be inference metadata...that would affect the servers you need," he said. power."

BrainChip's Akida chip accelerates SNNs (spiking neural networks) and CNNs (by converting them to SNNs). It is not tailored to any specific use case or sensor, so it can be paired with vision sensing for face recognition or people detection, or with audio applications such as speaker ID. BrainChip has also demonstrated Akida with smell and taste sensors, although it is hard to imagine how these could be used in cars (perhaps to detect air pollution or fuel quality).

Akida is set up to handle native SNNs or deep learning CNNs that have been converted to SNNs. Unlike a native spiking network, a converted CNN preserves some of the CNN's activation-level information, so it may require 2- or 4-bit computation. However, this approach allows the properties of CNNs to be exploited, including their ability to extract features from large datasets. Both types of network can be updated at the edge using STDP; in the case of Mercedes-Benz, this might mean retraining the network after deployment to spot more or different keywords.
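One generic way to run a backprop-trained CNN in the spike domain, consistent with the 2- or 4-bit computation mentioned above, is to quantize its non-negative activations into low-bit spike counts (rate coding). The sketch below shows the idea, not BrainChip's actual conversion pipeline.

```python
import numpy as np

def activations_to_spike_counts(activations: np.ndarray, n_bits: int = 4) -> np.ndarray:
    """Quantize ReLU activations into n-bit spike counts per time window."""
    levels = 2 ** n_bits - 1                  # 4 bits -> up to 15 spikes
    a = np.clip(activations, 0.0, None)       # spike counts can't be negative
    scale = a.max() if a.max() > 0 else 1.0   # simple per-tensor scaling
    return np.round(a / scale * levels).astype(np.uint8)

acts = np.array([0.0, 0.1, 0.5, 1.2])
print(activations_to_spike_counts(acts))  # e.g. [ 0  1  6 15]
```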



According to Autocar, Mercedes-Benz confirmed that "many innovations" from the EQXX concept car, including "specific components and technologies," will be used in the production model. There's no word yet on whether new Mercedes-Benz models will feature artificial brains.






“SynSense is working with BMW to advance the integration of neuromorphic chips in smart cockpits and to explore related fields together. BMW will evaluate SynSense's Speck SoC, which combines SynSense's neuromorphic vision processor with Inivation's 128x128-pixel event camera. The module can be used to capture visual information in real time, identify and detect objects, and perform other vision-based detection and interaction functions.”

I wonder what the details of the partnership are? Who is working with BMW the most…SynSense or IniVation…or both equally? If IniVation were to release a better SoC, say their Aeveon SoC, and SynSense isn’t involved with that technology, what would BMW do?
 
  • Like
Reactions: 8 users

Learning

Learning to the Top 🕵‍♂️


Learning 🏖
 
  • Like
  • Fire
  • Love
Reactions: 41 users

Steve10

Regular
Interesting that Edge Impulse is a Nvidia Inception Premier member.

What Is NVIDIA Inception?

NVIDIA Inception is a free program designed to help startups evolve faster through access to cutting-edge technology and NVIDIA experts, opportunities to connect with venture capitalists, and co-marketing support to heighten your company’s visibility.

Program Benefits

Unlike traditional accelerators, NVIDIA Inception supports all stages of a startup’s life cycle. We work closely with members to provide the best technical tools, latest resources, and opportunities to connect with investors. As your startup matures, your program benefits also evolve to further your growth. Premier members receive increased NVIDIA marketing support, access to Premier-only member events, and a dedicated NVIDIA relationship manager.

Get Started

You’re encouraged to apply to NVIDIA Inception no matter what your current funding stage is. There are no application deadlines, cohorts, or term limits.


NVIDIA Inception Premier Members

North America

Edge Impulse

Enabling the ultimate development experience for machine learning on embedded devices for sensors, audio, and computer vision at scale.
 
  • Like
  • Fire
Reactions: 14 users

alwaysgreen

Top 20
Sure….

News has been out for 5 mins and we’re already picking them apart. Get a grip.

In terms of lightening up - I’m a retired 34yo.

Eat a dick hahahahaha



I made a joke about a tech company and you reacted like I was talking about your mum.

Despite nobody asking, please keep telling me how amazing you are though. I'm all ears.
 
  • Haha
  • Like
  • Love
Reactions: 20 users

MrNick

Regular
I see they’ve released Andrew Tate.
 
  • Haha
  • Like
Reactions: 8 users

FJ-215

Regular
  • Haha
  • Fire
  • Like
Reactions: 10 users

FlipDollar

Never dog the boys
Fair play - I’ll delete my posts.
 
  • Like
Reactions: 6 users

Boab

I wish I could paint like Vincent
  • Like
  • Fire
  • Love
Reactions: 52 users
"The preference is for a prototype processor fabricated in a technology node suitable for the space environment, such as 22-nm FDSOI, which has become increasingly affordable"

Timing is everything.

BrainChip Tapes Out AKD1500 Chip in GlobalFoundries 22nm FD SOI Process
Didn't NASA prefer 28nm before? Now they seem to prefer 22nm 🤔🤔
 
  • Like
  • Fire
Reactions: 15 users
D

Deleted member 118

Guest
So is NASA having current issues with neuromorphic processors in SWaP missions?

State of the Art and Critical Gaps:

Neuromorphic and deep neural net software for point applications has become widespread and is state of the art. Integrated solutions that achieve space-relevant mission capabilities with high throughput and energy efficiency are a critical gap. For example, terrestrial neuromorphic processors such as Intel Corporation's Loihi™, BrainChip's Akida™, and Google Inc.'s Tensor Processing Unit (TPU™) require full host processors for integration with their software development kits (SDKs), which are power hungry or limit throughput. This by itself is inhibiting the use of neuromorphic processors for low-SWaP space missions.

The system integration principles for integrated combinations of neuromorphic software is a critical gap that requires R&D, as well as the efficient mapping of integrated software to integrated avionics hardware. Challenges include translating the throughput and energy efficiency of neuromorphic processors from the component level to the system level, which means minimizing the utilization and processing done by supportive CPUs and GPUs.
 
  • Like
  • Thinking
Reactions: 5 users
D

Deleted member 118

Guest
Some sort of redundancy is likely to be needed. I wonder who?

The innovation, as compared to terrestrial processors, is to incorporate mechanisms for fault tolerance in an edge processor capable of machine learning with high power efficiency. Some type of redundancy will likely be needed. Johann Schumann's incorporation of triple modular redundancy for Loihi is one example of a mechanism that masks faults, but at the expense of an overall 3x reduction in power efficiency. In a neuromorphic context with stochasticity, more efficient fault-tolerance techniques might be developed.
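For reference, triple modular redundancy is conceptually simple: run three copies of the computation and majority-vote the results, which masks any single faulty copy at roughly 3x the compute and power cost. A minimal sketch:

```python
from collections import Counter

def tmr_vote(outputs):
    """Majority vote over three redundant module outputs.

    Masks a single faulty module; if two or more faults agree,
    the vote can still be wrong, which is TMR's known limit.
    """
    winner, _count = Counter(outputs).most_common(1)[0]
    return winner

# One of the three redundant inferences returns a faulty result
print(tmr_vote([1, 1, 0]))  # -> 1: the faulty copy is out-voted
```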
 
  • Like
Reactions: 3 users

Andy38

The hope of potential generational wealth is real
OK,
Deleted last three attempts.
We got the wrong guy!

It's this bloke

With more than 20 years experience of developing and applying emerging technologies to industry, Bhasker is a results-focused business leader with an in-depth understanding of deploying IoT, AI and Blockchain in the utilities and energy sector.

During his career, Bhasker has founded, led and advised companies focused on delivering innovative solutions to their customers, especially in the manufacturing sector. He brings a strong combination of technology, client relationship and commercial skills to any business challenge, plus a passion for innovation and creativity that has seen him develop five patents.

In his current role as CEO of Fortech Energy, he leads the development of enterprise automation and IoT solutions for the power, gas, water and manufacturing industry. Bhasker has a Physics degree from the Indian Institute of Technology in Bombay, plus a PhD in Electrical Engineering from the University at Buffalo in New York.

Here is link to Rob Telson post about Edge impulse official support for Brainchip
Bao Bhasker (spelt with an "e") responds with a congrats etc.

Great find @Taproot!
Fortech Energy is part of the group of a number of Enzen companies. The reach of this group appears to be ridiculously crazy! The Minsky cube video ( can’t remember who posted, apologies) states it can be applied to ALL GAS, ALL OIL, ALL POWER, ALL WATER companies and their manufacturing operations! I’m blown away reading and listening to what is unfolding. Links to cybersecurity, drone and smart city applications… bring it on! 💪💪
 
  • Like
  • Fire
  • Love
Reactions: 26 users

Diogenese

Top 20
I suppose that's ok on the "No publicity is bad publicity" principle.

" ... terrestrial neuromorphic processors such as Intels Cooporation's LoihiTM, Brainchip's AkidaTM, and Google Inc's Tensor Processing Unit (TPUTM) require full host processors for integration for their software development kit (SDK) that are power hungry or limit throughput. This by itself is inhibiting the use of neuromorphic processors for low SWaP space missions."

The redeeming feature is:

"The preference is for a prototype processor fabricated in a technology node suitable for the space environment, such as 22-nm FDSOI, which has become increasingly affordable."

So, suddenly, Global Foundries is about to make Akida 1500 in FD-SOI, without the encumbrance of a full host processor.
 
  • Like
  • Fire
  • Love
Reactions: 62 users

wilzy123

Founding Member
Very, thanks for asking.

How about you? Good weekend I trust.


@FJ-215: "If they need the extra 10M of shares it will be because the 30M wasn't enough to meet our $15 M obligation."

The conditions for this capital call are clear as day (to some/most?)

The likelihood that BRN will hit their $15 target, with shares to spare (i.e. returned to BRN), is very probable (do you need me to calculate the VWAP since the call was made, or you got that?).
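(For anyone who wants to check: VWAP is just total traded value divided by total traded volume over the period. A toy sketch with made-up numbers, not real BRN data:)

```python
def vwap(trades):
    """Volume-weighted average price over (price, volume) pairs."""
    total_value = sum(price * volume for price, volume in trades)
    total_volume = sum(volume for _, volume in trades)
    return total_value / total_volume

# Illustrative daily (price, volume) figures only
print(round(vwap([(0.60, 1_000_000), (0.58, 500_000), (0.62, 750_000)]), 4))  # 0.6022
```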

Yet, you have chosen to ignore all of this when called out....... ok then lol (y)(y)
 
  • Like
  • Thinking
Reactions: 4 users

Proga

Regular
“SynSense is working with BMW to advance the integration of neuromorphic chips in smart cockpits and to explore related fields together. BMW will evaluate SynSense's Speck SoC, which combines SynSense's neuromorphic vision processor with Inivation's 128x128-pixel event camera. The module can be used to capture visual information in real time, identify and detect objects, and perform other vision-based detection and interaction functions.”

I wonder what the details of the partnership are? Who is working with BMW the most…SynSense or IniVation…or both equally? If IniVation were to release a better SoC, say their Aeveon SoC, and SynSense isn’t involved with that technology, what would BMW do?
I think this is BMW's second mistake. The first was using Qualcomm's Snapdragon, and they knew it (which I'd previously mentioned), so they decided to investigate SynSense.

The interesting paragraph (copied below) means BMW is years behind MB, who have already built a vehicle that is running around being tested using (I suspect) Akida in all their sensors + Valeo's gen-3 lidar. This will be the backbone of their L3-L4 autonomous driving system into the near-to-medium-term future in every model, starting with the above AAM platform in the 2nd half of 2024.

"First, we'll explore non-safety-critical use cases, and we're planning future versions of Speck with higher resolution, as well as improvements to our DynapCNN vision processor, which will interface with high-resolution sensors," Muir said. We plan for these future technologies It will support advanced automotive applications such as autonomous driving, emergency braking, etc."

Nadel's comment brought joy: "The idea is to minimize the size and complexity of the packets sent to AI accelerators, while reducing latency and minimizing power consumption. Each vehicle will likely have 70 Akida chips or sensors with Akida technology, each of them low-cost parts that you won't notice at all." He noted that attention needs to be paid to the BOM cost of all these sensors. Cha-ching.

Great find @Tothemoon24 .

I was hoping if I scrolled down further @Diogenese you'd have made a comment to confirm my own thoughts. Your second effort was exactly what I was looking for so thanks mate.
 
  • Like
  • Fire
  • Wow
Reactions: 29 users

Xray1

Regular
  • Like
  • Thinking
Reactions: 3 users

FJ-215

Regular
BrainChip Holdings Announces new Capital Call Notice under continuing agreement with LDA Capital

A few observations on this announcement.

Looks like BRN are chasing the whole $30M available in one call. LTHs will notice that the time frame is longer than in previous periods: originally 30 days, subsequent calls were reduced to a couple of weeks (possibly to reduce shorting impact), but this call is running for months. Expect some volatility; it might be a traders' paradise for a while, but we should see some very decent gains in the SP.

Very happy to see this announcement, calls on LDA usually come with good news.
But, but, but,

Do we have good news??????
 
  • Like
Reactions: 4 users

Proga

Regular
With the conversation in the IT space now well and truly focussing on AI, the BRN SP really needs a commercial application to hit the market sooner rather than later to take advantage. There are still too many doubters who think it is science fiction. If BRN does get caught up in the euphoria, expect a lot of volatility until a commercial application does hit the market. The below was in Bloomberg today (BRN not mentioned, just some of the Tier 1s).

AI-Fueled Stock Rally Ignites Debate on Who’s Winning and Losing

  • UBS says industry could be worth more than a trillion dollars
  • Microsoft, Alphabet, other tech giants compete to be leaders
Microsoft Corp. Chief Executive Officer Satya Nadella Delivers A Keynote Speech

Photographer: SeongJoon Cho/Bloomberg
By
Bailey Lipschultz
5 February 2023 at 00:00 GMT+10
Stock rallies tied to the latest market craze may be a good investment, or just fear of missing out.
That debate, which hits on the theme of several boom-to-bust cycles in some corners of the market, has been reignited after artificial intelligence-linked companies saw their shares pop as tech giants like Microsoft Corp. announced big investments in the industry.
The frenzy conjures memories of the crypto and cannabis crazes and even the dot-com bubble in the late 1990s, where investors piled into stocks and asked questions later. After Friday’s close, a handful of firms added some $5.2 billion in market value despite announcements that look more like half-baked plans.
“It is total buyer-beware unless you know what you’re doing and have proper risk management,” said Matthew Tuttle, CEO of Tuttle Capital Management. “You can’t just go in and buy any company that says they’re in AI.”
Generative AI, a label used to describe artificial intelligence technology that can create things like art or text, and ChatGPT — the popular tool owned by OpenAI which Microsoft just invested $10 billion in — have been mentioned in more than 165 earnings calls and press releases this year so far, more than the number of mentions in all of 2022.
The citations come with good reason. The market for generative AI could be “exceedingly large,” easily “in excess of a trillion dollars,” according to UBS analysts led by Lloyd Walmsley. The potential windfall is enticing investors and underscores the fine line between trend-chasing companies hoping to make a quick return and those with goals to utilize a technological advancement to drive growth and make billions.
BuzzFeed Inc., a struggling media company, soared 307% in a two-day frenzy in January on veiled plans to utilize AI. Similarly, shares of software company C3.ai Inc. have more than doubled this year after it touted its AI plans in a statement, with $48 million in buying from retail traders, according to data from Vanda Research. Meanwhile, semiconductor maker Nvidia Corp., which Wall Street analysts have touted as a beneficiary of greater investment in AI, posted its best month in almost six years, with shares rallying 34% in January.
AI Euphoria | Stocks with exposure to artificial intelligence see big stock rallies
But investors should proceed with caution as some companies are already starting to falter. BuzzFeed shares have wiped out more than 40% of their value from an intraday high for the year last week and video game software maker Versus Systems Inc. is trading back under $1 a share after selling 2.5 million shares to capitalize on its rally.
For now, despite signs of speculative froth, AI has yet to reach the heights of the blockchain madness, when Long Island Iced Tea Corp. rebranded itself as Long Blockchain Corp. to avoid delisting.
Tech heavyweights looking to use AI to grow revenue are likely the best places for investors to park their cash, according to Tuttle.

There’s certainly lots of opportunities out there. Meta Platforms Inc. Chief Executive Officer Mark Zuckerberg, for example, recently said one of his main goals is to be a leader in generative AI. Alphabet CEO Sundar Pichai already sees Google as a leader in developing AI, noting during the company’s earnings call this week that he spoke about Google being an AI-first company more than six years ago. Meanwhile, Snap Inc. CEO Evan Spiegel sees generative AI as a “huge opportunity” that they’re “already investing a ton” in.
Read more: Microsoft, Meta Among Tech Giants Talking AI on Earnings Calls
The return of market liquidity as speculative corners of the investing world soar is another signal that investors should exercise caution, Tuttle says. Many of the riskiest corners of the stock market have rallied to start the year: Cathie Wood’s ARK Innovation ETF is up 37%, a basket of so-called meme stocks (a group completely detached from fundamentals) is 22% higher, and Bitcoin, the world’s largest cryptocurrency by market value, is up roughly 40%.
“Financial conditions are loose and it’s the perfect environment for anything new and AI is the new kid on the block right now,” he said. “If you know what you’re doing, great, have at it. If you’re a FOMO person, stay away until this gets a little more settled.”
 
  • Like
  • Fire
Reactions: 14 users