BRN Discussion Ongoing

stuart888

Regular
While I love the washing machine AI tech, healthcare, industrial sensors, and on-board ADAS safety are more massive use cases. Seems like the BrainChip spiking NPU can shine in higher-value use cases; space satellite AI is one example.

Just trying to learn and figure this out.
 
  • Like
  • Fire
Reactions: 3 users

HopalongPetrovski

I'm Spartacus!
Yeah Bravo. Just so interesting that Samsung is tossing around AI in their advertisements via the AI Ecobubble. I like to look at the macro view of this event.

Very interesting. It kind of means "invest in AI". They are spending their marketing dollars boosting AI, which helps everyone in the industry.

Elevating User Experiences with AI-Based Technologies​

While the new lineup improves energy efficiency, the core functions also help users reduce the effort of treating their laundry.

AI Wash takes the guesswork out of the perfect wash. Four sensors weigh your wash load and check the level of soiling to adjust the water, amount of detergent dispensed, rinse time and spin speed* for great results and minimal wear and tear with every wash.

In addition to that, AI Ecobubble goes one step further by detecting wash loads of up to 2kg.
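The control loop Samsung is describing can be pictured with a small sketch. Everything below is invented for illustration (function name, thresholds, units); it is not Samsung's actual logic, just the general shape of mapping sensor readings to wash parameters:

```python
# Hypothetical illustration of sensor-driven wash adjustment, loosely
# modeled on the quoted PR. Thresholds and settings are made up.

def choose_wash_settings(load_kg: float, soil_level: int) -> dict:
    """Map two sensor readings (load weight in kg, soil level 0-10)
    to wash parameters, as the PR describes at a high level."""
    water_litres = 10 + 5 * load_kg          # bigger load -> more water
    detergent_ml = 20 + 4 * soil_level       # dirtier -> more detergent
    rinse_minutes = 8 if soil_level > 5 else 5
    spin_rpm = 800 if load_kg > 4 else 1200  # heavy loads spin slower
    return {
        "water_litres": water_litres,
        "detergent_ml": detergent_ml,
        "rinse_minutes": rinse_minutes,
        "spin_rpm": spin_rpm,
    }

print(choose_wash_settings(3.0, 7))
```

The real appliance presumably runs something far more tuned, but the idea is the same: a handful of sensor inputs drive every downstream wash parameter.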
Hey Stuart.
Have you been getting into Bravo's supply of catnip? 🤣
You're on fire recently.

 
  • Haha
  • Like
  • Love
Reactions: 19 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Neil Tyler
05 Apr 2024
Mouser and Edge Impulse to provide access to ML development platform​

News
1 min read
Mouser Electronics has announced a new global partnership with Edge Impulse, a development platform that enables machine learning (ML) on edge devices.
Credit: Mouser
The agreement will see the provision of advanced intelligence to a wide range of products and devices, from low-power MCUs to efficient Linux CPU targets and GPUs.
This collaboration is being described as a significant one for the electronics industry, as it brings hardware and AI/ML closer together, combining the platform and tools of Edge Impulse with Mouser's expansive range of compatible hardware products.
Mouser customers will now have access to the Edge Impulse software platform through relevant product microsites and will be able to learn more about how they can use ML with their designs. Similarly, Edge Impulse users will have access to compatible hardware, including microcontrollers and development kits, directly from Mouser.
"Our partnership with Edge Impulse is an exciting proposition for both companies, as well as the wider engineering community," said Kevin Hess, Senior Vice President of Marketing, Mouser Electronics. "Its innovative platform is revolutionising ML at the edge, making it accessible for a wide range of users, from individuals to large corporations. Teaming up with Edge Impulse allows us to provide its users with easy access to the hardware they need, at the scale they desire."
"Edge Impulse's alliance with Mouser Electronics accelerates our mission of empowering enterprises and developers worldwide with cutting-edge AI tools," added Zach Shelby, CEO and co-founder of Edge Impulse. "Mouser's global reach and industry-leading distribution services of electronic components will help us promote edge AI development to all industries."
The popularity of edge ML and artificial intelligence (AI) solutions in industries such as manufacturing, healthcare, and consumer electronics underscores the importance for engineers of having access to effective development platforms and appropriate products.
The Edge Impulse platform streamlines deployment and scaling and enables rapid implementation. As a hardware-agnostic solution, it enables engineers and students to develop sophisticated ML and AI solutions, leveraging a variety of hardware options to expand the range of applications.
This global collaboration is designed to remove barriers preventing edge AI and ML development and simplify access to hardware products, streamlining development. The partnership builds on previous successful collaborations between the two companies, initially, with BrickML, a standalone tool designed by Edge Impulse and manufactured by Reloc. This system-on-module (SoM) allows engineers to develop and deploy ML at the edge and is exclusively distributed through Mouser.

 
  • Like
  • Love
  • Fire
Reactions: 19 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
What does everyone think of Sima.ai? There have been previous discussions about whether or not they could be a potential licensee. Plus, I know that various Sima.ai posts have been "liked" by both LDN and Anil in the past.



Race to the gen AI edge heats up as Dell invests in SiMa.ai​

James Thomason@jathomason
April 5, 2024 1:22 PM
Edge computing digital background - stock photo

technology and innovation concept
Image Credit: MF3d



SiMa.ai, a chip startup developing a software-centric edge AI solution, yesterday announced a $70 million funding round that highlights the growing interest and investment in edge AI. But it’s the participation of Dell Technologies Capital, the technology titan’s strategic investment arm, that signals a major vote of confidence in SiMa.ai’s approach and a shared vision for the future of AI at the edge.
The investment stands out as the only “hard tech” deal in Dell Technologies Capital’s portfolio in the past 12 months, according to data from Pitchbook. This underscores the potential of edge AI to drive new use cases for Dell’s products and unlock value for enterprises. SiMa.ai’s approach to simplifying AI deployment and management at the edge aligns closely with Dell’s product portfolio and go-to-market strategy, positioning the tech giant to capitalize on the growing demand for AI at the edge.



Edge computing is so back

Over the last decade, the use of edge computing focused primarily on industrial deployments, connecting machines and harvesting data from sensors, all of which are use cases with relatively low computational requirements. IoT was a major driver of edge computing in retail, heavy industry, logistics and supply chain services. However, despite significant investments in IoT projects over the past decade, many enterprises struggled to derive clear business value from these initiatives.
Today, the rise of AI is breathing new life into the edge computing market. Over the past three years, IT solution providers like Dell and HPE have been quickly transforming their edge offerings, evolving from simple gateway devices to powerful, ruggedized servers capable of handling more computationally intensive workloads like AI. According to analysts at Fortune Business Insights and MarketsandMarkets, the global edge computing market is expected to at least double in the next few years.

SiMa.ai’s machine learning system-on-chip (MLSoC) technology could integrate with Dell’s edge computing offerings, such as the PowerEdge XR series of ruggedized servers, enabling the company to deliver generative AI use cases to the edge.
Credit: Koyfin

Generative AI at the edge

“AI — particularly the rapid rise of generative AI — is fundamentally reshaping the way that humans and machines work together,” said Krishna Rangasayee, founder and CEO at SiMa.ai. “Our customers are poised to benefit from giving sight, sound and speech to their edge devices, which is exactly what our next-generation MLSoC is designed to do.”

In the last year, we have all witnessed the emergence of generative AI in chatbots and virtual assistants, but the potential applications of generative AI at the edge are enormous. According to IDC’s Future Enterprise Resiliency and Spending Survey, 38% of enterprises expect to see improved personalization of employee experiences in areas like call centers and customer interaction through the use of AI at the edge.

For example, in retail, voice-assisted shopping experiences could revolutionize customer engagement, with AI-powered systems offering personalized product recommendations, answering queries and even guiding customers through virtual try-ons. Restaurants could leverage interactive AI-driven menus and ordering systems, enhancing the dining experience while optimizing kitchen operations based on real-time demand forecasting.
Beyond consumer-facing applications, generative AI at the edge could transform industrial operations and supply chain management. Autonomous quality control systems could identify defects and anomalies in real time, learning from past data to continuously improve their accuracy. Predictive maintenance models could analyze sensor data and generate proactive alerts, minimizing downtime and optimizing resource allocation. In logistics, AI-powered demand forecasting and route optimization could streamline operations, reducing costs and improving delivery times.

Generative AI is also poised to transform industrial operations and supply chain management across multiple fronts, according to the World Economic Forum (WEF). The use of large language models (LLMs) could automatically generate maintenance instructions, standard operating procedures and other textual assets, driving process automation. Additionally, deploying LLMs could enable robots and machines to comprehend and act upon voice commands without task-specific training or frequent retraining. Autonomous quality control powered by generative AI could identify defects and anomalies in real-time, continuously learning from past data to improve accuracy, while predictive maintenance models analyzing sensor information could generate proactive alerts, minimizing downtime while optimizing resource allocation.
An analysis from McKinsey suggests the healthcare sector could also benefit greatly from generative AI at the edge. Real-time patient monitoring systems could analyze vital signs, generate early warning alerts, and provide personalized treatment recommendations. AI-assisted diagnostic tools could help healthcare professionals make more accurate and timely decisions, improving patient outcomes and reducing the burden on overworked medical staff.

The challenge of generative AI edge deployments

Deploying generative AI models at the edge is particularly challenging because it requires balancing the need for fast, real-time responses with the ability to leverage local data for personalization, as highlighted in a recent IEEE working paper. These models must quickly adapt to new information and user behaviors directly at the edge, where computational resources are more limited than in centralized cloud environments. This requires AI models that are not only efficient and responsive but also capable of learning and evolving from localized datasets to provide tailored services. The dual demands of speed and personalization, all within the resource-constrained context of edge computing, underscore the complexity of deploying generative AI in such settings.
SiMa.ai claims to overcome these hurdles with its machine learning system-on-chip (MLSoC), designed specifically for edge AI use cases. Unlike other solutions, which often pair a machine learning (ML) accelerator with a separate processor, its MLSoC is said to integrate everything needed for edge AI into a single chip: specialized processors for computer vision and machine learning, a high-performance Arm processor, and efficient memory and interfaces. The result is a compact, power-efficient solution that simplifies the deployment of AI at the edge. This combination of features could make SiMa.ai’s platform attractive to infrastructure providers like Dell looking to bring powerful AI capabilities to edge devices.


The race to the AI edge is on

As enterprises increasingly seek to harness the power of AI at the edge, Dell’s strategic investment in SiMa.ai suggests that edge computing may have finally found its killer use case in AI. With SiMa.ai’s platform and Dell’s edge computing strategy aligned, the future of edge AI looks brighter than ever, promising to transform the way businesses operate and interact with their customers.
The market has already picked who it thinks the winners in AI will be, with Dell stock up 70.83% YTD, HPE up 7.08% and Cisco down 2.49%. Meanwhile, Supermicro has seen its stock soar a staggering 232% YTD, largely due to expectations of increased data center sales. However, as Dell’s investment in SiMa.ai suggests, the edge could be the next critical lap in the race.
Of course, this is just the beginning. Over the next few years, we can expect to see a flurry of strategic investments and acquisitions as major tech companies race to stake their claim in the edge AI market. The fight to bring powerful AI capabilities to the enterprise edge could put a strain on existing partnerships, reminiscent of the virtualization era when we saw the disruptive VCE alliance between VMware, Cisco and EMC, which ultimately sparked the enormous merger between Dell and EMC.
 
  • Like
  • Thinking
  • Love
Reactions: 20 users

stuart888

Regular
Hey Stuart.
Have you been getting into Bravo's supply of catnip? 🤣
You're on fire recently.


Love how when I click to respond, I still have the video going! The video is gone from the screen, but I can still listen.
Nomatic Fanatic is an interesting chap, a YouTuber living in an RV with two cats. I like the guy.

This is part of my low-key, non-learning time! I like all folks, cats, everyone.

 

stuart888

Regular
Or Felix! This guy is my favorite. Watch him daily. Lives in Hong Kong, so witty.
Just a gem of a chap. He is flying to an island tomorrow.

All about Free Cash Flow and profits, and cats/dogs!!!!



 
  • Like
Reactions: 4 users

Bravo

If ARM was an arm, BRN would be its biceps💪!

SiMa.ai secures $70M funding to introduce a multimodal GenAI chip​

Jagmeet Singh@jagmeets13 / 11:00 PM GMT+11•April 4, 2024
Comment
SiMa.ai founder Krishna Rangasayee

Image Credits: SiMa.ai
SiMa.ai, a Silicon Valley–based startup producing embedded machine learning (ML) system-on-chip (SoC) platforms, today announced that it has raised a $70 million extension funding round as it plans to bring its second-generation chipset, specifically built for multimodal generative AI processing, to market.
According to Gartner, the global market for AI-supporting chips is forecast to more than double from 2023 to $119.4 billion by 2027. However, only a few players have started producing dedicated semiconductors for AI applications. Most of the prominent contenders initially focused on supporting AI in the cloud. Nonetheless, various reports have predicted significant growth in the market for AI on the edge, meaning the hardware processing AI computations sits closer to the data-gathering source rather than in a centralized cloud. SiMa.ai, named after “seema,” the Hindi word for “boundary,” strives to leverage this shift by offering its edge AI SoC to organizations across the industrial manufacturing, retail, aerospace, defense, agriculture and healthcare sectors.
The San Jose–headquartered startup, which targets the market segment between 5W and 25W of energy usage, launched its first ML SoC to bring AI and ML through an integrated software-hardware combination. This includes its proprietary chipset and no-code software called Palette. The combination has already been used by over 50 companies globally, Krishna Rangasayee, the founder and CEO of SiMa.ai, told TechCrunch.

The startup touts that its current generation of the ML SoC delivered the highest FPS/W results on the MLPerf benchmark across the MLPerf Inference 4.0 closed, edge and power division categories. However, the first-generation chipset was focused on classic computer vision.
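For context on that benchmark figure, FPS/W is just measured throughput divided by power draw. A toy comparison (the chip names and numbers here are made up, not MLPerf results):

```python
def fps_per_watt(fps: float, watts: float) -> float:
    """Frames per second per watt: higher means more efficient inference."""
    return fps / watts

# Hypothetical devices: (throughput in FPS, power draw in watts)
chips = {"chip_a": (500.0, 10.0), "chip_b": (900.0, 25.0)}
for name, (fps, watts) in chips.items():
    # chip_a works out to 50.0 FPS/W, chip_b to 36.0 FPS/W
    print(f"{name}: {fps_per_watt(fps, watts):.1f} FPS/W")
```

The point of the metric is that a chip with lower absolute throughput can still win the efficiency ranking if its power draw is low enough, which is exactly the trade-off that matters in the 5W-25W edge segment SiMa.ai targets.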
As the demand for GenAI is growing, SiMa.ai is set to introduce its second-generation ML SoC in the first quarter of 2025 with an emphasis on providing its customers with multimodal GenAI capability. The new SoC will be an “evolutionary change” over its predecessor with “a few architectural tunings” over the existing ML chipset, Rangasayee said. He added that the fundamental concepts would remain the same.
The new GenAI SoC would adapt to any framework, network, model and sensor — similar to the company’s existing ML platform — and will also be compatible with any modality, including audio, speech, text and image. It would work as a single-edge platform for all AI across computer vision, transformers and multimodal GenAI, the startup said.
“You cannot predict the future, but you can pick the vector and say, hey, that’s the vector I want to bet on. And I want to continue evolving around my vector. That’s kind of the approach that we took architecturally,” said Rangasayee. “But fundamentally, we really haven’t walked away or had to drastically change our architecture. This is also the benefit of us taking a software-centric architecture that allows more flexibility and nimbleness.”

SiMa.ai has Taiwan’s TSMC as the manufacturing partner for both its first- and second-generation AI chipsets and Arm Holdings as the provider for its compute subsystem. The second-generation chipset will be based on TSMC’s 6nm process technology and include Synopsys EV74 embedded vision processors for pre- and post-processing in computer vision applications.

The startup counts incumbents like NXP, Texas Instruments, STMicro, Renesas and Microchip Technology, as well as Nvidia and AI chip startups like Hailo, among its competition. However, like other AI chip startups, it regards Nvidia as its primary competitor.
Rangasayee told TechCrunch that while Nvidia is “fantastic in the cloud,” it has not built a platform for the edge. He believes that Nvidia lacks adequate power efficiency and software for edge AI. Similarly, he asserted that other startups building AI chipsets do not solve system problems and are just offering ML acceleration.
“Amongst all of our peers, Hailo has done a really good job. And it’s not us being better than them. But from our perspective, our value proposition is quite different,” he said.
The founder continued that SiMa.ai delivers higher performance and better power efficiency than Hailo. He also said SiMa.ai’s system software is quite different and effective for GenAI.

“As long as we’re solving customer problems, and we are better at doing that than anybody else, we are in a good place,” he said.
SiMa.ai’s fresh all-equity funding, led by Maverick Capital and with participation from Point72 and Jericho, extends the startup’s $30 million Series B round, initially announced in May 2022. Existing investors, including Amplify Partners, Dell Technologies Capital, Fidelity Management and Lip-Bu Tan also participated in the additional investment. With this fundraising, the five-year-old startup has raised a total of $270 million.
The company currently has 160 employees, 65 of whom are at its R&D center in Bengaluru, India. SiMa.ai plans to grow that headcount by adding new roles and extending its R&D capability. It also wants to develop a go-to-market team for Indian customers. Further, the startup plans to scale its customer-facing teams globally, starting with Korea and Japan and in Europe and the U.S.
“The computational intensity of generative AI has precipitated a paradigm shift in data center architecture. The next phase in this evolution will be widespread adoption of AI at the edge. Just as the data center has been revolutionized, the edge computing landscape is poised for a complete transformation. SiMa.ai possesses the essential trifecta of a best-in-class team, cutting-edge technology, and forward momentum, positioning it as a key player for customers traversing this tectonic shift. We’re excited to join forces with SiMa.ai to seize this once-in-a-generation opportunity,” said Andrew Homan, senior managing director at Maverick Capital, in a statement.
 
  • Like
  • Thinking
  • Wow
Reactions: 10 users

stuart888

Regular
Brainchip and Friends. Seems like this forum is for those bullish on Brainchip, and friends. We all have other stocks.

Why would a bearish person even show up here? They should be focused on whatever they are bullish on.

Brainchip and Friends is all about AI moving the ecosystem forward.

Hopefully Brainchip grabs a chunk, but no matter what, we are also buying all the friends (AVGO, ANET, CDNS, GOOGL, ARM, etc.).

We love friends! Maybe Brainchip will shine, but we know the group will.
 

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Like
  • Love
  • Fire
Reactions: 23 users

Getupthere

Regular


Race to the gen AI edge heats up as Dell invests in SiMa.ai
 
  • Like
  • Thinking
  • Love
Reactions: 10 users

Diogenese

Top 20
What does everyone think of Sima.ai? There have been previous discussions about whether or not they could be a potential licensee. Plus, I know that various Sima.ai posts have been "liked" by both LDN and Anil in the past.



Race to the gen AI edge heats up as Dell invests in SiMa.ai​

James Thomason@jathomason
April 5, 2024 1:22 PM
Edge computing digital background - stock photo

technology and innovation concept
Image Credit: MF3d

Join us in Atlanta on April 10th and explore the landscape of security workforce. We will explore the vision, benefits, and use cases of AI for security teams. Request an invite here.


SiMa.ai, a chip startup developing a software-centric edge AI solution, yesterday announced a $70 million funding round that highlights the growing interest and investment in edge AI. But it’s the participation of Dell Technologies Capital, the technology titan’s strategic investment arm, that signals a major vote of confidence in SiMa.ai’s approach and a shared vision for the future of AI at the edge.
The investment stands out as the only “hard tech” deal in Dell Technologies Capital’s portfolio in the past 12 months, according to data from Pitchbook, This underscores the potential of edge AI to drive new use cases for Dell’s products and unlock value for enterprises. SiMa.ai’s approach to simplifying AI deployment and management at the edge aligns closely with Dell’s product portfolio and go-to-market strategy, positioning the tech giant to capitalize on the growing demand for AI at the edge.



Edge computing is so back

Over the last decade, the use of edge computing focused primarily on industrial deployments, connecting machines and harvesting data from sensors, all of which are use cases with relatively low computational requirements. IoT was a major driver of edge computing in retail, heavy industry, logistics and supply chain services. However, despite significant investments in IoT projects over the past decade, many enterprises struggled to derive clear business value from these initiatives.
Today, the rise of AI is breathing new life into the edge computing market. Over the past three years, IT solution providers like Dell and HPE have been quickly transforming their edge offerings, evolving from simple gateway devices to powerful, ruggedized servers capable of handling more computationally intensive workloads like AI. According to analysts at Fortune Business Insights, Markets and Markets, the global edge computing market is expected to at least double in the next few years.

SiMa.ai’s machine learning system-on-chip (MLSoC) technology could integrate with Dell’s edge computing offerings such as the PowerEdge XR series of ruggedized servers, enabling the company to deliver generative AI use cases to the edge.
[Stock chart omitted. Credit: Koyfin]

Generative AI at the edge

“AI — particularly the rapid rise of generative AI — is fundamentally reshaping the way that humans and machines work together,” said Krishna Rangasayee, founder and CEO at SiMa.ai. “Our customers are poised to benefit from giving sight, sound and speech to their edge devices, which is exactly what our next-generation MLSoC is designed to do.”

In the last year, we have all witnessed the emergence of generative AI in chatbots and virtual assistants, but the potential applications of generative AI at the edge are enormous. According to IDC’s Future Enterprise Resiliency and Spending Survey, 38% of enterprises expect to see improved personalization of employee experiences in areas like call centers and customer interaction through the use of AI at the edge.

For example, in retail, voice-assisted shopping experiences could revolutionize customer engagement, with AI-powered systems offering personalized product recommendations, answering queries and even guiding customers through virtual try-ons. Restaurants could leverage interactive AI-driven menus and ordering systems, enhancing the dining experience while optimizing kitchen operations based on real-time demand forecasting.
Beyond consumer-facing applications, generative AI at the edge could transform industrial operations and supply chain management. Autonomous quality control systems could identify defects and anomalies in real time, learning from past data to continuously improve their accuracy. Predictive maintenance models could analyze sensor data and generate proactive alerts, minimizing downtime and optimizing resource allocation. In logistics, AI-powered demand forecasting and route optimization could streamline operations, reducing costs and improving delivery times.

Generative AI is also poised to transform industrial operations and supply chain management across multiple fronts, according to the World Economic Forum (WEF). The use of large language models (LLMs) could automatically generate maintenance instructions, standard operating procedures and other textual assets, driving process automation. Additionally, deploying LLMs could enable robots and machines to comprehend and act upon voice commands without task-specific training or frequent retraining. Autonomous quality control powered by generative AI could identify defects and anomalies in real-time, continuously learning from past data to improve accuracy, while predictive maintenance models analyzing sensor information could generate proactive alerts, minimizing downtime while optimizing resource allocation.
An analysis from McKinsey suggests the healthcare sector could also benefit greatly from generative AI at the edge. Real-time patient monitoring systems could analyze vital signs, generate early warning alerts, and provide personalized treatment recommendations. AI-assisted diagnostic tools could help healthcare professionals make more accurate and timely decisions, improving patient outcomes and reducing the burden on overworked medical staff.

The challenge of generative AI edge deployments

Deploying generative AI models at the edge is particularly challenging because it requires balancing the need for fast, real-time responses with the ability to leverage local data for personalization, as highlighted in a recent IEEE working paper. These models must quickly adapt to new information and user behaviors directly at the edge, where computational resources are more limited than in centralized cloud environments. This requires AI models that are not only efficient and responsive but also capable of learning and evolving from localized datasets to provide tailored services. The dual demands of speed and personalization, all within the resource-constrained context of edge computing, underscore the complexity of deploying generative AI in such settings.
SiMa.ai claims to overcome these hurdles with its machine learning system-on-chip (MLSoC) designed specifically for edge AI use cases. Unlike other solutions that often require combining a machine learning (ML) accelerator with a separate processor, SiMa.ai says its MLSoC integrates everything needed for edge AI into a single chip. This includes specialized processors for computer vision and machine learning, a high-performance ARM processor and efficient memory and interfaces. The result is a compact, power-efficient solution that simplifies the deployment of AI at the edge. This combination of features could make SiMa.ai’s platform attractive to infrastructure providers like Dell looking to bring powerful AI capabilities to edge devices.


The race to the AI edge is on

As enterprises increasingly seek to harness the power of AI at the edge, Dell’s strategic investment in SiMa.ai suggests that edge computing may have finally found its killer use case in AI. With SiMa.ai’s platform and Dell’s edge computing strategy aligned, the future of edge AI looks brighter than ever, promising to transform the way businesses operate and interact with their customers.
The market has already picked who it thinks the winners in AI will be, with Dell stock up 70.83% YTD, HPE up 7.08% and Cisco down 2.49%. Meanwhile, Supermicro has seen its stock soar a staggering 232% YTD, largely due to expectations of its increased data center sales. However, as Dell’s investment in SiMa.ai suggests, the edge could be the next critical lap in the race.
Of course, this is just the beginning. Over the next few years, we can expect to see a flurry of strategic investments and acquisitions as major tech companies race to stake their claim in the edge AI market. The fight to bring powerful AI capabilities to the enterprise edge could put a strain on existing partnerships, reminiscent of the virtualization era when we saw the disruptive VCE alliance between VMware, Cisco and EMC, which ultimately sparked the enormous merger between Dell and EMC.
Sima's machine learning accelerator executes instructions - does that mean they're dead?*

US2021326189A1 SYNCHRONIZATION OF PROCESSING ELEMENTS THAT EXECUTE STATICALLY SCHEDULED INSTRUCTIONS IN A MACHINE LEARNING ACCELERATOR 20200417

1712376526027.png


bridging a deterministic phase of instructions with a non-deterministic phase of instructions when those instructions are executed by a machine learning accelerator while executing a machine learning network. In the non-deterministic phase, data and instructions are transferred from off-chip memory to on-chip memory. When the transfer is complete, processing elements are synchronized and, upon synchronization, a deterministic phase of instructions is executed by the processing elements.
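The claimed scheme is easy to picture in software. Below is a minimal sketch of the two-phase idea, assuming a hypothetical accelerator with four processing elements; the names and the toy "mac" instruction are mine, not the patent's:

```python
import threading

# Sketch of the two-phase scheme in US2021326189A1, using a hypothetical
# accelerator with four processing elements (PEs). All names here are
# illustrative; they are not taken from the patent.

NUM_PES = 4
barrier = threading.Barrier(NUM_PES)   # the synchronization point between phases

off_chip = [1, 2, 3, 4]                # data waiting in off-chip memory
on_chip = [None] * NUM_PES
results = [None] * NUM_PES
schedule = [("mac", 10), ("mac", 1)]   # statically scheduled instructions

def run_pe(pe_id):
    # Non-deterministic phase: transfer data from off-chip to on-chip
    # memory; timing here depends on unpredictable memory traffic.
    on_chip[pe_id] = off_chip[pe_id]

    # Every PE waits here; once all transfers complete, the PEs are in sync.
    barrier.wait()

    # Deterministic phase: instructions run with known cycle counts,
    # so no further synchronization is needed.
    acc = 0
    for op, operand in schedule:
        if op == "mac":
            acc += on_chip[pe_id] * operand   # multiply-accumulate
    results[pe_id] = acc

threads = [threading.Thread(target=run_pe, args=(i,)) for i in range(NUM_PES)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)   # each PE computes x*10 + x*1, i.e. [11, 22, 33, 44]
```

The barrier is the whole trick: everything after it can be scheduled at compile time because every PE crosses it at a known moment.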

US11631001B2 Heterogeneous computing on a system-on-chip, including machine learning inference 20200410

1712376641334.png


A system-on-chip (SoC) integrated circuit product includes a machine learning accelerator (MLA). It also includes other processor cores, such as general purpose processors and application-specific processors. It also includes a network-on-chip for communication between the different modules. The SoC implements a heterogeneous compute environment because the processor cores are customized for different purposes and typically will use different instruction sets. Applications may use some or all of the functionalities offered by the processor cores, and the processor cores may be programmed into different pipelines to perform different tasks.


* Rhetorical question - "Yes they are!"
US2023023303A1 MACHINE LEARNING NETWORK IMPLEMENTED BY STATICALLY SCHEDULED INSTRUCTIONS 20200203

1712377403686.png



[0027] The MLA 170 includes a plurality of Tiles 180 and an on-chip memory system implemented on a semiconductor die. The Tiles are organized into one or more meshes of interconnected Tiles. A depiction of a Tile mesh is shown to the right of box 170 in FIG. 1 A. In each mesh, the Tiles 180 are organized in a regular pattern and the interconnections within each mesh provide data transfer paths between Tiles in the mesh. The Tiles execute computations according to instructions received by the Tiles and using data stored in the on-chip memory system. These instructions may be for computations and/or for data transfer. Computations include multiply (including matrix multiply), add, and operators (e.g., nonlinear functions, lookup table, min/max, pooling). These are computations that implement the MLN. In the example of FIG. 1 A, the computations performed by layers 102 A-D are allocated to groups 182 A-D of Tiles as indicated. The allocation is not required to be 1:1. For example, multiple layers could be allocated to a single Tile or vice versa. Not every computation required to implement an MLN need be executed by a Tile; some computation may be executed outside the MLA (e.g., floating point operations, if the Tiles only do integer arithmetic). Tiles typically will at least perform matrix multiplication.
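In other words, each Tile is a small interpreter working through a compile-time-fixed instruction list over on-chip buffers. A rough sketch of that idea (the instruction format and buffer names are invented for illustration, not taken from SiMa.ai's design):

```python
# Minimal sketch of a Tile interpreting a statically scheduled instruction
# list (matrix multiply, add, nonlinear operator), per the wording of [0027].
# The instruction encoding here is hypothetical.

def matmul(a, b):
    # a: m x k, b: k x n, as plain nested lists
    return [[sum(a[i][p] * b[p][j] for p in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def tile_execute(program, memory):
    # Each instruction reads named buffers from on-chip memory and writes
    # its result back; the schedule itself is fixed before execution.
    for op, dst, *srcs in program:
        if op == "matmul":
            memory[dst] = matmul(memory[srcs[0]], memory[srcs[1]])
        elif op == "add":
            a, b = memory[srcs[0]], memory[srcs[1]]
            memory[dst] = [[x + y for x, y in zip(ra, rb)]
                           for ra, rb in zip(a, b)]
        elif op == "relu":  # an example nonlinear operator
            memory[dst] = [[max(0, x) for x in row] for row in memory[srcs[0]]]
    return memory

mem = {
    "x": [[1, -2]],          # input activation
    "w": [[2, 0], [0, 2]],   # weights
    "b": [[1, 1]],           # bias
}
program = [("matmul", "y", "x", "w"), ("add", "y", "y", "b"), ("relu", "y", "y")]
print(tile_execute(program, mem)["y"])   # [[3, 0]]
```

Note the key property: the Tile does only what the instruction stream tells it to, which is exactly the dependence on instructions being discussed above.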

Dinosaurs have been buried in rock for thousands of millennia, and now these people are reinterring them in silicon.
 
Last edited:
  • Like
  • Haha
  • Fire
Reactions: 25 users

stuart888

Regular
What does everyone think of Sima.ai? There have been previous discussions about whether or not they could be a potential licensee; plus, I know that various Sima.ai posts have been "liked" by both LDN and Anil in the past.



Race to the gen AI edge heats up as Dell invests in SiMa.ai​

James Thomason@jathomason
April 5, 2024 1:22 PM
Edge computing digital background - stock photo

technology and innovation concept
Image Credit: MF3d



SiMa.ai, a chip startup developing a software-centric edge AI solution, yesterday announced a $70 million funding round that highlights the growing interest and investment in edge AI. But it’s the participation of Dell Technologies Capital, the technology titan’s strategic investment arm, that signals a major vote of confidence in SiMa.ai’s approach and a shared vision for the future of AI at the edge.
The investment stands out as the only “hard tech” deal in Dell Technologies Capital’s portfolio in the past 12 months, according to data from Pitchbook. This underscores the potential of edge AI to drive new use cases for Dell’s products and unlock value for enterprises. SiMa.ai’s approach to simplifying AI deployment and management at the edge aligns closely with Dell’s product portfolio and go-to-market strategy, positioning the tech giant to capitalize on the growing demand for AI at the edge.



Edge computing is so back

Over the last decade, the use of edge computing focused primarily on industrial deployments, connecting machines and harvesting data from sensors, all of which are use cases with relatively low computational requirements. IoT was a major driver of edge computing in retail, heavy industry, logistics and supply chain services. However, despite significant investments in IoT projects over the past decade, many enterprises struggled to derive clear business value from these initiatives.
Today, the rise of AI is breathing new life into the edge computing market. Over the past three years, IT solution providers like Dell and HPE have been quickly transforming their edge offerings, evolving from simple gateway devices to powerful, ruggedized servers capable of handling more computationally intensive workloads like AI. According to analysts at Fortune Business Insights and MarketsandMarkets, the global edge computing market is expected to at least double in the next few years.

SiMa.ai’s machine learning system-on-chip (MLSoC) technology could integrate with Dell’s edge computing offerings such as the PowerEdge XR series of ruggedized servers, enabling the company to deliver generative AI use cases to the edge.
[Stock chart omitted. Credit: Koyfin]

Generative AI at the edge

“AI — particularly the rapid rise of generative AI — is fundamentally reshaping the way that humans and machines work together,” said Krishna Rangasayee, founder and CEO at SiMa.ai. “Our customers are poised to benefit from giving sight, sound and speech to their edge devices, which is exactly what our next-generation MLSoC is designed to do.”

In the last year, we have all witnessed the emergence of generative AI in chatbots and virtual assistants, but the potential applications of generative AI at the edge are enormous. According to IDC’s Future Enterprise Resiliency and Spending Survey, 38% of enterprises expect to see improved personalization of employee experiences in areas like call centers and customer interaction through the use of AI at the edge.

For example, in retail, voice-assisted shopping experiences could revolutionize customer engagement, with AI-powered systems offering personalized product recommendations, answering queries and even guiding customers through virtual try-ons. Restaurants could leverage interactive AI-driven menus and ordering systems, enhancing the dining experience while optimizing kitchen operations based on real-time demand forecasting.
Beyond consumer-facing applications, generative AI at the edge could transform industrial operations and supply chain management. Autonomous quality control systems could identify defects and anomalies in real time, learning from past data to continuously improve their accuracy. Predictive maintenance models could analyze sensor data and generate proactive alerts, minimizing downtime and optimizing resource allocation. In logistics, AI-powered demand forecasting and route optimization could streamline operations, reducing costs and improving delivery times.

Generative AI is also poised to transform industrial operations and supply chain management across multiple fronts, according to the World Economic Forum (WEF). The use of large language models (LLMs) could automatically generate maintenance instructions, standard operating procedures and other textual assets, driving process automation. Additionally, deploying LLMs could enable robots and machines to comprehend and act upon voice commands without task-specific training or frequent retraining. Autonomous quality control powered by generative AI could identify defects and anomalies in real-time, continuously learning from past data to improve accuracy, while predictive maintenance models analyzing sensor information could generate proactive alerts, minimizing downtime while optimizing resource allocation.
An analysis from McKinsey suggests the healthcare sector could also benefit greatly from generative AI at the edge. Real-time patient monitoring systems could analyze vital signs, generate early warning alerts, and provide personalized treatment recommendations. AI-assisted diagnostic tools could help healthcare professionals make more accurate and timely decisions, improving patient outcomes and reducing the burden on overworked medical staff.

The challenge of generative AI edge deployments

Deploying generative AI models at the edge is particularly challenging because it requires balancing the need for fast, real-time responses with the ability to leverage local data for personalization, as highlighted in a recent IEEE working paper. These models must quickly adapt to new information and user behaviors directly at the edge, where computational resources are more limited than in centralized cloud environments. This requires AI models that are not only efficient and responsive but also capable of learning and evolving from localized datasets to provide tailored services. The dual demands of speed and personalization, all within the resource-constrained context of edge computing, underscore the complexity of deploying generative AI in such settings.
SiMa.ai claims to overcome these hurdles with its machine learning system-on-chip (MLSoC) designed specifically for edge AI use cases. Unlike other solutions that often require combining a machine learning (ML) accelerator with a separate processor, SiMa.ai says its MLSoC integrates everything needed for edge AI into a single chip. This includes specialized processors for computer vision and machine learning, a high-performance ARM processor and efficient memory and interfaces. The result is a compact, power-efficient solution that simplifies the deployment of AI at the edge. This combination of features could make SiMa.ai’s platform attractive to infrastructure providers like Dell looking to bring powerful AI capabilities to edge devices.


The race to the AI edge is on

As enterprises increasingly seek to harness the power of AI at the edge, Dell’s strategic investment in SiMa.ai suggests that edge computing may have finally found its killer use case in AI. With SiMa.ai’s platform and Dell’s edge computing strategy aligned, the future of edge AI looks brighter than ever, promising to transform the way businesses operate and interact with their customers.
The market has already picked who it thinks the winners in AI will be, with Dell stock up 70.83% YTD, HPE up 7.08% and Cisco down 2.49%. Meanwhile, Supermicro has seen its stock soar a staggering 232% YTD, largely due to expectations of its increased data center sales. However, as Dell’s investment in SiMa.ai suggests, the edge could be the next critical lap in the race.
Of course, this is just the beginning. Over the next few years, we can expect to see a flurry of strategic investments and acquisitions as major tech companies race to stake their claim in the edge AI market. The fight to bring powerful AI capabilities to the enterprise edge could put a strain on existing partnerships, reminiscent of the virtualization era when we saw the disruptive VCE alliance between VMware, Cisco and EMC, which ultimately sparked the enormous merger between Dell and EMC.
Totally agree Dell is a badass. Jensen made a big deal of calling Michael Dell out, then Synopsys, Cadence, and Ansys (which Synopsys is buying).

But we still need to talk about Bitcoin! It is the 15-year winner, and the 10-year winner. Nvidia won the last five years, love them. Bitcoin is international, unique. The ETF approval is freaky.
 

Attachments

  • 1712376723359.png
    1712376723359.png
    55 KB · Views: 22
  • Like
Reactions: 1 users

stuart888

Regular
Broadcom 30%, Bitcoin 30%, Intuitive 30% and Brainchip 10% could be a winner!

What a portfolio. The Tesla AI stuff might need to be in there too.
 

Diogenese

Top 20

SiMa.ai secures $70M funding to introduce a multimodal GenAI chip​

Jagmeet Singh@jagmeets13 / 11:00 PM GMT+11•April 4, 2024
Comment
SiMa.ai founder Krishna Rangasayee

Image Credits: SiMa.ai
SiMa.ai, a Silicon Valley–based startup producing embedded machine learning (ML) system-on-chip (SoC) platforms, today announced that it has raised a $70 million extension funding round as it plans to bring its second-generation chipset, specifically built for multimodal generative AI processing, to market.
According to Gartner, the market for AI-supporting chips globally is forecast to more than double by 2027 to $119.4 billion compared to 2023. However, only a few players have started producing dedicated semiconductors for AI applications. Most of the prominent contenders initially focused on supporting AI in the cloud. Nonetheless, various reports predict significant growth in the market for AI on the edge, which means the hardware processing AI computations is closer to the data-gathering source than in a centralized cloud. SiMa.ai, named after “seema,” the Hindi word for “boundary,” strives to leverage this shift by offering its edge AI SoC to organizations across the industrial manufacturing, retail, aerospace, defense, agriculture and healthcare sectors.
The San Jose–headquartered startup, which targets the market segment between 5W and 25W of energy usage, launched its first ML SoC to bring AI and ML through an integrated software-hardware combination. This includes its proprietary chipset and no-code software called Palette. The combination has already been used by over 50 companies globally, Krishna Rangasayee, the founder and CEO of SiMa.ai, told TechCrunch.

The startup touts that its current generation of the ML SoC delivered the highest FPS/W results on the MLPerf benchmark across the MLPerf Inference 4.0 closed, edge and power division categories. However, the first-generation chipset was focused on classic computer vision.
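For anyone unfamiliar with the figure of merit, FPS/W is simply measured inference throughput divided by measured power draw. A trivial sketch with hypothetical numbers (not SiMa.ai's actual MLPerf results):

```python
# How an FPS/W figure of merit is derived. The numbers below are
# hypothetical placeholders, not measured MLPerf data.
frames_per_second = 1000.0   # inference throughput measured during the run
power_watts = 10.0           # board power measured during the same run
fps_per_watt = frames_per_second / power_watts
print(fps_per_watt)   # 100.0
```

Higher is better: the same throughput at half the power doubles the score, which is why the metric favours efficient edge silicon over raw-speed parts.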
As the demand for GenAI is growing, SiMa.ai is set to introduce its second-generation ML SoC in the first quarter of 2025 with an emphasis on providing its customers with multimodal GenAI capability. The new SoC will be an “evolutionary change” over its predecessor with “a few architectural tunings” over the existing ML chipset, Rangasayee said. He added that the fundamental concepts would remain the same.
The new GenAI SoC would adapt to any framework, network, model and sensor — similar to the company’s existing ML platform — and will also be compatible with any modality, including audio, speech, text and image. It would work as a single-edge platform for all AI across computer vision, transformers and multimodal GenAI, the startup said.
“You cannot predict the future, but you can pick the vector and say, hey, that’s the vector I want to bet on. And I want to continue evolving around my vector. That’s kind of the approach that we took architecturally,” said Rangasayee. “But fundamentally, we really haven’t walked away or had to drastically change our architecture. This is also the benefit of us taking a software-centric architecture that allows more flexibility and nimbleness.”

SiMa.ai has Taiwan’s TSMC as the manufacturing partner for both its first- and second-generation AI chipsets and Arm Holdings as the provider for its compute subsystem. The second-generation chipset will be based on TSMC’s 6nm process technology and include Synopsys EV74 embedded vision processors for pre- and post-processing in computer vision applications.

The startup considers incumbents like NXP, Texas Instruments, STMicro, Renesas and Microchip Technology, and Nvidia, as well as AI chip startups like Hailo, among the competition. However, it considers Nvidia as the primary competitor — just like other AI chip startups.
Rangasayee told TechCrunch that while Nvidia is “fantastic in the cloud,” it has not built a platform for the edge. He believes that Nvidia lacks adequate power efficiency and software for edge AI. Similarly, he asserted that other startups building AI chipsets do not solve system problems and are just offering ML acceleration.
“Amongst all of our peers, Hailo has done a really good job. And it’s not us being better than them. But from our perspective, our value proposition is quite different,” he said.
The founder continued that SiMa.ai delivers higher performance and better power efficiency than Hailo. He also said SiMa.ai’s system software is quite different and effective for GenAI.

“As long as we’re solving customer problems, and we are better at doing that than anybody else, we are in a good place,” he said.
SiMa.ai’s fresh all-equity funding, led by Maverick Capital and with participation from Point72 and Jericho, extends the startup’s $30 million Series B round, initially announced in May 2022. Existing investors, including Amplify Partners, Dell Technologies Capital, Fidelity Management and Lip-Bu Tan also participated in the additional investment. With this fundraising, the five-year-old startup has raised a total of $270 million.
The company currently has 160 employees, 65 of whom are at its R&D center in Bengaluru, India. SiMa.ai plans to grow that headcount by adding new roles and extending its R&D capability. It also wants to develop a go-to-market team for Indian customers. Further, the startup plans to scale its customer-facing teams globally, starting with Korea and Japan and in Europe and the U.S.
“The computational intensity of generative AI has precipitated a paradigm shift in data center architecture. The next phase in this evolution will be widespread adoption of AI at the edge. Just as the data center has been revolutionized, the edge computing landscape is poised for a complete transformation. SiMa.ai possesses the essential trifecta of a best-in-class team, cutting-edge technology, and forward momentum, positioning it as a key player for customers traversing this tectonic shift. We’re excited to join forces with SiMa.ai to seize this once-in-a-generation opportunity,” said Andrew Homan, senior managing director at Maverick Capital, in a statement.


"But fundamentally, we really haven’t walked away or had to drastically change our architecture. This is also the benefit of us taking a software-centric architecture that allows more flexibility and nimbleness."

... and here I was thinking they'd had a Damascene conversion, having seen the light, ... instead of sticking with their Triassic MACs.

... and from the EE Times article:

"Sima.ai is a Silicon Valley edge chip company started in 2018. The company has an SoC design for computer vision at the edge, featuring a 50 TOPS AI accelerator that uses just five watts.", so doing better than Hailo isn't such a high bar.

... maybe third time lucky?


"The San Jose–headquartered startup, which targets the market segment between 5W and 25W of energy usage, launched its first ML SoC to bring AI and ML through an integrated software-hardware combination. This includes its proprietary chipset and no-code software called Palette. The combination has already been used by over 50 companies globally, Krishna Rangasayee, the founder and CEO of SiMa.ai, told TechCrunch."

This demonstrates the market BRN has foregone in having an IP-only approach.

Now, of course, having had 50+ companies "use" their chip may not be the same as selling commercial quantities.

... and another thing, I don't know who coughed up the $70M, but it shows how difficult it is to do proper DD in this field - what's a neuromorphic computer anyway?
 
Last edited:
  • Like
  • Fire
Reactions: 28 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Sima's machine learning accelerator executes instructions - does that mean they're dead?*

US2021326189A1 SYNCHRONIZATION OF PROCESSING ELEMENTS THAT EXECUTE STATICALLY SCHEDULED INSTRUCTIONS IN A MACHINE LEARNING ACCELERATOR 20200417

View attachment 60413

bridging a deterministic phase of instructions with a non-deterministic phase of instructions when those instructions are executed by a machine learning accelerator while executing a machine learning network. In the non-deterministic phase, data and instructions are transferred from off-chip memory to on-chip memory. When the transfer is complete, processing elements are synchronized and, upon synchronization, a deterministic phase of instructions is executed by the processing elements.

US11631001B2 Heterogeneous computing on a system-on-chip, including machine learning inference 20200410

View attachment 60414

A system-on-chip (SoC) integrated circuit product includes a machine learning accelerator (MLA). It also includes other processor cores, such as general purpose processors and application-specific processors. It also includes a network-on-chip for communication between the different modules. The SoC implements a heterogeneous compute environment because the processor cores are customized for different purposes and typically will use different instruction sets. Applications may use some or all of the functionalities offered by the processor cores, and the processor cores may be programmed into different pipelines to perform different tasks.


* Rhetorical question - "Yes they are!"
US2023023303A1 MACHINE LEARNING NETWORK IMPLEMENTED BY STATICALLY SCHEDULED INSTRUCTIONS 20200203

View attachment 60417


[0027] The MLA 170 includes a plurality of Tiles 180 and an on-chip memory system implemented on a semiconductor die. The Tiles are organized into one or more meshes of interconnected Tiles. A depiction of a Tile mesh is shown to the right of box 170 in FIG. 1 A. In each mesh, the Tiles 180 are organized in a regular pattern and the interconnections within each mesh provide data transfer paths between Tiles in the mesh. The Tiles execute computations according to instructions received by the Tiles and using data stored in the on-chip memory system. These instructions may be for computations and/or for data transfer. Computations include multiply (including matrix multiply), add, and operators (e.g., nonlinear functions, lookup table, min/max, pooling). These are computations that implement the MLN. In the example of FIG. 1 A, the computations performed by layers 102 A-D are allocated to groups 182 A-D of Tiles as indicated. The allocation is not required to be 1:1. For example, multiple layers could be allocated to a single Tile or vice versa. Not every computation required to implement an MLN need be executed by a Tile; some computation may be executed outside the MLA (e.g., floating point operations, if the Tiles only do integer arithmetic). Tiles typically will at least perform matrix multiplication.

Dinosaurs have been buried in rock for thousands of millennia, and now these people are reinterring them in silicon.

Hi @Diogenese, further details about the second generation Sima.ai MLSoC. It says overall application development is powered by an Arm Holdings plc processor subsystem.

Am I right in thinking that neither this information nor the information in the patent rules out the potential incorporation of BrainChip's technology?

Screenshot 2024-04-06 at 3.27.19 pm.png

Screenshot 2024-04-06 at 3.27.36 pm.png

Screenshot 2024-04-06 at 3.28.32 pm.png

 
  • Like
  • Wow
Reactions: 5 users

Diogenese

Top 20
Hi @Diogenese, further details about the second generation Sima.ai MLSoC. It says overall application development is powered by an Arm Holdings plc processor subsystem.

Am I right in thinking that neither this information nor the information in the patent rules out the potential incorporation of BrainChip's technology?

View attachment 60418
View attachment 60419
View attachment 60420
Hi Bravo,

To start with, Sima are not bound to use the arrangement claimed in their patents, but there's a better than even chance that they do.

This patent suggests that they are using MACs (multiply accumulator circuits), which are basically antiSNN. They operate on numbers, not spikes.

US2023023303A1 MACHINE LEARNING NETWORK IMPLEMENTED BY STATICALLY SCHEDULED INSTRUCTIONS 20200203

. The Tiles execute computations according to instructions received by the Tiles and using data stored in the on-chip memory system. These instructions may be for computations and/or for data transfer. Computations include multiply (including matrix multiply), add, and operators (e.g., nonlinear functions, lookup table, min/max, pooling). These are computations that implement the MLN. In the example of FIG. 1 A, the computations performed by layers 102 A-D are allocated to groups 182 A-D of Tiles as indicated. The allocation is not required to be 1:1. For example, multiple layers could be allocated to a single Tile or vice versa. Not every computation required to implement an MLN need be executed by a Tile; some computation may be executed outside the MLA (e.g., floating point operations, if the Tiles only do integer arithmetic). Tiles typically will at least perform matrix multiplication.


Executing instructions requires the involvement of the processor. It also adds to latency and power consumption. Akida 1000 does not utilize the processor. Indeed, it does not execute instructions. It processes input signals automatically without interaction from the processor.

So their patents preclude the use of SNNs or spikes, and their "tiles" require instructions from the processor.

That said, incorporating Akida in their system would be a great improvement, but it would not be a simple plug-and-play substitution.
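To make the MAC-versus-spike distinction concrete, here is a minimal sketch in Python. It is purely illustrative — the function names, shapes and the binary-spike assumption are mine, not SiMa.ai's actual tile design or Akida's implementation. A dense multiply-accumulate layer does work proportional to the input size every pass, while an event-driven layer only does work when a spike arrives:

```python
import numpy as np

# Dense MAC-style layer: one multiply-accumulate per weight, every pass,
# regardless of how many inputs are zero (like a "tile" doing matrix multiply).
def mac_layer(weights, inputs):
    return weights @ inputs

# Event-driven sketch: only nonzero "spike" events trigger accumulation.
# For binary spikes there is no multiply at all - the weight column is added.
def event_driven_layer(weights, spike_indices):
    acc = np.zeros(weights.shape[0])
    for i in spike_indices:      # one accumulation per event
        acc += weights[:, i]
    return acc

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8))

events = [1, 5]                  # sparse input: only two spikes out of 8 inputs
x = np.zeros(8)
for i in events:
    x[i] = 1.0

# Both paths give the same result for binary inputs, but the event-driven
# path scales with the number of spikes, not the input dimension.
assert np.allclose(mac_layer(W, x), event_driven_layer(W, events))
```

The point of the contrast: in the MAC path the cost is fixed by the layer geometry, while in the event-driven path sparse activity directly translates into fewer operations — which is the usual argument for SNN power efficiency at the edge.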
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 26 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Hi Bravo,

To start with, Sima are not bound to use the arrangement claimed in their patents, but there's a better than even chance that they do.

This patent suggests that they are using MACs (multiply accumulator circuits), which are basically antiSNN. They operate on numbers, not spikes.

US2023023303A1 MACHINE LEARNING NETWORK IMPLEMENTED BY STATICALLY SCHEDULED INSTRUCTIONS 20200203

. The Tiles execute computations according to instructions received by the Tiles and using data stored in the on-chip memory system. These instructions may be for computations and/or for data transfer. Computations include multiply (including matrix multiply), add, and operators (e.g., nonlinear functions, lookup table, min/max, pooling). These are computations that implement the MLN. In the example of FIG. 1 A, the computations performed by layers 102 A-D are allocated to groups 182 A-D of Tiles as indicated. The allocation is not required to be 1:1. For example, multiple layers could be allocated to a single Tile or vice versa. Not every computation required to implement an MLN need be executed by a Tile; some computation may be executed outside the MLA (e.g., floating point operations, if the Tiles only do integer arithmetic). Tiles typically will at least perform matrix multiplication.


Executing instructions requires the involvement of the processor. Akida 1000 does not utilize the processor. Indeed, it does not execute instructions. It processes input signals automatically without interaction from the processor.

So their patents preclude the use of SNNs or spikes, and their "tiles" require instructions from the processor.

That said, incorporating Akida in their system would be a great improvement, but it would not be a simple plug-and-play substitution.
1pnG.gif
 
  • Haha
  • Like
  • Love
Reactions: 11 users

SiMa.ai secures $70M funding to introduce a multimodal GenAI chip​

Jagmeet Singh (@jagmeets13) / 11:00 PM GMT+11 • April 4, 2024
SiMa.ai founder Krishna Rangasayee

Image Credits: SiMa.ai
SiMa.ai, a Silicon Valley–based startup producing embedded machine learning (ML) system-on-chip (SoC) platforms, today announced that it has raised a $70 million extension funding round as it plans to bring its second-generation chipset, specifically built for multimodal generative AI processing, to market.
According to Gartner, the global market for AI-supporting chips is forecast to more than double by 2027, to $119.4 billion, compared with 2023. However, only a few players have started producing dedicated semiconductors for AI applications. Most of the prominent contenders initially focused on supporting AI in the cloud. Nonetheless, various reports predict significant growth in the market for AI on the edge, where the hardware processing AI computations sits closer to the data-gathering source than in a centralized cloud. SiMa.ai, named after “seema,” the Hindi word for “boundary,” aims to leverage this shift by offering its edge AI SoC to organizations across the industrial manufacturing, retail, aerospace, defense, agriculture and healthcare sectors.
The San Jose–headquartered startup, which targets the market segment between 5W and 25W of energy usage, launched its first ML SoC to bring AI and ML through an integrated software-hardware combination. This includes its proprietary chipset and no-code software called Palette. The combination has already been used by over 50 companies globally, Krishna Rangasayee, the founder and CEO of SiMa.ai, told TechCrunch.

The startup touts that its current generation of the ML SoC delivered the highest FPS/W results on the MLPerf benchmark across the MLPerf Inference 4.0 closed, edge and power division categories. However, the first-generation chipset was focused on classic computer vision.
As the demand for GenAI is growing, SiMa.ai is set to introduce its second-generation ML SoC in the first quarter of 2025 with an emphasis on providing its customers with multimodal GenAI capability. The new SoC will be an “evolutionary change” over its predecessor with “a few architectural tunings” over the existing ML chipset, Rangasayee said. He added that the fundamental concepts would remain the same.
The new GenAI SoC would adapt to any framework, network, model and sensor — similar to the company’s existing ML platform — and will also be compatible with any modality, including audio, speech, text and image. It would work as a single-edge platform for all AI across computer vision, transformers and multimodal GenAI, the startup said.
“You cannot predict the future, but you can pick the vector and say, hey, that’s the vector I want to bet on. And I want to continue evolving around my vector. That’s kind of the approach that we took architecturally,” said Rangasayee. “But fundamentally, we really haven’t walked away or had to drastically change our architecture. This is also the benefit of us taking a software-centric architecture that allows more flexibility and nimbleness.”

SiMa.ai has Taiwan’s TSMC as the manufacturing partner for both its first- and second-generation AI chipsets and Arm Holdings as the provider for its compute subsystem. The second-generation chipset will be based on TSMC’s 6nm process technology and include Synopsys EV74 embedded vision processors for pre- and post-processing in computer vision applications.

The startup considers incumbents like NXP, Texas Instruments, STMicro, Renesas, Microchip Technology and Nvidia, as well as AI chip startups like Hailo, among the competition. However, it considers Nvidia the primary competitor — just like other AI chip startups.
Rangasayee told TechCrunch that while Nvidia is “fantastic in the cloud,” it has not built a platform for the edge. He believes that Nvidia lacks adequate power efficiency and software for edge AI. Similarly, he asserted that other startups building AI chipsets do not solve system problems and are just offering ML acceleration.
“Amongst all of our peers, Hailo has done a really good job. And it’s not us being better than them. But from our perspective, our value proposition is quite different,” he said.
The founder continued that SiMa.ai delivers higher performance and better power efficiency than Hailo. He also said SiMa.ai’s system software is quite different and effective for GenAI.

“As long as we’re solving customer problems, and we are better at doing that than anybody else, we are in a good place,” he said.
SiMa.ai’s fresh all-equity funding, led by Maverick Capital and with participation from Point72 and Jericho, extends the startup’s $30 million Series B round, initially announced in May 2022. Existing investors, including Amplify Partners, Dell Technologies Capital, Fidelity Management and Lip-Bu Tan also participated in the additional investment. With this fundraising, the five-year-old startup has raised a total of $270 million.
The company currently has 160 employees, 65 of whom are at its R&D center in Bengaluru, India. SiMa.ai plans to grow that headcount by adding new roles and extending its R&D capability. It also wants to develop a go-to-market team for Indian customers. Further, the startup plans to scale its customer-facing teams globally, starting with Korea and Japan and in Europe and the U.S.
“The computational intensity of generative AI has precipitated a paradigm shift in data center architecture. The next phase in this evolution will be widespread adoption of AI at the edge. Just as the data center has been revolutionized, the edge computing landscape is poised for a complete transformation. SiMa.ai possesses the essential trifecta of a best-in-class team, cutting-edge technology, and forward momentum, positioning it as a key player for customers traversing this tectonic shift. We’re excited to join forces with SiMa.ai to seize this once-in-a-generation opportunity,” said Andrew Homan, senior managing director at Maverick Capital, in a statement.
"The San Jose–headquartered startup, which targets the market segment between 5W and 25W of energy usage"

Doesn't sound like "Edge" to me..

"The new GenAI SoC (quarter 1 2025) would adapt to any framework, network, model and sensor — similar to the company’s existing ML platform — and will also be compatible with any modality, including audio, speech, text and image. It would work as a single-edge platform for all AI across computer vision, transformers and multimodal GenAI, the startup said"

Can't we already do all that and more?..

They consider Nvidia their biggest competitor (despite its having no real Edge offerings) and seem to think Hailo is the closest among their peers.

The complete absence of any mention of BrainChip or AKIDA is very telling, in my opinion.


 
  • Like
Reactions: 10 users