BRN Discussion Ongoing

Much appreciated @Sirod69. Edge Impulse has a very interesting blog that I try to keep up with.

https://www.edgeimpulse.com/blog

They are the perfect partner to help sell, teach, and increase adoption of Brainchip solutions. The unpaid value they bring is immense. 🧑‍🎓👩‍🎓👨‍🎓

If your cat needs a friend, there is a new one available soon!



[attachment: robot dog companion]

That's for the robot dog companion part of the post.
You'll never get this type of joy from a bot.

Getupthere

Regular

It’s all about DATA….DATA….DATA!


HPE acquires Pachyderm as it looks to bolster its AI dev offerings

Hewlett Packard Enterprise, the company better known as HPE, announced today that it acquired Pachyderm, a startup developing a data science platform for "explainable, repeatable" AI. The terms of the deal, including the purchase price, weren't disclosed. But HPE said that it plans to integrate Pachyderm's capabilities into a platform that'll deliver a pipeline for automatically preparing, tracking and managing machine learning processes.

Pachyderm’s software will remain available to current and new customers -- for now, at least. HPE says that the transaction isn't subject to any regulatory approvals and will likely close this month.

Co-founded in 2014 by Joey Zwicker and Joe Doliner, a former Airbnb software engineer, Pachyderm delivers tools for versioning (i.e., creating and managing) "enterprise-scale" machine learning and AI projects. Using Pachyderm's cloud-based and on-premises products, users can automate some aspects of AI system development through data transformations, data workflows and connectors.

Pachyderm also offered versioning features for machine learning datasets and a "Git-like" structure to facilitate collaboration among data scientists, as well as the ability to generate an immutable record for all activities and assets on the platform. It also hosted Pachyderm Hub, a fully managed service with an on-demand compute cluster for AI development.
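
To make the "Git-like" idea concrete, here is a minimal sketch (in Python) of content-addressed dataset versioning with immutable commits. This is an illustrative toy under my own assumptions, not Pachyderm's actual API; the `DatasetRepo` class and its methods are hypothetical.

```python
# Hypothetical toy, not Pachyderm's API: a "Git-like" dataset store where
# every commit is an immutable, content-addressed snapshot, giving an
# auditable record of exactly which data each model run saw.
import hashlib
import json


class DatasetRepo:
    def __init__(self):
        self.commits = {}  # commit_id -> {"parent": ..., "files": {...}}
        self.head = None

    def commit(self, files: dict) -> str:
        """Record an immutable snapshot mapping file name -> content hash."""
        snapshot = {name: hashlib.sha256(data).hexdigest()
                    for name, data in files.items()}
        payload = json.dumps({"parent": self.head, "files": snapshot},
                             sort_keys=True).encode()
        commit_id = hashlib.sha256(payload).hexdigest()[:12]
        self.commits[commit_id] = {"parent": self.head, "files": snapshot}
        self.head = commit_id
        return commit_id

    def log(self):
        """Walk history from HEAD back to the first commit, like `git log`."""
        node = self.head
        while node is not None:
            yield node
            node = self.commits[node]["parent"]


repo = DatasetRepo()
repo.commit({"train.csv": b"a,b\n1,2\n"})
repo.commit({"train.csv": b"a,b\n1,2\n3,4\n"})
print(list(repo.log()))  # newest commit first: full lineage of the dataset
```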

Prior to the HPE acquisition, Pachyderm managed to attract $28.1 million in venture capital from backers including Benchmark, Microsoft's M12, Y Combinator and HPE's own Hewlett Packard Pathfinder. (Pathfinder invested in February 2022.) Among its customers were Shell, LogMeIn, Battelle Ecology and AgBiome.

HPE sees Pachyderm bolstering its flagship AI development product, the HPE Machine Learning Development Environment, which provides software to build and train machine learning models for applications like computer vision, natural language processing and data analytics. In a press release, HPE lays out what it sees as the major benefits Pachyderm brings to the table, including incremental data processing, visibility on the origin of data and the ability to track different versions of data to understand when it was created or changed.

"As AI projects become larger and increasingly involve complex data sets, data scientists will need reproducible AI solutions to efficiently maximize their machine learning initiatives, optimize their infrastructure cost and ensure data is reliable and safe no matter where they are in their AI journey," HPE EVP of high-performance compute (HPC) and AI Justin Hotard said in a statement. "Pachyderm’s unique reproducible AI software augments HPE’s existing AI-at-scale offerings to automate and accelerate AI and unlock greater opportunities in image, video and text analysis, generative AI and other emerging large language model needs to realize transformative outcomes."

Pachyderm is HPE's second AI-related acquisition, after Determined AI in June 2021. Determined AI, similarly, was focused on creating a platform for building and retraining machine learning models.

HPE sees AI and HPC as a potential major profit driver, but the company has struggled to maintain momentum in the increasingly competitive market. In its Q4 2022 earnings report, HPE said HPC and AI revenue dipped 14% year over year to $862 million, bringing the segment's operating profit margin down to 3.5% from 14.2% in the prior-year period.



stuart888

Regular
[attachments: tweet screenshots]


Thanks @thelittleshort!

What a professional Twitter comment from Prophesee's Sales Director. I believe he is saying: "Brainchip's Akida SNN solution can accomplish a similar task without....."!

At least, that's the way I read it.


TopCat

Regular
[attachments: LinkedIn post screenshots]

Interesting that I can see SynSense has liked this as well, yet they have not reposted it on their LinkedIn page like they did for their other product, Speck. If they were involved in Aeveon, you'd think they would celebrate it as well. Time will tell!


stuart888

Regular
Wow, the Brainchip website is fantastic, and the Markets page is loaded with clues.

Logic says they would only boast about the use-case implementations that they can walk into a meeting and prove. The focus seems to be high-confidence target markets, and it starts with the photo at the top: it looks like John Deere! 🚜

The apples 🍎 on the conveyor belt were a new photo to me.

The photos tell a story, and so does the text.

https://brainchip.com/markets/

You can click through: Auto, Home, Industrial. Well done, Brainchip staff! 🥁🥁


Sirod69

bavarian girl ;-)




SiFive

Did you miss us at the 2022 #RISCVSummit? We’ve got you covered! Learn more about our collaboration with @Intel and the new Horse Creek development board from SiFive’s Jack Kang:



stuart888

Regular
Everyone needs an AI friend! First time I ever heard that.

They already have a use case for ChatGPT: the cure for loneliness. Mental health is important!

Replika is on Bloomberg TV now, and they sell it very well on TV.

Also for "the romantic side of things"! Flirting and role play! Is it OK to have an AI girlfriend?

Who needs a dog or a cat!

I am just repeating what they are saying on TV. So funny, and likely a big winner.

https://gpt3demo.com/apps/replika-ai


Mccabe84

Regular
Not too sure if this has been posted, but Edge Impulse put this tweet out about an hour ago.


chapman89

Founding Member
A blog on the Brainchip website. Take note of the language used.


“Conventional AI silicon and cloud-centric inference models do not perform efficiently at the automotive edge. As many semiconductor companies have already realized, latency and power are two primary issues that must be effectively addressed before the automotive industry can manufacture a new generation of smarter and safer cars. To meet consumer expectations, these vehicles need to feature highly personalized and responsive in-cabin systems while supporting advanced assisted driving capabilities.

That’s why automotive companies are untethering edge AI functions from the cloud – and performing distributed inference computation on local neuromorphic silicon using BrainChip’s AKIDA. This Essential AI model sequentially leverages multiple AKIDA-powered smart sensors and AKIDA AI SoC accelerators to efficiently capture and analyze inference data within designated regions of interest or ROI (samples within a data set that provide value for inference).

brainchip-diagram1-300x118.png


In-Cabin Experience
According to McKinsey analysts, the in-cabin experience is poised to become one of the most important differentiators for new car buyers. With AKIDA, automotive companies are designing lighter, faster, and more energy efficient in-cabin systems that can act independent of the cloud. These include advanced facial detection and customization systems that automatically adjust seats, mirrors, infotainment settings, and interior temperatures to match driver preferences.

AKIDA also enables sophisticated voice control technology that instantly responds to commands – as well as gaze estimation and emotion classification systems that proactively prompt drivers to focus on the road. Indeed, the Mercedes-Benz Vision EQXX features AKIDA-powered neuromorphic AI voice control technology which is five to ten times more efficient than conventional systems.

Neuromorphic silicon – which processes data with efficiency, precision, and economy of energy – is playing a major role in transforming vehicles into transportation capsules with personalized features and applications to accommodate both work and entertainment. This evolution is driven by smart sensors capturing yottabytes of data from the automotive edge to create holistic and immersive in-cabin experiences.

Assisted Driving – Computer Vision
In addition to redefining the in-cabin experience, AKIDA allows advanced driver assistance systems (ADAS) such as computer vision to detect vehicles, pedestrians, bicyclists, signs, and objects with incredibly high levels of precision.

Specifically, pairing two-stage object detection inference algorithms with local AKIDA silicon enables computer vision systems to efficiently perform processing in two primary stages – at the sensor (inference) and AI accelerator (classification). Using this paradigm, computer vision systems rapidly and efficiently analyze vast amounts of inference data within specific ROIs.
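
As a rough illustration of the two-stage split the quoted blog describes (lightweight ROI proposal at the sensor, heavier classification at the accelerator), here is a hedged Python sketch. The tile heuristic and both stand-in models are my own inventions, not BrainChip's algorithms.

```python
# Conceptual sketch of a two-stage ROI pipeline -- not BrainChip's code.
# Stage 1 (sensor side) proposes regions of interest cheaply; stage 2
# (accelerator side) classifies only those crops, so the heavy model
# never has to process the full frame.
import numpy as np


def propose_rois(frame: np.ndarray, threshold: float = 0.6):
    """Stage 1 stand-in: flag bright 32x32 tiles as (x, y, w, h) boxes.

    A real sensor-side network would run a lightweight detector here.
    """
    rois = []
    for y in range(0, frame.shape[0] - 31, 32):
        for x in range(0, frame.shape[1] - 31, 32):
            if frame[y:y + 32, x:x + 32].mean() > threshold:
                rois.append((x, y, 32, 32))
    return rois


def classify(crop: np.ndarray) -> str:
    """Stage 2 stand-in for an accelerator-side classification model."""
    return "object" if crop.std() > 0.25 else "background"


frame = np.random.rand(128, 128)        # fake single-channel camera frame
for x, y, w, h in propose_rois(frame):  # only ROIs ever reach stage 2
    print(f"ROI at ({x},{y}):", classify(frame[y:y + h, x:x + w]))
```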

Intelligently refining inference data eliminates the need for compute heavy hardware such as general-purpose CPUs and GPUs that draw considerable amounts of power and increase the size and weight of computer vision systems. It also allows ADAS to generate incredibly detailed real-time 3D maps that help drivers safely navigate busy roads and highways.

Assisted Driving – LiDAR
Automotive companies are also leveraging a sequential computation model with AKIDA-powered smart sensors and AI accelerators to enable the design of new LiDAR systems. This is because most LiDAR typically rely on general-purpose GPUs – and run cloud-centric, compute-heavy inference models that demand a large carbon footprint to process enormous amounts of data.

Indeed, LiDAR sensors typically fire 8 to 108 laser beams in a series of cyclical pulses, each emitting billions of photons per second. These beams bounce off objects and are analyzed to identify and classify vehicles, pedestrians, animals, and street signs. With AKIDA, LiDAR processes millions of data points simultaneously, using only minimal amounts of compute power to accurately detect – and classify – moving and stationary objects with equal levels of precision.
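
And the same ROI idea applied to a LiDAR point cloud, as a minimal sketch under my own assumptions (the box-shaped ROI and the NumPy filtering are illustrative, not how AKIDA actually processes LiDAR): discard points outside the region of interest before any classification runs, so downstream compute scales with the ROI rather than the full cloud.

```python
# Illustrative only: crop a LiDAR point cloud to a region of interest so
# that only the points that matter are passed on for classification.
import numpy as np


def filter_roi(points: np.ndarray, x_range, y_range) -> np.ndarray:
    """Keep points whose (x, y) coordinates fall inside the ROI box."""
    keep = ((points[:, 0] >= x_range[0]) & (points[:, 0] <= x_range[1]) &
            (points[:, 1] >= y_range[0]) & (points[:, 1] <= y_range[1]))
    return points[keep]


cloud = np.random.uniform(-50, 50, size=(1_000_000, 3))  # x, y, z in metres
roi = filter_roi(cloud, x_range=(0, 30), y_range=(-10, 10))
print(f"{len(roi):,} of {len(cloud):,} points kept for classification")
```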

Limiting inference to a ROI helps automotive manufacturers eliminate compute and energy heavy hardware such as general-purpose CPUs and GPUs in LiDAR systems – and accelerate the rollout of advanced assisted driving capabilities.

Essential AI
Scalable AKIDA-powered smart sensors bring common sense to the processing of automotive data – freeing ADAS and in-cabin systems to do more with less by allowing them to infer the big picture from the basics. With AKIDA, automotive manufacturers are designing sophisticated edge AI systems that deliver immersive end-user experiences, support the ever-increasing data and compute requirements of assisted driving capabilities, and enable the self-driving cars and trucks of the future.

To learn more about how BrainChip brings Essential AI to the next generation of smarter cars, download our new white paper here.”


TopCat

Regular
A blog on the Brainchip website. Take note of the language used.
…
This is great stuff! To me it says that there are multiple car makers designing systems with Akida right now.

Quatrojos

Regular
Posted 10 minutes ago by Edge Impulse

Could the production of these products be related to the cap call?

chapman89

Founding Member
Could the production of these products be related to the cap call?
I don’t think so, no! Far bigger projects to spend money on 😉