BRN Discussion Ongoing

Tothemoon24

Top 20
Reactions: 6 users

buena suerte :-)

BOB Bank of Brainchip
Is it this one, Esq.111?



Learning 🏖 🪩🎛🎧🎼

Love that 🎶🎵🎶... I listen to this music and Ibiza chill-out often through Spotify 🕺 :cool:
 
Reactions: 5 users

TechGirl

Founding Member
I gave you a laughing emoji, and then I thought you might be a serious investor as well, so I changed it to a like. 😎🤓🥸😇

Reactions: 6 users

Evermont

Stealth Mode
News keeps coming. Nice.

Edge Impulse and BrainChip Partner to Further AI Development with Support for the Akida Platform


News provided by Edge Impulse, Jan 3, 2023, 08:00 ET

SAN JOSE, Calif., Jan. 3, 2023 /PRNewswire/ -- Edge Impulse announced today official support for BrainChip's neural processor AI IP, making BrainChip the first strategic IP partner on the Edge Impulse platform. This integration will enable users to leverage the power of Edge Impulse's machine learning platform, combined with the high-performance neural processing capabilities of BrainChip's Akida™ to develop and deploy powerful edge-based solutions.

Among the products supported by Edge Impulse is BrainChip's AKD1000 SoC, the first available device with Akida IP for developers. This comes with a BrainChip Akida PCIe reference board, which can be plugged into a developer's existing system to unlock capabilities for a wide array of edge AI use cases, including Automotive, Consumer, Home, and Industrial applications.

The Akida IP platform provides low-power, high-performance edge AI acceleration, designed to enable real-time machine learning inferencing on-device. Based on neuromorphic principles that mimic the brain, Akida supports today's models and workloads while future-proofing for emerging trends in efficient AI. Now devices with the Akida IP supported by Edge Impulse can enable users to sample raw data, build models, and deploy trained embedded machine learning models directly from Edge Impulse Studio to create the next generation of low-power, high-performance ML applications.
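For readers wondering what "deploy trained embedded machine learning models directly to the device" looks like in practice, here is a minimal sketch of the device-side flow using BrainChip's MetaTF Python runtime. The release itself includes no code; the package, file name and calls below (akida, "model.fbz", devices(), map(), forward()) are my assumptions about the MetaTF API, shown for illustration only.

```python
# Hedged sketch only: assumes BrainChip's MetaTF "akida" Python package and a
# model already exported for Akida (e.g. from an Edge Impulse deployment).
# The file name and exact API calls are illustrative, not taken from the release.
import numpy as np
import akida

# Load a pre-converted Akida model ("model.fbz" is a placeholder path).
model = akida.Model("model.fbz")

# Map the model onto the first attached Akida device, e.g. the AKD1000 PCIe
# reference board; without hardware, MetaTF can fall back to software simulation.
devices = akida.devices()
if devices:
    model.map(devices[0])

# Run inference on a dummy uint8 image batch (N, H, W, C).
dummy_input = np.zeros((1, 224, 224, 3), dtype=np.uint8)
outputs = model.forward(dummy_input)
print(outputs.shape)
```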

"This integration will provide users with a powerful and easy-to-use solution for building and deploying machine learning models on the edge," said Zach Shelby, co-founder and CEO of Edge Impulse. "We look forward to seeing what our users will create with BrainChip's AI offering."

"BrainChip's goal is to push the limits of on-chip AI compute to extremely energy-constrained sensor devices, the kind of performance that is only available in much higher power systems." said Sean Hehir, BrainChip's CEO. "Having our Akida IP supported and implemented into the Edge Impulse platform helps ensure that developers are able to deploy ML solutions quickly and easily to create a much more capable, innovative, and truly intelligent edge."

Edge Impulse and BrainChip have an established relationship, previously announcing cross-platform support, including the ability to deploy Edge Impulse projects on the BrainChip MetaTF platform. Some of the features the user community can leverage include:
  • BrainChip's transfer learning block on Edge Impulse design studio
  • Quantization Aware Training (QAT) (see the sketch after this list)
  • The introduction of FOMO for BrainChip's Akida
  • Generation of BrainChip's compatible Edge Learning Models
  • No-code binary generation for quick AKD1000 deployment
  • Performance metrics for model profiling
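For context on the QAT item above: on the training side this typically means quantizing a Keras network to low bit-widths before converting it into an Akida model, steps that Edge Impulse Studio wraps for you. A rough sketch of that flow with BrainChip's cnn2snn tooling is below; the quantize/convert signatures and the bit-width values are my assumptions about the MetaTF API, not something stated in the announcement.

```python
# Hedged sketch only: assumes BrainChip's MetaTF "cnn2snn" package; the
# function signatures and chosen bit-widths are illustrative assumptions.
import tensorflow as tf
from cnn2snn import quantize, convert

# Any small Keras network stands in for an Edge Impulse-designed model.
keras_model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, activation="relu", input_shape=(32, 32, 3)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10),
])

# Quantization-aware step: constrain weights and activations to 4 bits so the
# network suits Akida's integer, event-based execution (values assumed).
quantized_model = quantize(keras_model,
                           weight_quantization=4,
                           activ_quantization=4)

# Convert the quantized Keras model into an Akida model that can be mapped
# onto AKD1000 hardware or run in the software simulator.
akida_model = convert(quantized_model)
akida_model.summary()
```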
The ongoing combination of BrainChip's Akida technology and Edge Impulse's platform, tools, and services will allow customers to achieve their ML objectives with fast and efficient development cycles to get to market quicker and achieve a competitive advantage.
Visit the documentation page to learn more about how the collaboration between Edge Impulse and BrainChip can benefit edge ML projects.

About Edge Impulse
Edge Impulse is the leading machine learning platform, enabling all enterprises to build smarter edge products. Their technology empowers developers to bring more ML products to market faster, and helps enterprise teams rapidly develop industry-specific solutions in weeks instead of years. The Edge Impulse platform provides powerful automation and low-code capabilities to make it easier to build valuable datasets and develop advanced ML with streaming data. With over 58,000 developers, and partnerships with the top silicon vendors, Edge Impulse offers a seamless integration experience to validate and deploy with confidence across the largest hardware ecosystem. To learn more, visit edgeimpulse.com.

About BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY)
BrainChip is the worldwide leader in edge AI on-chip processing and learning. The company's first-to-market neuromorphic processor, Akida™, mimics the human brain to analyze only essential sensor inputs at the point of acquisition, processing data with unparalleled efficiency, precision, and economy of energy. Keeping machine learning local to the chip, independent of the cloud, also dramatically reduces latency while improving privacy and data security. In enabling effective edge compute to be universally deployable across real world applications such as connected cars, consumer electronics, and industrial IoT, BrainChip is proving that on-chip AI, close to the sensor, is the future, for its customers' products, as well as the planet. Explore the benefits of Essential AI at brainchip.com.

Follow BrainChip on Twitter: twitter.com/BrainChip_inc

Follow BrainChip on LinkedIn: linkedin.com/company/7792006

SOURCE Edge Impulse



 
Reactions: 111 users

Evermont

Stealth Mode
Also covered as a PR for BrainChip.

Just brilliant, well done team. 🔥

"Edge Impulse announced todayofficial support for BrainChip's neural processor AI IP, making BrainChip the first strategic IP partner on the Edge Impulse platform."

"This integration will provide users with a powerful and easy-to-use solution for building and deploying machine learning models on the edge," said Zach Shelby, co-founder and CEO of Edge Impulse. "We look forward to seeing what our users will create with BrainChip's AI offering."

"BrainChip's goal is to push the limits of on-chip AI compute to extremely energy-constrained sensor devices, the kind of performance that is only available in much higher power systems." said Sean Hehir, BrainChip's CEO. "Having our Akida IP supported and implemented into the Edge Impulse platform helps ensure that developers are able to deploy ML solutions quickly and easily to create a much more capable, innovative, and truly intelligent edge."

 
Reactions: 71 users
This is just more amazing news. I honestly feel sorry for those who think nothing has been happening with BRN due to the ASX radio silence.
Errr.... Nah, I don't 🤣
Was going to let a few of the people I got to invest know about this.
But it's much less frustrating if I just 🤯💪 give myself a bunch of uppercuts.
 
Reactions: 33 users

goodvibes

Regular
Tobii Driver Monitoring System… looks like NVISO/BrainChip power.

Test our driver monitoring system #DMS and #XR solutions at the world’s most influential tech event #CES2023. Sign up to meet our experts🦾https://lnkd.in/dAmG3TDA

 
Reactions: 29 users
Interesting, TseX release the hounds.🕵️‍♂️ 😂
Thanks, will have a dig around and see if I can find something related to this.
 
Reactions: 5 users

goodvibes

Regular
Apart from the obvious collaboration between Tobii and NVISO 😂 LoL, thank goodness the ASX is closed.
Tobii is already a partner/client of NVISO.


Scroll down to the end - the partners/clients are listed there.
 
Reactions: 23 users
Yeah I know that, just looking to see if I can find any data directly mentioning Akida stats for this system. Got a spare 50k burning a hole in my pocket.
I added the collaboration in an edit of the post just in case some here didn't know.
 
Reactions: 8 users

goodvibes

Regular
…and CEVA could be another hidden customer… a partner of NVISO, using the same language, like "ubiquitous".


Voice Control 🗣 is one step closer to becoming ubiquitous in a wide range of applications, from remote controls to major appliances and public kiosks, thanks to ultra-low-power VUI engine MCU deployments. In this blog, we will discuss the why and how of voice control deployment on low-power and resource-constrained microcontroller units (MCUs) and its translation into real-world product innovation > https://lnkd.in/dWmugieY

Join the CEVA team at CES 2023 (January 5-8) in Las Vegas to learn how our intelligent sensing and wireless connectivity IPs, along with our co-creation design services, help to accelerate your next-generation smart edge chip design.

Visit us at the Westgate suite #2937. Access to our meeting suite is by invitation.
Contact us now and a member of our team will be in touch with you to schedule your meeting.


We will be showcasing a broad range of our technologies at the show with demonstrations of our best-in-class IP solutions and CEVA-powered devices on display, including:
  • Multi-standard IoT connectivity - with Auracast broadcast audio, Bluetooth 5.3, Wi-Fi 6 and UWB
  • 5G and cellular IoT connectivity - solutions for Open RAN, small-cells, massive IoT, and NB-IoT
  • AI and computer vision – powering camera-enabled applications in automotive, robotics, surveillance, and smartphones, capable of delivering more than 1200 TOPS of AI processing
  • AI and sound processing – Amazon certified voice user interfaces and clear conversations powered by neural networks for TWS earbuds, wearables, speakers, conferencing devices and the smart home
  • Wireless audio – featuring a fully integrated Bluetooth Audio solution and a complete 3D audio reference design, with a CEVA audio DSP, audio/voice algorithms and head tracking for an immersive experience
  • Sensor fusion technologies – motion control for robotics, XR, handheld controllers, and smart TVs, and human presence detection for PCs and smart displays

 
Reactions: 26 users

Diogenese

Top 20


nViso makes much of their ability to track each eye independently. Tobii can do this too:

EP4024169A1 AN EYE GAZE TRACKING SYSTEM, ASSOCIATED METHODS AND COMPUTER PROGRAMS


An eye tracking system configured to: receive a plurality of right-eye-images of a right eye of a user; receive a plurality of left-eye-images of a left eye of a user, each left-eye-image corresponding to a right-eye-image in the plurality of right-eye-images; detect a pupil and determine an associated pupil-signal, for each of the plurality of right-eye-images and each of the plurality of left-eye-images; calculate a right-eye-pupil-variation of the pupil-signals for the plurality of right-eye-images and a left-eye-pupil-variation of the pupil-signals for the plurality of left-eye-images; and determine a right-eye-weighting and a left-eye-weighting based on the right-eye-pupil-variation and the left-eye-pupil-variation. For one or more right-eye-images and one or more corresponding left-eye-images, the eye tracking system can: determine at least one right-eye-gaze-signal based on the right-eye-image and at least one left-eye-gaze-signal based on the corresponding left-eye-image; and calculate a combined-gaze-signal from a weighted sum of the right-eye-gaze-signal and the left-eye-gaze-signal using the right-eye-weighting and the left-eye-weighting.
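The abstract says the per-eye weightings are "based on" the pupil-signal variations but doesn't spell out the formula; a natural reading is that the eye with the steadier pupil signal gets more weight. A small numerical sketch of that idea follows (the inverse-variance weighting is my assumption, not taken from the patent):

```python
# Illustrative sketch of the weighted-gaze idea in EP4024169A1.
# The inverse-variance weighting below is an assumption; the abstract only
# says the weights are "based on" the per-eye pupil-signal variations.
import numpy as np

def combined_gaze(right_gaze, left_gaze, right_pupil, left_pupil):
    """Combine per-eye gaze estimates, trusting the eye whose pupil
    signal varies least (i.e. is tracked most stably)."""
    var_r = np.var(right_pupil)
    var_l = np.var(left_pupil)
    # Inverse-variance weights, normalised to sum to 1 (assumed scheme).
    w_r = (1.0 / var_r) / (1.0 / var_r + 1.0 / var_l)
    w_l = 1.0 - w_r
    # Weighted sum of the two gaze signals (e.g. 2-D gaze points).
    return w_r * np.asarray(right_gaze) + w_l * np.asarray(left_gaze)

# Toy example: the left pupil signal is noisier, so the right eye dominates.
right_pupil = [3.00, 3.01, 2.99, 3.00]   # stable pupil estimates
left_pupil  = [3.2, 2.6, 3.5, 2.8]       # noisy pupil estimates
print(combined_gaze([0.10, 0.20], [0.30, 0.40], right_pupil, left_pupil))
```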
 
Reactions: 28 users

charles2

Regular
Not just for farms....imagine the AKIDA potential to spiff up your lawn with this and related products....freeing more time for you to pursue your favorite hobbies.

Here is a 'dandy' place to start

 
Reactions: 18 users

Well, this article further down, from a year ago, discusses Tobii's move to automotive/DMS at that time, its roadmap, plus the acquisition of Phasya.

The dude heading up Tobii Tech is ex-Intel, 15 years there.

Highlighted a comment by the author noting NVISO is helping to develop the DMS further, so you'd like to trust they introduced some Akida to the recipe :)

It also appears NVISO and Tobii were both originally involved together in the Horizon 2020 FANCI project.






Tobii Explores New Markets for Eye-Tracking Technology

Latest News, Virtual Reality | Posted 1 year ago by Neomi


Tobii, the Sweden-based tech company, is known for developing innovative eye-tracking technology dedicated to gaming, behavioral research, healthcare, as well as augmented and virtual reality. Now, they are reportedly expanding into the new market of automotive systems.

The Sweden-based company has been a leading brand in eye-tracking technology since its inception 20 years ago. It has provided cutting-edge eye-tracking solutions for a variety of industrial and professional applications and has thus established an impeccable reputation in the fields of AR and VR technology. Its latest endeavor is targeted at the automobile industry, where it aims to develop an automotive Driver Monitoring System. The system will reportedly be capable of tracking several aspects, such as the driver's alertness on the road.

Anand Srivatsa

The major part of this Driver Monitoring System effort has been driven by Anand Srivatsa, the current CEO of the Tobii Tech division. Before joining the company in 2019, he worked at Intel for 15 years. As Tobii is set to spin off its Dynavox medical division on the Stockholm NASDAQ public market, Srivatsa is said to be taking over the two remaining divisions of the company, namely Tobii Pro and Tobii Tech. The former is responsible for developing eye-tracking solutions used to understand human behavior and improve performance. The transaction is still pending. However, Tobii Group has recently announced that the company is set to acquire the automotive systems firm Phasya for an estimated $4.7 million.

Move into automotive

Commenting on the move into automotive, Srivatsa remarked that this was a fairly imminent effort for Tobii, considering the vast potential lying ahead in the field. As the CEO of Tobii Tech, he will oversee the process of enabling Tobii's technology to be applied in different industries, including automotive. Like research facilities working in psychology or neuroscience, many commercial enterprises also need to understand human behavior and performance. They leverage this information to identify customer preferences and improve their products. Tobii Tech will strive to deliver solutions for them built on its eye-tracking technology.

2025

Srivatsa stated that the Tobii Driver Monitoring System (DMS) is expected to hit the automobile market by 2025. Currently, Tobii is working with various automotive suppliers, such as Nviso and Sunny Smart Lead, to refine the system further. The core idea behind the Tobii DMS, Srivatsa said, is that it would apply both artificial intelligence and eye-tracking technology to interpret key data points regarding a range of traits, such as the driver's alertness, attention, and drowsiness. This will help to enhance traffic safety by monitoring the driver. Srivatsa added that the DMS can track multiple people at once and is capable of monitoring gestures, reviewing body movements, and detecting drivers' emotions.

Renowned VR products

To date, Tobii Tech is associated with renowned VR products like the HTC Vive Pro Eye, Pico Neo 2, and Neo 3. With the expansion of the VR industry, the company is hopeful that eye-tracking will have a key role in the future of VR technology. However, that role is not going to be limited to gaming and esports, according to Srivatsa. The expansion into automotive is strong proof of that claim, he said.
 
Reactions: 35 users
Thanks Dio,
I reckon we've got a good shot at being involved though, considering other articles that are floating around.
Will have a quick flick in my bookmarks.....
🤣 Get out of my head @Fullmoonfever

FMF beat me to it.
 
Reactions: 13 users
Favourite hobbies.... reckon generations to come might be in strife.... if anyone has kids or has seen WALL-E from a few years back :LOL:


 
Reactions: 14 users
If anyone gets bored or has a bunch of free time, you can trawl through all the CES press releases for anything of interest :LOL:

They have a media page linking all the exhibitor releases... mind you, there's 39 pages' worth :oops:



 
Reactions: 22 users

Sirod69

bavarian girl ;-)
Need help navigating the #neuromorphic landscape from a software perspective (as there are a lot of technologies out there) - make sure you check out our whitepaper which you can download here: https://www.nviso.ai/en/technology



TEACHING MACHINES TO UNDERSTAND HUMANS

More than 90% of our communication is non-verbal.
This is what makes us unique in the way of expressing and communicating emotions.
We develop artificial intelligence software to sense, assess, and act upon human behaviour.

 
Reactions: 13 users