BRN Discussion Ongoing

If anyone gets bored or has a bunch of free time, you can trawl through all the CES press releases for anything of interest :LOL:

They have a media page linking all the exhibitor releases...mind you there's 39 pages worth :oops:



Brainchip research actually gets me close to blowing a fuse in the ole noggin. So many leads: this way, that way, backwards, forwards, up, down, any which way. I often find myself sidetracked with even more leads.
 
  • Like
  • Love
  • Wow
Reactions: 17 users
Screenshot_20230104-030312.png
 
  • Like
  • Fire
Reactions: 18 users

charles2

Regular
Autonomous robotic killing machines (drones) imminent.

At age 77, I have to question how much of the civilized world will remain to enjoy the positive fruits of AI/AKIDA.

Sorry for the downer on such a fine day.

 
  • Like
  • Sad
Reactions: 13 users

Sirod69

bavarian girl ;-)
https://www.linkedin.com/in/sherman-bizdev?miniProfileUrn=urn:li:fs_miniProfile:ACoAAABlGYoBuDfCSF7QokosU-TWPjIl52VORUs

BrainChip
20 minutes ago
Edge Impulse announced today official support for BrainChip's neural processor AI IP, making BrainChip the first strategic IP partner on the Edge Impulse platform. Read here: https://lnkd.in/g7Rp8Kwb #AI #ML #neuromorphic #IP #neuralnetworks

SAN JOSE, Calif., Jan. 3, 2023 /PRNewswire/ -- Edge Impulse announced today official support for BrainChip's neural processor AI IP, making BrainChip the first strategic IP partner on the Edge Impulse platform. This integration will enable users to leverage the power of Edge Impulse's machine learning platform, combined with the high-performance neural processing capabilities of BrainChip's Akida™ to develop and deploy powerful edge-based solutions.

edgeimpulse.com (PRNewsfoto/Edge Impulse)
Among the products supported by Edge Impulse is BrainChip's AKD1000 SoC, the first available device with Akida IP for developers. This comes with a BrainChip Akida PCIe reference board, which can be plugged into a developer's existing system to unlock capabilities for a wide array of edge AI use cases, including Automotive, Consumer, Home, and Industrial applications.

The Akida IP platform provides low-power, high-performance edge AI acceleration, designed to enable real-time machine learning inferencing on-device. Based on neuromorphic principles that mimic the brain, Akida supports today's models and workloads while future-proofing for emerging trends in efficient AI. Now devices with the Akida IP supported by Edge Impulse can enable users to sample raw data, build models, and deploy trained embedded machine learning models directly from Edge Impulse Studio to create the next generation of low-power, high-performance ML applications.
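For anyone wondering what "deploy trained embedded machine learning models directly from Edge Impulse Studio" ends up looking like on the developer's machine, here is a minimal sketch using BrainChip's akida Python package (MetaTF) to run a model on an AKD1000 PCIe board. The file name model.fbz, the use of forward(), and the input handling are assumptions for illustration only; exact call signatures vary between SDK releases, so check the current BrainChip and Edge Impulse documentation.

```python
# Minimal sketch, not official BrainChip/Edge Impulse sample code.
# Assumes: the `akida` Python package (MetaTF) is installed, an AKD1000
# PCIe board is present, and "model.fbz" is a model exported from the
# Edge Impulse flow in Akida's serialized format. Names and signatures
# may differ between SDK versions.
import numpy as np
import akida

# Load the serialized Akida model produced by the training/conversion flow.
model = akida.Model("model.fbz")

# Pick up the first Akida device detected on the system (e.g. the PCIe board)
# and map the model onto its hardware mesh; without a device the model
# simply runs in software simulation on the host.
devices = akida.devices()
if devices:
    model.map(devices[0])

# Run inference on a dummy uint8 batch shaped (N, H, W, C) to match the
# model's input layer; replace with real sensor data in practice.
dummy_input = np.zeros((1,) + tuple(model.input_shape), dtype=np.uint8)
outputs = model.forward(dummy_input)
print("Output shape:", outputs.shape)
```

The point of the integration is that the same model built in Edge Impulse Studio ends up executing on the AKD1000's neural fabric rather than on the host CPU.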


"This integration will provide users with a powerful and easy-to-use solution for building and deploying machine learning models on the edge," said Zach Shelby, co-founder and CEO of Edge Impulse. "We look forward to seeing what our users will create with BrainChip's AI offering."

"BrainChip's goal is to push the limits of on-chip AI compute to extremely energy-constrained sensor devices, the kind of performance that is only available in much higher power systems." said Sean Hehir, BrainChip's CEO. "Having our Akida IP supported and implemented into the Edge Impulse platform helps ensure that developers are able to deploy ML solutions quickly and easily to create a much more capable, innovative, and truly intelligent edge."



 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 53 users
I like this bit

Among the products supported by Edge Impulse is BrainChip's AKD1000 SoC, the first available device with Akida IP for developers.
 
  • Like
  • Love
  • Fire
Reactions: 42 users

Gies

Regular

Nice option for a Brainchip house on the island
 
  • Like
  • Haha
  • Love
Reactions: 16 users

Slymeat

Move on, nothing to see.
The momentum continues for BrainChip with Edge Impulse shouting out about their partnership.

Following on from partnership announcements by NVISO and VVDN, plus their involvement in CES 2023 (starting tomorrow in Las Vegas), one wonders how many other partners will come out of the woodwork and throw away the protection of their NDAs.

Soon we will see a #ME2-AKIDA movement—in a good way.

It’s not long now before companies will have to start shouting out to the world that they also are on the BrainTrain. Doing so after it reaches the launch pad will turn the protection of their NDA into a shackle stopping them from advertising their progressive approach of being early adopters.
 
  • Like
  • Love
  • Fire
Reactions: 55 users

stuart888

Regular
The Prophesee CTO/co-founder had something interesting to say. He was asked the following question at the 9:06 mark, at the end of his raving, giddy 🗣️ over event vision with SNN processing.

Important note: Synsense, Intel, and other Not-SNN-Primetime players were part of the session's speaker group. The person asking the question is from Intel (note the Intel logo on the monitor behind her).

Q: What role does neuromorphic computing hardware, all those chips we heard about before, play with event-based vision?

A: We believe, of course, that Spiking Neural Networks are a good match for the event data that such a sensor produces. Of course, we are looking very carefully at many different implementations of Spiking Neural Network chips, including Loihi, but many others as well. But I am not sure that the optimal architecture, the processing architecture for event-based vision, has yet been identified. This is a very good question that I don't really have an answer for, as I think it is not that clear yet because there are several constraints. You have a relatively large array of pixel information and you are interested in the spatial information first, but you also have this time-domain information that you want to benefit from. How these things come together in one architecture is maybe not really understood yet.
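To make the "spatial information plus time-domain information" point concrete: an event camera outputs a sparse stream of (x, y, timestamp, polarity) events rather than full frames. The rough sketch below is purely illustrative and not tied to Prophesee's or BrainChip's actual pipelines; it just bins such a stream into short time slices, one common non-neuromorphic baseline before any SNN gets involved.

```python
# Illustrative only: bins an event stream into per-time-slice frames.
# Real event-based/SNN pipelines (Prophesee, Akida, Loihi, ...) use their
# own representations; this only shows the shape of the raw data.
import numpy as np

def events_to_frames(events, width, height, slice_us=10_000):
    """events: array of (x, y, t_us, polarity) rows, polarity in {-1, +1}."""
    t0, t1 = events[:, 2].min(), events[:, 2].max()
    n_slices = int((t1 - t0) // slice_us) + 1
    frames = np.zeros((n_slices, height, width), dtype=np.int32)
    for x, y, t, p in events:
        frames[int((t - t0) // slice_us), int(y), int(x)] += int(p)
    return frames

# 1,000 random events over a 64x64 sensor and 50 ms of recording.
rng = np.random.default_rng(0)
ev = np.column_stack([
    rng.integers(0, 64, 1000),        # x coordinate
    rng.integers(0, 64, 1000),        # y coordinate
    rng.integers(0, 50_000, 1000),    # timestamp in microseconds
    rng.choice([-1, 1], 1000),        # polarity (brightness up/down)
])
print(events_to_frames(ev, 64, 64).shape)   # (n_slices, 64, 64)
```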

My guess is this is top secret, and he could not name Brainchip in front of this group. He had to answer vaguely. I think he wanted to rave about Akida but just could not at this event, so he raved about SNN event processing in general instead. The woman asking was from Intel, and Synsense and others were listening!

Maybe others have a better viewpoint? The video starts at the question, 9:06.

 
  • Like
  • Thinking
  • Fire
Reactions: 23 users

Gies

Regular
  • Like
  • Fire
  • Thinking
Reactions: 26 users

Makeme 2020

Regular
Apple our customer?


Apple's AR headset to automatically adjust lenses for perfect images

Malcolm Owen | Jan 03, 2023
A render of a potential Apple headset [AppleInsider]


Apple's long-rumored AR and VR headset will use motors to automatically adjust lenses for the user, according to new details that have surfaced about the inbound product as Apple gets even closer to launching it.

The mixed-reality headset has been the subject of many rumors over time and, despite an extended development time, is still thought to be on the way. A Tuesday report on the head-mounted device says the headset is expected to include many quality-of-life improvements over the existing augmented reality and virtual reality headset format.
So far, Apple's headset is believed to be reminiscent of ski goggles, with an external array of cameras to power both AR and VR apps. According to details from interviews with people who worked on the project published by The Information, the headset may offer more than the usual experience.
For a start, small motors will be used to shift the lenses inside the headset, to provide the user with the best experience possible. A dial is also anticipated for the headset, which will allow users to switch between fully-immersive VR and a blended view to see their surroundings.
While some headsets include their own sound systems or enable headphones to be used, it is claimed that Apple has already worked on adding support for AirPods Pro. It is claimed Apple has included tech in the newest AirPods Pro release so that they will work better with the rumored headset.

Battery placement and displays

To make the headset more balanced, a 2022 design had Apple using an external battery pack attached to the headband and tethered via a cable to the headset itself. The design was controversial among engineers, in part because of Apple's preference for a cable-free design, and because the battery could have been integrated into the headband itself.
The cable magnetically attaches to the headband, rather than requiring clips or tethers. As for battery life, the unit will last for two hours from one battery pack, but another could be swapped in when required.
For the display, Apple is said to use a pair of inward-facing displays for each eye, and a larger outward-facing one for the front. The outer screen, intended to show facial expressions of the user, will also be a low-refresh-rate version, similar to the tech behind Apple's always-on displays.
Those internal displays will offer an 8K resolution in total, with 4K for each eye, and they will also be micro OLED panels produced by Sony.
On wearing the headset, the lens motors will automatically adjust the positions to match the user's interpupillary distance, to give the best image of its 120-degree field of view.

Sensors and Chips

The headset will have more than a dozen cameras and sensors, which will capture facial expressions and body movements, including the user's legs. At least one camera will be used to capture eye movements, which can enable battery-saving foveated rendering to be performed.
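For readers unfamiliar with the term, foveated rendering saves GPU work and battery by rendering full detail only near the point the eyes are looking at, with progressively coarser detail toward the periphery. The toy sketch below is purely conceptual and has nothing to do with Apple's unreleased implementation; the tile counts, radii, and level scheme are made up for illustration.

```python
# Toy illustration of foveated rendering: assign a lower shading/resolution
# level to screen tiles the further they are from the tracked gaze point.
# Purely conceptual; not based on any Apple implementation.
import numpy as np

def shading_levels(tiles_x, tiles_y, gaze, radii=(2.0, 5.0)):
    """Return a grid of levels: 0 = full detail, 1 = half, 2 = quarter."""
    ys, xs = np.mgrid[0:tiles_y, 0:tiles_x]
    dist = np.hypot(xs - gaze[0], ys - gaze[1])
    levels = np.full(dist.shape, 2, dtype=np.int8)   # coarse by default
    levels[dist < radii[1]] = 1                      # mid ring
    levels[dist < radii[0]] = 0                      # foveal region
    return levels

# 16x9 tile grid, user looking slightly right of centre.
print(shading_levels(16, 9, gaze=(10, 4)))
```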
While cameras were originally thought to track eyebrow and jaw movements, one source claims the headset now uses machine learning to capture facial expressions.
The sensor list will include short and long-range LiDARs, used to track the environment around the user. Those sensors will be hidden as much as possible, with the aluminum and glass construction used to maintain Apple's famed aesthetics.
A pair of chips will be used for processing, with "Bora" being an image signal processor, while "Staten" is an SoC. Produced using a 5-nanometer process, the two will work together for tracking the environment and movements; however, Apple had to create more silicon to minimize lag between the two.
Multiple swappable headbands are touted, including one intended for developers and another with speakers for consumers. Rather than using Bluetooth or a physical headphone jack, Apple instead intends for users to use AirPods and the H2 chip for low-latency communications.
On controls, hand tracking and voice recognition are preferred; however, a wand and a finger thimble have also been tested. A final control scheme has yet to be decided upon.
One thing that hasn't changed in the rumors is cost, as the supposed pricing of $3,000 is still apparently on the table.
 
  • Like
  • Thinking
  • Fire
Reactions: 18 users

charles2

Regular
In US trading Brainchip finished up almost 7%

Not much volume but as we know, up is up
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 25 users

wilzy123

Founding Member
Valeo at CES

 
  • Like
  • Fire
Reactions: 27 users

misslou

Founding Member
Who else is hoping the DJ is Esq111
Me!!

Gutted I missed the Brainchip forum dance party last night. But I love that kind of music any time of the day. If anyone else is up for some endorphins in the morning, here’s what I’ve been listening to recently whenever I visualise the new life BRN is bringing. Tiesto: “I can’t wait”



Have a great day everyone 🤗
 
  • Like
  • Love
  • Fire
Reactions: 27 users

IloveLamp

Top 20
The Verge: LG's 2023 OLED TVs are brighter (again) and make webOS smarter.

LG says that’s thanks to its new α9 AI Processor Gen6 chip. Here’s the marketing spiel:

The latest Alpha series processor utilizes LG’s most sophisticated AI-assisted Deep Learning tech to ensure outstanding picture and sound quality. AI Picture Pro now offers improved upscaling for better clarity, and enhanced dynamic tone mapping, which helps reveal the depth and detail in every frame. AI Picture Pro also integrates a picture processing technology that detects and refines important objects, such as people’s faces, to give them a more lifelike HDR quality. In addition to fine-tuning image reproduction, the α9 AI Processor Gen6 powers LG’s AI Sound Pro; a feature that helps viewers get swept up in the onscreen action by delivering virtual 9.1.2 surround sound from the TV’s built-in speaker system
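For context, "dynamic tone mapping" just means compressing a high-dynamic-range signal into the panel's displayable range with a curve that adapts per scene or per frame rather than staying fixed. LG's α9 algorithm is proprietary, so the sketch below only illustrates the general idea with a simple frame-adaptive, Reinhard-style operator; the key value and the use of log-average luminance are textbook defaults, not anything LG has disclosed.

```python
# Minimal sketch of frame-adaptive (dynamic) tone mapping.
# Not LG's algorithm; the alpha 9 processor's method is proprietary.
import numpy as np

def tone_map(hdr, key=0.18, eps=1e-6):
    """Map HDR luminance (any positive range) into [0, 1) for display."""
    # Scene-adaptive exposure: scale by the frame's log-average luminance,
    # so dark and bright frames each get their own curve.
    log_avg = np.exp(np.mean(np.log(hdr + eps)))
    scaled = key * hdr / log_avg
    # Reinhard global operator: compresses highlights, preserves shadows.
    return scaled / (1.0 + scaled)

frame = np.random.default_rng(1).uniform(0.0, 4000.0, size=(4, 4))  # nits
print(tone_map(frame).round(3))
```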
 
  • Like
  • Thinking
  • Fire
Reactions: 23 users

Mccabe84

Regular
  • Like
  • Fire
  • Love
Reactions: 34 users

wilzy123

Founding Member
  • Like
  • Thinking
  • Fire
Reactions: 21 users

Learning

Learning to the Top 🕵‍♂️
"Exciting times" 🎉🥳🎊 for the start of 2023
Screenshot_20230104_082434_LinkedIn.jpg


It's great to be a shareholder 🏖
 
  • Like
  • Love
  • Fire
Reactions: 73 users

buena suerte :-)

BOB Bank of Brainchip
I like this bit

Among the products supported by Edge Impulse is BrainChip's AKD1000 SoC, the first available device with Akida IP for developers.
There's that old 'Plural' .... Products /Processors.... getting dropped again!! :love::cool::cool:
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 40 users

JK200SX

Regular
  • Haha
  • Like
  • Wow
Reactions: 27 users

BaconLover

Founding Member
Just woke up, and this great news is better than morning wood.
Now the SP has to go up just as high.
:)
SP goes up .01% :cautious: 🤔
 
  • Haha
  • Like
  • Sad
Reactions: 22 users