BRN Discussion Ongoing

Author just posted on Hackster, hedging bets: either Gen 1 or even possibly Gen 2 :unsure:



BrainChip Partners with VVDN, Teases an Industrial "Akida Edge Box" for On-Device ML and AI​

Compact edge AI compute device will be demonstrated live during the Consumer Electronics Show (CES) 2024, BrainChip promises.​

https://www.hackster.io/ghalfacree
ghalfacree
Machine Learning & AI

image_dTaMb0tjFG.png


BrainChip, which offers neuromorphic processing technology for high-efficiency machine learning and artificial intelligence (ML and AI) at the edge, has announced that it will be opening pre-sales for its Akida Edge Box — the "industry's first edge box powered by neuromorphic AI IP" — in January next year, following a demonstration at the Consumer Electronics Show (CES) 2024.

“BrainChip's neuromorphic technology gives the Akida Edge box the 'edge' in demanding markets such as industrial, manufacturing, warehouse, high-volume retail, and medical care," claims BrainChip's chief executive officer Sean Hehir of the company's impending launch. "We are excited to partner with an industry leader like VVDN Technologies to bring groundbreaking technology to the market."

BrainChip has teased an upcoming "Akida Edge Box" device for its neuromorphic processing technology, due to be unveiled at CES in January 2024. (📹: BrainChip)

"There is a strong demand for cost-effective, flexible edge AI computation across many industries," adds VVDN Technologies co-founder and chief executive officer Puneet Agarwal, whose company partnered with BrainChip to bring the Akida Edge Box to life. "VVDN is excited to offer OEMs [Original Equipment Manufacturers] its experience and expertise in bringing the advanced, transformative technology integrations that meet market needs and eventually help them with faster time to market.”

BrainChip showed off the original brain-inspired Akida neuromorphic processing concept at the Linley Fall Processor Conference back in 2019, refining the ecosystem over the next two years before the public launch of development kits built around compact Intel and Raspberry Pi computing systems.

In January last year the company announced full commercialization followed by a partnership with Edge Impulse to make Akida more accessible, and in October this year placed Akida 2.0 in "early access" as what Hehir claimed was a "significant step in BrainChip’s vision to bring unprecedented AI processing power to edge devices, untethered from the cloud."
image_qraSgZ2pi5.png
The company has not yet shown off the Akida Edge Box, but had already launched development kits built around compact PCs and SBCs. (📷: BrainChip)

Technical details of the Akida Edge Box have not yet been disclosed, but the device is likely to take the form of a small form factor or single-board computer coupled with one or more of the company's Akida or Akida 2.0 neuromorphic processing units. This, the company says, will offer support for "cost-effective and low-latency" on-device artificial intelligence workloads like visual detection, patient monitoring, security and surveillance, and manufacturing.

BrainChip and VVDN Technologies plan to formally unveil the Akida Edge Box at the Consumer Electronics Show 2024 in Las Vegas, held January 9-12, 2024, with a live demo, after which it will go up for pre-order ahead of an unspecified launch date. More information on the company's technology is available on the BrainChip website.
It won't be Gen 2 at this stage.
 
  • Like
Reactions: 2 users

JB49

Regular
Don't be expecting any decent revenue from the box. Sean said in his Stock Down Under interview: "We're just gonna try this to get some workloads out there, so you'll see some revenue, but we're not expecting anything big."
 
  • Like
  • Sad
  • Thinking
Reactions: 13 users

Taproot

Regular
(quoting the Hackster article posted above in full)
Like Dio said, the VVDN / BRN Edge Box will be equipped with Akida 1000.

At this stage in BRN's development, I really think the most important thing to be doing is to benchmark this box against the better-known name-brand edge boxes from the likes of Nvidia and Qualcomm. There need to be comparisons made between the different edge boxes in like-for-like situations, so that clear benchmarks can be defined and turned into marketing gold: less power and a smaller, more compact device with fewer components (i.e. fans), making the device instantly "greener" and more environmentally friendly, followed by technical and performance benchmarks in like-for-like comparisons. We need to show that BRN / neuromorphic / Akida is the better option, and we need to clearly define why this is so. Unless BrainChip can show, on paper and in real-life demonstrations, why this product can outperform the competition in clearly defined and understandable benchmarks, consumers will continue to purchase the name brands every time. Let's hope that VVDN and BrainChip have a great presentation lined up for Jan 12+.
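For the like-for-like comparisons argued for above, the latency half is at least easy to harness in software (energy needs an external power meter on each box). A minimal sketch of such a harness, with purely illustrative stand-in workloads rather than any real Akida, Jetson, or Qualcomm inference call:

```python
import time
import statistics

def benchmark(infer, batches, warmup=10):
    """Time a single-sample inference callable over pre-loaded inputs,
    reporting median and p99 latency in milliseconds."""
    for _ in range(warmup):          # warm caches / JITs before measuring
        infer(batches[0])
    samples = []
    for batch in batches:
        t0 = time.perf_counter()
        infer(batch)
        samples.append((time.perf_counter() - t0) * 1000.0)
    return {
        "median_ms": statistics.median(samples),
        "p99_ms": sorted(samples)[int(0.99 * (len(samples) - 1))],
    }

# Stand-in "models": any callable taking one input frame. These just burn
# CPU so the harness has something to measure.
slow = lambda x: sum(i * i for i in range(20_000))
fast = lambda x: sum(i * i for i in range(2_000))

frames = [None] * 100
print(benchmark(slow, frames)["median_ms"] > benchmark(fast, frames)["median_ms"])
```

The same harness run on each edge box with the same model and input set would give the like-for-like latency numbers the post calls for.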
 
  • Like
  • Love
  • Fire
Reactions: 29 users
@DingoBorat @Taproot

Most likely correct hence my thinking emoji in my post.

However, we know Gen 2 became available to early adopters in Q3; the VVDN announcement was at the beginning of September, and no doubt, you'd think, we would have been working with them prior on developing the Edge Box.

Personally I'd like it to be the Gen 2 as what better way to showcase our latest tech with ViT, TENNS at the premier tech event in CES with essentially our first commercially available off the shelf consumer / business product.

Wishful thinking maybe though.
 
  • Like
  • Fire
  • Thinking
Reactions: 12 users

JDelekto

Regular
(quoting the Gen 1 vs Gen 2 post above)
I think that as Akida 2.0 was announced in March and the IP was made available as early access in October, they would most likely be using Akida 2.0 in their design.

The announcement claims that the Akida Edge Box was designed for vision-based AI workloads, so BrainChip would want to show off its TENN feature and vision transformers.

Putting this new device on display at CES is a chance to market the latest Akida technology, so I would bet that 2.0 is at play here, possibly even with silicon built on the IP manufactured at GlobalFoundries.

If the Edge boxes were using the Akida 1.0 IP and could stand against the competitors in performance and power requirements, imagine how much more compelling it would make the Akida 2.0 IP for those still on the fence.
 
  • Like
  • Love
  • Fire
Reactions: 20 users

Mt09

Regular
(quoting the Gen 1 vs Gen 2 post above)


Can't remember at what point in the interview, but Sean specifically says the edge box contains AKD1000, the chip we had made and have stock of. At this point there's no AKD2000 reference chip, i.e. "hardware"; no doubt that will come in the future.
 
  • Like
  • Fire
  • Love
Reactions: 18 users

TECH

Regular
Good morning back in Aussie,

I think an important point to remember with regards to VVDN is this: they already make edge boxes for both of our potential competitors,
namely Nvidia and Qualcomm, so adding us to the mix sends a very strong message before a single confirmed sale has even been registered.

Can the others really compete? We shall all find out as CES unfolds. Benchmarking is about to be ramped up a number of notches, in my
opinion; the more products that hit the market, the more our dominance at the edge will be confirmed. Both Nvidia and Qualcomm representatives will be
keenly observing this launch at CES. The new kid on the block has finally arrived....❤️ Akida.

Tech.
 
  • Like
  • Love
  • Fire
Reactions: 58 users
(quoting the Gen 1 vs Gen 2 post above)
AKIDA 2.0 is available, but there hasn't been an announcement of a physical AKD2000 reference chip being produced.

Not sure exactly when the AKD1000 chips were originally made, but there were the engineering samples and then the reference chips, which were greatly improved.

It must be getting close to two years, though, and they are going into Edge Boxes now and will be state of the art!

To do that today, in these times of rapid technological development, just goes to show how far AKIDA is ahead of the curve.

They could also use AKD1500 chips if they needed to.

We really have no idea how many AKD1000 chips were produced, or what percentage of the run were "good".

There's always a non-viable percentage, and we don't know what the device yield is when producing AKIDA chips. But with the engineering samples coming out practically perfect first time (a huge credit to Anil Mankar; as Louis DiNardo stated, he had rarely seen that), I'm guessing it is pretty high by industry standards.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 12 users


(quoting Mt09's reply above)

Thanks for confirming that MT.

Makes sense seeing as we've got stock sitting around; just a shame it won't be our current flagship product at the flagship tech event of the year.

No doubt Gen 2 will be on display anyway and big bonus if someone else is using it too.
 
  • Like
  • Fire
Reactions: 5 users

Glen

Regular
  • Like
Reactions: 4 users

IloveLamp

Top 20
  • Like
  • Fire
Reactions: 12 users

This article was updated in July this year.

"According to the report, Samsung Foundry's yield for its 3nm chip manufacturing process is 60%. In comparison, TSMC's 3nm chip yield is around 55%. It means Samsung finally has the upper hand over TSMC in ultra-advanced chip manufacturing technology. Since TSMC is behind Samsung Foundry in the 3nm segment, it is possible that Samsung could win back the clients it lost to TSMC for 4nm and 5nm processes"

Such low yields still in the 3nm process (4nm is 75 to 80% from the article) and they are going to 2nm next year?? 🤔..

Moore's law for Von Neumann architecture is dead, when will they realise that?..
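The yield figures quoted can be put in context with the classic Poisson die-yield model, Y = exp(-A · D0), where A is die area and D0 is defect density. The numbers below are purely illustrative textbook values, not actual Samsung, TSMC, or Akida foundry data:

```python
import math

def poisson_yield(area_cm2, defect_density):
    """Classic Poisson die-yield model: Y = exp(-A * D0),
    where A is die area in cm^2 and D0 is defects per cm^2."""
    return math.exp(-area_cm2 * defect_density)

def cost_per_good_die(wafer_cost, dies_per_wafer, yield_fraction):
    """Spread one wafer's cost over only the dies that work."""
    return wafer_cost / (dies_per_wafer * yield_fraction)

# A 1 cm^2 die on a leading-edge-ish process vs a mature one:
print(round(poisson_yield(1.0, 0.5), 3))  # 0.607 — roughly the ~60% quoted
print(round(poisson_yield(1.0, 0.1), 3))  # 0.905 — mature process
```

The model also shows why low yield compounds cost: halving yield doubles the effective cost of every good die, which is the economic pressure behind the 3nm numbers in the quote.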
 
  • Like
  • Fire
Reactions: 4 users

MDhere

Top 20
Qualcomm website just got a facelift......


View attachment 52145

Well well well, once again you have made my Sunday @IloveLamp ❤ I know I saw this exact same picture somewhere before, and here it is -
Screenshot_20231217-141653_Chrome.jpg

This is no coincidence in my opinion. And wait, what are the catchy words - sense, think, act.
Thank you Nokia, I have no further questions ❤
 
Last edited:
  • Like
  • Fire
  • Thinking
Reactions: 14 users

IloveLamp

Top 20
(quoting @MDhere's post above)
Nokia a given with us IMO, agreed @MDhere. Nokia and Google are also involved with the Aus gov in satellites / space etc.
 
  • Like
  • Fire
Reactions: 6 users
(quoting @MDhere's post above)
Nokia uses Qualcomm processors, so that's no surprise.

I want to see a rock solid connection, between us and them 😛
 
  • Like
Reactions: 9 users

hotty4040

Regular
(quoting Taproot's benchmarking post above)

Looks very impressive to me, the uninitiated "clever dick - sometimes" learner.

Dodgy Ness, it's a good time now for you to put some thoughts together to explain what this edgybox (the Akida Ballista one) is, so that I and other "not so techy understandy" individuals can grasp this technology in more detail. It appears to be a capable device, I think!! Perhaps you could explain why and how it is, and its significance in terms of its usage and its current and future applications.

At this time, my head is a bit hurty trying to understand what's going on (pretty similar to the "black box" invention from some years ago now, here in Australia, which in actual fact is orange in colour in my understanding; still not sure what and how that device works). Will this "edgebox" be a game changer at all, now and into the future, and could it improve on or replace the "black box" in any way?

These questions/thoughts are open to all on this forum, so that I and others can become aware of what this "edgebox" device is designed to achieve, to enable us all to understand its usefulness moving forward.

And what could sales of this device bring, realistically? I have other questions in regard to this, but they can wait until I've got my head around this edgebox device first.

TIA......

Akida Ballista ( Still, I'm Sure )....

hotty...
 
  • Like
  • Haha
Reactions: 8 users
Nice to see Sony stepping up to neuromorphic as well.

Be great to have a formal link / overlap with Sony / Prophesee / Akida.

Yes, I know we partnered with Prophesee, but we're yet to see where that's truly at.



Sony Europe

Neuromorphic Computing Research Intern​

Sony Europe Schlieren, Zurich, Switzerland

We look for the risk-takers, the collaborators, the inspired and the inspirational. We want the people who are brave enough to work at the cutting edge and create solutions that will enrich and improve the lives of people across the globe. So, if you want to make the world say wow, let's talk. The conversation starts here. If this role matches your ambitions and skillset, let's get started with your application.

The Stuttgart Laboratory 1 (SL1), which is part of the Sony Corporate Research and Development Center, conducts research in the areas of "Computational Imaging, Perception Systems, RF Communications, Artificial Intelligence and Speech & Sound Processing". SL1 is mainly located at the Stuttgart Technology Center (STC) in Germany, but also includes an office in Zürich, Switzerland. We want to accelerate the redefinition of computer vision based on the distinct characteristics of the novel Event-based Vision Sensor. Therefore, we are looking for a talented, motivated and software-affine

Neuromorphic Computing Research Intern (m/f)

to conduct basic algorithm development, work on demonstration prototypes with sensing and processing hardware and help to build a strong research portfolio from the bottom up. This is an Internship or Master Thesis position at SL1 in Zürich, to start preferably on April 1st, 2024.

Your Responsibilities Will Be

  • Basic algorithm research with event-based vision sensor technology
  • Development and implementation of neuromorphic computing algorithms for off-line training and on-line inference of spiking neural networks on neuromorphic hardware
  • Benchmarking and comparison of performance (accuracy, latency and energy-efficiency) with conventional computer vision approaches
  • Support establishment of an R&D infrastructure which facilitates teamwork for research & development
Profile:

  • Currently pursuing an MSc degree or equivalent in artificial intelligence, computer science or engineering (robotics, mechanical, electrical, or similar)
  • Interested in inter-disciplinary research with a brain-inspired computing approach
  • Experience with computer vision principles and related linear algebra and machine learning theory
  • Experience with event-based cameras, robotics or autonomous vehicles is a plus
  • We expect passion for good software with strong skills in Python and related machine learning frameworks (PyTorch). Additionally, knowledge of version control (git) is a plus
  • We hope for good communication skills and willingness to work in an agile team
  • Excellent oral and written English
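The spiking-neural-network inference the posting describes can be illustrated with a minimal leaky integrate-and-fire (LIF) neuron, the basic unit of most neuromorphic models. All parameters here are illustrative textbook values, not anything Sony- or Akida-specific:

```python
def lif_step(v, i_in, tau=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """One Euler step of a leaky integrate-and-fire neuron.
    The membrane potential v leaks toward 0 and integrates input i_in;
    crossing v_thresh emits a spike and resets v. Returns (v, spiked)."""
    v = v + (dt / tau) * (-v + i_in)
    if v >= v_thresh:
        return v_reset, True
    return v, False

# Drive the neuron with constant supra-threshold input: it integrates,
# fires, resets, and repeats — output is a sparse spike train, which is
# where the energy efficiency claims for event-based hardware come from.
v, spikes = 0.0, 0
for t in range(200):
    v, fired = lif_step(v, i_in=1.5)
    spikes += fired
print(spikes > 0)  # True: constant drive above threshold produces spikes
```

A network of such neurons only computes when spikes arrive, unlike a dense network that multiplies every weight on every frame.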
 
  • Like
  • Fire
  • Love
Reactions: 12 users

TheDrooben

Pretty Pretty Pretty Pretty Good
  • Like
  • Fire
Reactions: 10 users
(quoting hotty4040's post above)
🤣

trains-planes-and-automobiles-john-candy.gif


Respect to the late, great John Candy.
 
  • Haha
  • Like
Reactions: 4 users

Tothemoon24

Top 20

Nokia Announces Speech-Based AI Network Management Tools​

Nokia unveiled its cutting-edge AI solutions amid its plans to scale up networks for the industrial metaverse
Nokia-telecom-network.jpg

MIXED REALITY | LATEST NEWS
Published: November 1, 2023

Demond Cureton

Nokia Bell Labs announced on Wednesday a novel natural language processing (NLP) solution for configuring networks using artificial intelligence (AI) and machine learning (ML).
Developed by Nokia Bell Labs’ UNEXT research initiative, the company’s Natural-Language Network is completely transforming how networks operate. This is crucial as the Espoo, Finland-based firm scales up its network infrastructure amid the rise of the industrial metaverse.
The company revealed the new digital tools at the Brooklyn 6G Summit in New York, NY, which took place from 31 Oct to 2 November.
At the event, it stated the new NLP solution can configure networks using prompts and speech. It will also understand user intentions and operate autonomously using its AI neural networking.

Nokia AI Researchers Develop NLP for Telecoms

Using Natural-Language Networks, Nokia can streamline network management with AI, moving away from complex setups to more agile, responsive systems to serve the end user.
Artificial intelligence will power the networks, allowing service providers to maintain operations with rapid configuration capabilities. These intelligent systems will also monitor and learn from previous prompts, responses, and other data to optimise networks after each successful request, the company explained.


As the networking tool builds its neural networks across the infrastructure, it can operate without human intervention.

Csaba Vulkan, Network Systems Automation Research Leader, Nokia Bell Labs, said in a statement,

“Operators won’t need to explore technical catalogues or complex API descriptions when they configure networks. Instead, a simple statement like ‘Optimize the network at X location for Y service’ will work. Those requests could be used to configure a wireless network in a factory for robot automation or optimize networks at a concert for a barrage of social media uploads”
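As a toy illustration of the intent extraction behind a statement like the one Vulkan quotes (Nokia's actual system uses AI/ML language models, not pattern matching; everything below is a hypothetical sketch), a parser for that one sentence shape might look like:

```python
import re

# Hypothetical parser for statements of the exact form Vulkan describes:
# "Optimize the network at <location> for <service>".
PATTERN = re.compile(
    r"optimize the network at (?P<location>.+?) for (?P<service>.+)",
    re.IGNORECASE,
)

def parse_intent(utterance):
    """Turn a natural-language request into a structured config intent,
    or None if the utterance doesn't match the known form."""
    m = PATTERN.match(utterance.strip())
    if not m:
        return None
    return {"action": "optimize", **m.groupdict()}

print(parse_intent("Optimize the network at the stadium for video uploads"))
# {'action': 'optimize', 'location': 'the stadium', 'service': 'video uploads'}
```

The structured dict is the kind of machine-readable intent that could then be mapped onto the technical catalogues and APIs the quote says operators will no longer need to touch directly.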

Thank You, Who’s UNEXT?​

Nokia Bell Labs created the UNEXT research initiative to support the company’s efforts to innovate its network infrastructure. This has become a key focus of the company as it focuses on the industrial metaverse, which is currently digitally transforming global enterprises with massive results.

According to the company, UNEXT draws inspiration from UNIX, the groundbreaking operating system (OS) Bell Labs invented in the 1960s with the Massachusetts Institute of Technology (MIT) and General Electric (GE).

The firm said in a press release that “UNEXT will redefine network software and systems the same way UNIX reshaped computing.”

It aims to achieve this by integrating a host of processes in the telecoms network, effectively evolving the network into an operating system.

Azimeh Sefidcon, Head of Network Systems and Security Research, Nokia Bell Labs, added,

“Natural-Language Networks offer a sneak peek into one of the many capabilities of UNEXT. Reducing the complexity of network management fits squarely with UNEXT’s goal of extending the reach of networked systems by breaking down barriers that prevent those systems from interoperating”

AI to Become Central to Next-Gen Networks​

The announcement comes at a critical time as Nokia aims to scale up its network capacities amid a massive surge in network demand. Many of the world's tech innovations are now forcing telecoms to rethink their strategies in infrastructure development and solutions, as those innovations put exponential pressure on 4G and 5G networks.

Unprecedented demand for telecoms bandwidth, speed, and reliability have not only come from consumers and their handheld devices like smartphones and tablets, but also from a growing need for low-latency, high-bandwidth, and high-speed networks to facilitate immersive tools.

Recent global innovations in virtual, augmented, and mixed reality at the consumer, enterprise, and industry are expected to place enormous pressure on telecoms infrastructure leading up to 2030 and beyond.

Thomas Hainzel, Head of Digital Industries Evolution & Partnerships, Nokia, explained to XR Today the challenges that Nokia faces amid the rise of multiple network-intensive metaverses in Industry 4.0.

Nokia is currently developing tools across the technology stack like the Internet of Things, AI, and cloud and edge computing, to secure network integrity and meet sustainability pledges amid future environmental challenges.


World Leaders Focus on AI Safety​

However, as companies innovate their offerings with AI, governments are developing their roles in ensuring the safety of their respective citizens, at the national and global level.

Nokia’s solution comes just days after United States President Joe Biden signed the Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence.
 
  • Like
  • Love
Reactions: 5 users