BRN Discussion Ongoing

7für7

Top 20
I'll translate that to..

"I've painted myself into a corner and I can't run away, so I'm just going to close my eyes and pretend you're not there"

🤣
There are some translation mistakes… I will correct you… he says: “I never bought shares and I have never even bought any property! I am still 16, live with my parents somewhere in the outback and try to mock people around me… what? I’m bored, bro!”
 
  • Haha
  • Like
  • Sad
Reactions: 7 users

Slade

Top 20
  • Haha
  • Like
Reactions: 11 users

Gearitup

Member
The question that should have been raised with Sean at the AGM is: what is BRN’s strategy if none of the current client engagements are successful or get extended? If there are 2-3 year lead times involved in evaluating Akida prior to any product involvement, where does that leave the business if these current engagements are unsuccessful? Clearly none have been successful to date and, in my view, the company has been mismanaged; Sean is not the right CEO to instil confidence in shareholders and steer BRN in the right direction.
 
  • Like
  • Fire
  • Thinking
Reactions: 8 users

Frangipani

Regular
Last night (my time), when trying to catch up on last week’s postings, I wrote down my thoughts about the “sold-out” VVDN Edge AI Boxes, but was too tired to read over it again and post it before going to bed. Since the overnight trading halt has not altered my conjecture, though, here we go:



I agree with @itsol4605 that this is not a good look for a company supposed to be manufacturing a disruptive product that was announced ten (!) months ago.

Are VVDN possibly pursuing a manufacture-on-demand strategy rather than stocking inventory in a warehouse? What I also find odd is that they don’t even list a unit price on their website, let alone any rebates for the envisaged much larger orders.

(By the way, those of you imagining a stifling-the-competition conspiracy here, suggesting VVDN might deliberately give the NVIDIA Jetson Xavier NX Edge AI Box an advantage over the Akida one, take note that the relevant VVDN NVIDIA Edge AI Box webpage is equally hard to find, doesn’t list a unit price either and features the same “ENQUIRE NOW” button instead of one saying “BUY NOW”).

On February 20, interested customers were finally able to place first orders for up to 10 Edge AI Boxes through the BrainChip website. This was in itself already a twice-shifted timeline: the original September 2023 announcement had promised the pre-sale (through BrainChip directly) to commence by December 2023 at the latest, while a second press release dated December 14, 2023 all of a sudden postponed the “expected” beginning of the pre-sale to January 15, three days after the end of CES 2024, the Las Vegas trade show where “the industry’s first Edge Box based on neuromorphic technology” was supposed to be demonstrated for the first time. I personally didn’t attend CES 2024 and thus cannot vouch for the physical device’s absence from the BrainChip booth, but all I recall is Nandan Nayampally’s CES podcast with Kalpesh Chauhan from VVDN, and no photos or videos of the demo that was to have taken place.

View attachment 66935
(…)


View attachment 66940

View attachment 66941


(…)


View attachment 66937

View attachment 66938





While I can’t say whether or not this issue really concerns “a lot of people”, I am indeed aware of one case: the poster on HC who goes by the username mapp (who strikes me as far from being a downramper) claims he ordered and fully paid for the Edge AI Box through the BrainChip website months ago. Yet our company’s communication regarding the delivery date has allegedly been extremely poor. For many weeks there seems to have been no information about the shipment date at all, and since early May the only update provided to the buyer has apparently been that the expected delivery timeframe is early 3rd quarter 2024 (which should be any day now) and that a follow-up email would be sent once the shipping date drew closer. True, mapp is an anonymous poster like all of us, so it is difficult to verify his claim, but from the posts I’ve read, I don’t see a reason to doubt its accuracy.

I took the trouble to sift through mapp’s posts on HC (if you click on the username, there are far more irrelevant posts in heated political threads than posts relevant to BRN) and selected about half of them to give you a rough timeline:

  • from order placement and full payment in late February
  • to the first email thereafter, dated May 8 (two and a half months later!), with the estimated delivery timeframe of early Q3/2024 (even though our company had announced by mid-February that they were going to communicate specific shipment dates by April 30)

View attachment 66949
  • to no update yet, as of early July, regarding the exact delivery date.
And I am pretty sure mapp would by now have shared any further progress, if there had been any.

View attachment 66942

View attachment 66943

View attachment 66945


View attachment 66944



Again, not a good look for either VVDN or our company.

I had also been wondering why there was no Edge AI Box anywhere to be seen in BrainChip’s social media posts about embedded world 2024 in April, despite NVISO and VVDN being listed as demo partners on https://brainchip.com/embeddedworld.

If I am not mistaken, the first time we actually got to see a physical Akida Edge AI Box was at the AGM? Three days later, I spotted one (the same or another?) in the photos featuring our 2024 Embedded Vision Summit booth, and a video of Todd Vierra demoing one of its many use cases at said conference was uploaded two weeks later. By that time, I would have expected them to have at least one shelf of ready-to-be-shipped units.





For a short while, it was possible to pre-order a single unit or up to 10 Edge AI Boxes through BrainChip directly, before the item was marked as “sold out” on https://shop.brainchipinc.com/products.

While it is perfectly legitimate to call itsol4605’s statement an unsubstantiated claim and ask him/her for some evidence to back it up, accusing him/her of “telling lies” would imply that itsol4605 is making this up despite knowing it is not true and is thus intentionally trying to deceive us.

So can you, @7für7, at least provide any counter-evidence? Do you happen to know anyone who pre-ordered an Akida Edge AI Box and has already received theirs, and could hence post a photo of the invoice (blacking out personal details) on top of their Edge AI Box as proof?

Or have you enquired with BrainChip directly whether or not any of the pre-ordered Edge AI Boxes have been shipped to customers yet, before accusing another poster of being a liar?

To me, the circumstantial evidence (if I may call it that) all points to the VVDN Akida Edge AI Boxes simply not being ready for shipping yet, whatever the underlying issue is and whoever is to blame. I would advise those of you envisaging substantial revenue from the initial sale of these Edge AI Boxes in the upcoming 4C to massively lower your expectations: IMO, the “sold out” or “enquire now” status does not seem to be a case of demand being too high while a second batch is being manufactured, or else customers who had ordered their Edge AI Box by late February should have long since received it (provided mapp’s story is true).

If I were a customer who had paid US$ 800 up front for a product almost five months ago and still hadn’t received my order by now, I would at least expect transparent communication on behalf of the company in case of financial disagreements with VVDN, production delays or other logistics failures. Such things happen and it may not be the fault of anyone in particular - it could even be due to reasons beyond either company’s control, such as a sea freight container going overboard - but setbacks like that should be dealt with in a more professional manner, rather than charging customers up front and then keeping them waiting for months, overtaxing their patience. Although those buying only a single or a few units are considered small fry in the grand scheme of things by our company, they should still be treated with the same respect as the big fish our IP business model is chasing.

Informing those pre-order customers on the shop website these days that “BrainChip Sales will provide delivery dates once orders are processed” is something I perceive as a slap in the face of those who put trust in the company by placing their orders right away once the pre-sale had opened instead of waiting for favourable reviews or benchmarking reports before making a decision.

How many more months do they supposedly need for processing those orders?? It all reeks of some unforeseen problems, if you ask me, and the sales department should at least have the decency to admit to this rather than continuing to shift the timelines without explanation.

I’d be more than happy to be proven wrong, though.

Is inflation in California by any chance not slowing down after all, but on the contrary getting out of hand?! 😉

While it is good to see that the Akida Edge AI Boxes are no longer marked as “sold out” on https://shop.brainchipinc.com/products/akida™-edge-ai-box, those who’ve been toying with the idea of ordering one (and are not deterred by the shipping estimate of 10-12 weeks) will be shocked to see the price has all of a sudden increased by approximately 87 %, without any explanation! 😳
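For what it’s worth, here is a quick back-of-the-envelope check of that figure, assuming the pre-sale price was roughly the US$ 800 mentioned above and the new listed price is about US$ 1,495 (both figures are my reading of the screenshots below, not official numbers):

(1495 − 800) / 800 ≈ 0.87, i.e. an increase of roughly 87 %.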


Screenshot taken on July 21:

9F16863E-3720-4361-A4B9-33DCF06B785C.jpeg





Screenshot taken on September 3:

527800CB-CC65-45C2-8AD1-884425FE7E56.jpeg
 
  • Like
  • Wow
  • Fire
Reactions: 22 users

Labsy

Regular
Just picked up some more...couldn't resist ..
🎉💪🏆
 
  • Like
  • Love
  • Fire
Reactions: 19 users

Kachoo

Regular
Is inflation in California by any chance not slowing down after all, but on the contrary getting out of hand?! 😉

While it is good to see that the Akida Edge AI Boxes are no longer marked as “sold out” on https://shop.brainchipinc.com/products/akida™-edge-ai-box, those who’ve been toying with the idea of ordering one (and are not deterred by the shipping estimate of 10-12 weeks) will be shocked to see the price has all of a sudden increased by approximately 87 %, without any explanation! 😳


Screenshot taken on July 21:

View attachment 68925




Screenshot taken on September 3:

View attachment 68926
Well, if the edge box had issues selling, you would see a price drop. Quite interesting, thanks.
 
  • Like
  • Love
  • Fire
Reactions: 21 users

Kachoo

Regular
Just picked up some more...couldn't resist ..
🎉💪🏆
Buy a few edge boxes, that will help your current holding, Mate.

On a serious note, it's quite interesting.

We have finally been given guidance on some growth numbers; be it what it is, it's progress.

Edge boxes are for sale again, with a price increase to 1,495 USD.

Clearly not indicative of what the company's SP reflects, in a way; just my opinion.

So from what I see, it looks like Akida 1000 is and will still be produced, whereas Akida 2.0 seems to be taboo, IP only, for some reason, likely the higher cost to make and better performance. Maybe there are other reasons one can speculate about, such as a non-compete clause for a buyer, as was mentioned.
 
  • Like
  • Thinking
Reactions: 7 users

Quercuskid

Regular
  • Like
  • Fire
  • Love
Reactions: 9 users

Kachoo

Regular
Edge AI Box Computer Market: Growth Analysis and Anticipated Developments by 2032


For those who want to compare, this article shows who is making boxes.
 
  • Fire
  • Like
Reactions: 2 users

Labsy

Regular
Forget the edge boxes guys, we will start to get traction in the next 3 quarters from auto, industrial and the European space industry!!! 🚀 🌌..... Yeah! ....
Caution, delusional up-ramper... who is creeping into top holder territory on pure faith and speculation....
God save us.
 
  • Like
  • Haha
  • Fire
Reactions: 11 users

Slade

Top 20
It’s nice seeing the Akida Edge Box on VVDN’s website along with all of the applications it targets. The price hike is a good sign in my opinion.

 
  • Like
  • Fire
  • Love
Reactions: 18 users

Kachoo

Regular
Forget the edge boxes guys, we will start to get traction in the next 3 quarters from auto, industrial and the European space industry!!! 🚀 🌌..... Yeah! ....
Caution, delusional up-ramper... who is creeping into top holder territory on pure faith and speculation....
God save us.
I think all revenue streams will help. The auto revenue, I think, is only from Valeo and it's software, as Dio mentioned, so it's not huge. Though I believe that is part of the forecasted revenue they mentioned.

I leave MB out as I'm sceptical that the partnership's commercial side is ready yet; just my opinion.

I do not see them including this edge box in the 820k revenue forecast, as the buyers are not known yet IMO, but to up the price, there clearly was demand for the first run.
 
  • Like
  • Love
Reactions: 8 users

keyeat

Regular
Well, if the edge box had issues selling, you would see a price drop. Quite interesting, thanks.
or they got the price wrong to begin with

Making It Up Music Video GIF by Tate McRae
 
  • Love
  • Like
Reactions: 2 users
Is inflation in California by any chance not slowing down after all, but on the contrary getting out of hand?! 😉

While it is good to see that the Akida Edge AI Boxes are no longer marked as “sold out” on https://shop.brainchipinc.com/products/akida™-edge-ai-box, those who’ve been toying with the idea of ordering one (and are not deterred by the shipping estimate of 10-12 weeks) will be shocked to see the price has all of a sudden increased by approximately 87 %, without any explanation! 😳


Screenshot taken on July 21:

View attachment 68925




Screenshot taken on September 3:

View attachment 68926
This is only a guess, so take it with a grain of salt. Those who placed pre-orders were given a lower price to gauge support for the box and see if interest was strong enough to do a production run. The support was strong enough to complete a production run with a large enough surplus. Anyone wanting in now pays this price. Just a guess. Sounds promising to me.

SC
 
  • Like
  • Fire
  • Love
Reactions: 15 users

Kachoo

Regular
This is only a guess, so take it with a grain of salt. Those who placed pre-orders were given a lower price to gauge support for the box and see if interest was strong enough to do a production run. The support was strong enough to complete a production run with a large enough surplus. Anyone wanting in now pays this price. Just a guess. Sounds promising to me.

SC
From the other site it looks like 700 boxes are available.

It will be interesting to see, but it sure looks like they are being cautious not to overproduce.

A price increase is always positive.
 
  • Like
Reactions: 8 users
Forget the edge boxes guys, we will start to get traction in the next 3 quarters from auto, industrial and the European space industry!!! 🚀 🌌..... Yeah! ....
Caution, delusional up-ramper... who is creeping into top holder territory on pure faith and speculation....
God save us.
What’s the final amount you are going for, Labsy?
 

KKFoo

Regular
  • Like
  • Fire
Reactions: 3 users

Cardpro

Regular
.......sigh...
 
  • Like
  • Haha
Reactions: 3 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Isn't this what Tony Lewis is focussed on? I remember he posted a reply on his LinkedIn saying something like we are the first he's aware of to be able to run SOTA SLMs on edge devices, or something like that.

Does anyone have his LinkedIn post handy?

IBM & NTT Explain How AI Works on Edge Computing​


by Ray Fernandez, Senior Technology Journalist
Fact Checked by Eddie Wrenn
Updated on 3 September 2024

AI models with more than a trillion parameters are already banging hard at our front door. These new artificial intelligence models, with unprecedented computing power, will become the norm in the coming months and years.
While the technological advancements of new generative AI models are promising and expected to benefit most sectors and industries, the world still has a big AI-size-edge-infrastructure problem (we will break this down in a moment).
How will giant AI models operate on the edge of networks while offering low latency and real-time services? It’s a ‘shrinking giant’ problem.
In this report, Techopedia talks with NTT and IBM experts to understand an approach to solving fast, resource-intensive artificial intelligence without over-burdening a network.

Key Takeaways​

  • Large AI models are often too resource-intensive for edge devices.
  • Smaller, more efficient AI models offer a practical solution for deploying AI at the edge.
  • Industries such as manufacturing need AI solutions that can operate effectively in distributed environments.
  • Successful edge AI deployment requires collaboration between silos of information, and it is beginning to happen today.

Is the Solution an Edge Infrastructure Combined with Smaller AI Models?​

Compute power is rapidly moving from the data center to the edge. Organizations expect edge computing to significantly impact all aspects of operations. Meanwhile, worldwide spending on edge computing is expected to be $232 billion in 2024, an increase of 15.4% over 2023 — largely driven by AI.
Techopedia spoke to Paul Bloudoff, Senior Director, Edge Services, NTT DATA, which focuses on edge AI — particularly smaller, more efficient AI models. These models are use-case specific and sized to be simple to deploy and run.

“When we look at bringing AI to the edge, we need to understand the business use case and what organizations are trying to accomplish. Not every deployment will need monolithic AI models to support real-life use cases.”
Bloudoff explained how IT and Operational Technology (OT) teams can benefit from small AI models running on lightweight edge.
For example, in factory floors, these AI solutions can enhance maintenance by breaking down siloes and bringing together a suite of information provided by the Internet of Things (IoT) to, for instance, track data on vibration, temperature, machine wear, and more.
Industrial predictive maintenance AI can save organizations thousands of dollars by reducing downtime.
“To accomplish this (use case), organizations will not need a million-dollar AI platform to process this kind of information.
“For many organizations, the resource investment to bring monolithic AI solutions to the edge is too complex and costly.”
Shrinking AI Giants into TinyAI to Drive Sustainable Development
Scientific researchers already advocate for TinyAI models as the solution to the complex transition of data center AI models into edge computing.
Scientists argue that TinyAI models can be deployed in healthcare, agriculture, and urban development and contribute to the development of the United Nations Sustainable Development Goals. Tiny models are designed for specific use cases and, therefore, are more cost-efficient, and consume less power, driving sustainability targets.
Nick Fuller, Vice President of AI and Automation at IBM Research, told Techopedia that for edge workloads requiring low latency, inferencing on devices is more than highly desirable; it is essential.
“To this end, serving ‘small’ foundation models on such devices (robotic arms, mobile devices, cameras, etc.) facilitates on-device inferencing within the budget (memory, compute, accelerator) constraints of such devices.”
“Models, of course, can still be trained on-premise or on the cloud. To this end, lightweight AI models are very appealing to the edge market and especially to specific workloads where latency requirements are essential.”

Speaking the Many Languages of Edge Computing​

Many developers fear that edge infrastructure is constrained in terms of processing power, data storage, and memory. Techopedia asked Bloudoff from NTT how the company approached this problem.
Bloudoff explained that one of the challenges organizations face when deploying solutions at the edge is the silos between machines and devices across their network — machines and devices from different manufacturers do not always communicate well with each other.
“Think of it as different individuals speaking different languages who are all providing data that needs to be collected, analyzed, processed, and transformed into action,” Bloudoff said.
“What’s powerful about an Edge AI platform is that its software layer automatically discovers devices and machines across an IT or OT environment through its smaller, more efficient language learning model,” Bloudoff added.
NTT’s Edge AI platform runs auto-discovery and unifies and processes the data to provide a comprehensive diagnostic report that can be used for AI-powered solutions.
“Once you break down silos and connect with stakeholders across different teams, you will have a better understanding of your processing power and memory needs.
“Bigger isn’t always better.”
Speaking to Techopedia, IBM Fellow Catherine Crawford added that the challenge is not just in how to move AI to the edge.
“There are multiple existing use cases that can leverage edge using smaller AI models where the technical challenges still exist and assets can be developed for distributed, secure Edge to Cloud continuous development AIOps.”
Crawford explained that there is ongoing interest and research in understanding how task-tuned, smaller genAI and foundational models can be developed and leveraged for edge use cases considering constraints like compute, memory, storage, and power (battery life).
“For instance, perhaps having multiple, tuned smaller models on-board devices with hierarchical inferencing algorithms will be sufficient for edge use cases.
“AI algorithms and corresponding models will continue to evolve and, with the focus in our industry on sustainability, these models will quickly evolve to take less resources, becoming more appropriate for edge systems.”

An AI Built for Your Edge​

The 2023 Edge Advantage Report — which surveyed 600 enterprises across multiple industries — found that approximately 70% of organizations use edge solutions to solve business challenges. Still, nearly 40% worry that their current infrastructure will not support advanced solutions.
Bloudoff from NTT said that the company is well aware of the market's concern and focuses on smaller, more efficient language models. These are easier to deploy and can run in real time without demanding advanced edge hardware updates.
Smaller AI models represent enormous savings in edge hardware costs for industries and businesses across the world. By going tiny, AI on the edge can maximize the edge computing power already in play through optimization.
The company is also moving forward by implementing dozens of proofs of concepts with customers across manufacturing, automotive, healthcare, power generation, logistics, and other industries.

The Bottom Line​

Supercomputers, quantum computers, and state-of-the-art AI racks hosted in massive data centers are undoubtedly the tip of the spear of modern innovation. However, the real world works at the edge level.
From the smartphone in your pocket to the automation of machines in industrial environments, healthcare, and agriculture, modern edge networks connect us all.
As bigger, faster, harder, and stronger AI models roll out, NTT and IBM are investing in small and tiny models. They believe these are the solution to the future of giant AIs in our edge world.

 
  • Like
  • Love
  • Fire
Reactions: 28 users

Gazzafish

Regular
Sorry if I missed something but people are referring to forecast revenue. Can someone please show me the link to where we have this forecast? Thanks in advance 👍
 
  • Like
Reactions: 3 users