BRN Discussion Ongoing

cosors

👀
Honestly, I’m losing patience. I’ve been an investor for close to 5 years and nothing has changed. My main concern is the punctuation infelicities on this forum. So, for the first and final time, here’s how to use an apostrophe. They are only used to indicate the possessive case, as in Sean’s pony tail; contractions, as in BRN can’t make sales; and omitted letters, as in it’s only a matter of time. Thank you!
Then sell because of the spelling mistakes. That's your freedom.
...that you were able to endure it at all.

____
Be glad that this is not about the German language. Are you serious?
 
  • Haha
  • Fire
  • Like
Reactions: 17 users

manny100

Regular
BRN has all AGM bases covered.
The funds, including super funds holding retail accounts with BRN shares, will be canvassed. They will vote YES to everything except NO to a BOD spill. Together with large and other positive retail holders, there is no way a 2nd strike will lead to a spill.
 
  • Like
  • Thinking
  • Love
Reactions: 9 users

Beebo

Regular
BRN has all AGM bases covered.
The funds, including super funds holding retail accounts with BRN shares, will be canvassed. They will vote YES to everything except NO to a BOD spill. Together with large and other positive retail holders, there is no way a 2nd strike will lead to a spill.
Once Sean successfully navigates through the dark cloud of a 2nd strike, there is blue sky. Fingers crossed!

To me the glass is clearly half-full. One IP license for Akida 2.0 and the dominoes will fall.

PS. How was my punctuation? English is my 2nd language. German is my 57th.
 
  • Like
  • Haha
  • Love
Reactions: 35 users

Fenris78

Regular
 
  • Like
  • Love
  • Fire
Reactions: 54 users

BrainShit

Regular
  • Like
  • Fire
  • Love
Reactions: 63 users

toasty

Regular
If the board spills Sean keeps his role as CEO.

It is the non-executives that lose their seat and have to renominate and win their seat back.

Sean is safe no matter the outcome.
Not unless he is the MD. If there is a spill, ALL directors are terminated except the MD. As I've said before, he will likely keep his role as CEO in the meantime but will lose his board seat - unless he's been appointed MD... which appears to be the case.
 
  • Like
Reactions: 1 users

Tuliptrader

Regular
Courtesy of Rayz on HC

Interesting interview with our partner emotion3D. He seems positive.



TT
 
  • Like
  • Love
  • Fire
Reactions: 20 users

Kachoo

Regular
  • Like
  • Wow
  • Thinking
Reactions: 5 users
Not unless he is the MD. If there is a spill, ALL directors are terminated except the MD. As I've said before, he will likely keep his role as CEO in the meantime but will lose his board seat - unless he's been appointed MD... which appears to be the case.
Page 29 of the AGM notice as posted previously.

"Each of these Directors would be eligible to stand for re-election at the Spill Meeting, however there is no guarantee that they would do so. As Mr Sean Hehir is an Executive Director of the Company, he is excluded from the requirements under the Corporations Act to seek re-election at the Spill Meeting (if held) and will continue to hold office regardless of the outcome of this Resolution or the Spill Meeting (if held)."
 
  • Like
  • Fire
  • Love
Reactions: 12 users

Quiltman

Regular
Courtesy of Rayz on HC

Interesting interview with our partner emotion3D. He seems positive.



TT


Great Interview.

Just a reminder to everyone on our relationship with emotion3D
This announcement was over 1 year ago ... I just don't think people understand the extended timelines for implementation on these projects.
Being a listed company at this point of the business development cycle is just hard yards ... I wouldn't want to be in Sean's boots.

Onwards we go ...


 
  • Like
  • Love
  • Fire
Reactions: 32 users

Labsy

Regular
As a small but committed shareholder I find the ongoing debate around BRN’s ability to make an impact in this huge market a little tiring.

To those who seem focused on complaining about the management, the lack of deals, etc., I can certainly sympathise – all stockholders want to see BRN succeed and the SP surge. However, this is a market segment that is still in its infancy, and as such time is the key consideration for success.

The fundamentals of the company remain the same and the ever-increasing awareness of BRN and its technology as a game changer demonstrates that the company is bound to succeed in my opinion.

Patience is sometimes the only, albeit the most annoying, option for eventual success. To those who are convinced that the Company is in a crisis, can I suggest you sell your stock and seek another opportunity that aligns with your expectations – it may be out there.
Very well said...
 
  • Like
  • Fire
Reactions: 14 users

7für7

Top 20

Weird 32% jump after hours on Seeking Alpha, but the volume... can anyone confirm this on another site for after hours?
But the OTC market is not connected to the ASX. So even if it rises 500%, it has no impact on our SP.
Americans are crazy, man… they sometimes react dramatically for no reason. Maybe a trap or something.
 
  • Haha
Reactions: 1 users

Labsy

Regular
The board has my full support. I hope my holding is enough to neutralise the negative impact some here are trying to have on our little ripper...
I will remain in the trenches and battle this out.
Each to their own but I have strong faith that this will be a great year if we can pull ourselves together and hold strong...
I will vote in the affirmative on remuneration, and want to retain our talent and keep them happy.
Very sad to see Nandan and Rob go. All the best to them and appreciate their efforts.
 
  • Like
  • Love
  • Fire
Reactions: 40 users

jla

Regular
But the OTC market is not connected to the ASX. So even if it rises 500%, it has no impact on our SP.
Americans are crazy, man… they sometimes react dramatically for no reason. Maybe a trap or something.
Are you sure it would have no impact?
 

RobjHunt

Regular
Sheeeee’s Friday. My bones are telling me it will be a GREEEEEN DAY 😉
 
  • Like
  • Haha
Reactions: 8 users

AARONASX

Holding onto what I've got
While Sean, as the CEO of BrainChip, is new to the gig... acquiring a more seasoned CEO would carry a significant financial cost, and it remains uncertain how many avenues Sean has paved and how long we must wait before revenue meets the expectations of our shareholders.

In the time Sean has been in the role, BrainChip has streamlined its website, which is now much more professional and mature.

Despite the challenges highlighted in the recent quarterly report, including personal disappointment, we remain unfairly fixated on Sean's prior statement "watch me on the financials," thereby amplifying any perceived shortfalls.

While our current trajectory may appear unclear, the trajectory of AI technology is steadfast. BrainChip possesses a technology lead that positions us favorably amidst potential competition.

The existence of many, many non-disclosure agreements (NDAs) with various companies for extended periods underscores the formidable barrier to entry in replicating our advancements.

Even if competitors were to announce similar AI chip products for the consumer market today, the reality of research and development cycles dictates a significant lag before they could pose substantial competition – time we have and they don't. IMO
 
  • Like
  • Fire
  • Love
Reactions: 20 users

7für7

Top 20
  • Like
Reactions: 1 users
Honestly, I’m losing patience. I’ve been an investor for close to 5 years and nothing has changed. My main concern is the punctuation infelicities on this forum.
 
  • Haha
  • Like
Reactions: 7 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
I believe AKIDA 2.0 has been scaled up to 131 TOPS.

At The Heart Of The AI PC Battle Lies The NPU​

Anshel Sag
Contributor

Apr 29, 2024, 09:21pm EDT
[Image: green microchip set in a blue printed circuit board. Caption: NPUs will be a key battleground for AI chip vendors.]
There is a clear battle underway among the major players in the PC market about the definition of what makes an AI PC. It’s a battle that extends to how Microsoft and other OEMs interpret that definition as well. The reality is that an AI PC needs to be able to run AI workloads locally, whether that’s using a CPU, GPU or neural processing unit. Microsoft has already introduced the Copilot key as part of its plans to combine GPUs, CPUs and NPUs with cloud-based functionality to enable Windows AI experiences.

The bigger reality is that AI developers and the PC industry at large cannot afford to run AI in the cloud in perpetuity. More to the point, local AI compute is necessary for sustainable growth. And while not all workloads are the same, the NPU has become a new and popular destination for many next-generation AI workloads.

What Is An NPU?​

At its core, an NPU is a specialized accelerator for AI workloads. This means it is fundamentally different from a CPU or a GPU because it does not run the operating system or process graphics, but it can easily assist in doing both when those workloads are accelerated using neural networks. Neural networks are heavily dependent on matrix multiplication tasks, which means that most NPUs are designed to do matrix multiplication at extremely low power in a massively parallel way.
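As a rough, illustrative sketch of why matrix multiplication dominates (the layer sizes below are made-up example values, not figures from the article), a single fully connected neural-network layer is essentially one big matrix product:

```python
import numpy as np

# Hypothetical layer sizes, chosen only for illustration.
batch, in_features, out_features = 32, 1024, 4096

x = np.random.rand(batch, in_features).astype(np.float32)          # activations
W = np.random.rand(in_features, out_features).astype(np.float32)   # weights
b = np.zeros(out_features, dtype=np.float32)                       # bias

# One dense layer: nearly all of the arithmetic is this matrix multiply,
# which is the operation NPUs are built to run massively in parallel at low power.
y = x @ W + b

# Multiply-accumulate count for a single forward pass of this one layer.
macs = batch * in_features * out_features
print(f"{macs:,} multiply-accumulates")   # 134,217,728 for these sizes
```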


GPUs can do the same, which is one reason they are very popular for neural network tasks in the cloud today. However, GPUs can be very power-hungry in accomplishing this task, whereas NPUs have proven themselves to be much more power-efficient. In short, NPUs can perform selected AI tasks quickly, efficiently and for more sustained workloads.

The NPU’s Evolution​

Some of the earliest efforts in building NPUs came from the world of neuromorphic computing, where many different companies tried to build processors based on the architecture of the human brain and nervous system. However, most of those efforts never panned out, and many were pruned out of existence. Other efforts were born out of the evolution of digital signal processors, which were originally created to convert analog signals such as sound into digital signals. Companies including Xilinx (now part of AMD) and Qualcomm both took this approach, repurposing some or all of their DSPs into AI engines. Ironically, Qualcomm already had an NPU in 2013 called the Zeroth, which was about a decade too early. I wrote about its transition from dedicated hardware to software in 2016.

One of the advantages of DSPs is that they have traditionally been highly programmable while also having very low power consumption. Combining these two benefits with matrix multiplication has led companies to the NPU in many cases. I learned about DSPs in my early days with an electronic prototype design firm that worked a lot with TI’s DSPs in the mid-2000s. In the past, Xilinx called its AI accelerator a DPU, while Intel called it a vision processing unit as a legacy from its acquisition of low-power AI accelerator maker Movidius. All of these have something in common, in that they all come from a processor designed to analyze analog signals (e.g., sound or imagery) and process those signals quickly and at extremely low power.

Qualcomm’s NPU​

As for Qualcomm, I have personally witnessed its journey from the Hexagon DSP to the Hexagon NPU, during which the company has continually invested in incremental improvements for every generation. Now Qualcomm’s NPU is powerful enough to claim 45 TOPS of AI performance on its own. In fact, as far back as 2017, Qualcomm was talking about AI performance inside the Hexagon DSP, and about leveraging it alongside the GPU for AI workloads. While there were no performance claims for the Hexagon 682 inside the Snapdragon 835 SoC, which shipped that year, the Snapdragon 845 of 2018 included a Hexagon 685 capable of a whopping 3 TOPS thanks to a technology called HVX. By the time Qualcomm put the Hexagon 698 inside the Snapdragon 865 in 2019, the component was no longer being called a DSP; now it was a fifth-generation “AI engine,” which means that the current Snapdragon 8 Gen 3 and Snapdragon X Elite are Qualcomm’s ninth generation of AI engines.


The Rest Of The AI PC NPU Landscape​

Not all NPUs are the same. In fact, we still don’t fully understand what everyone’s NPU architectures are, nor how fast they run, which keeps us from being able to fully compare them. That said, Intel has been very open about the NPU in the Intel Core Ultra model code-named Meteor Lake. Right now, Apple’s M3 Neural Engine ships with 18 TOPS of AI performance, while Intel’s NPU has 11 and the XDNA NPU in AMD’s Ryzen 8040 (a.k.a. Hawk Point) has 16 TOPS. These numbers all seem low when you compare them to Qualcomm’s Snapdragon X Elite, which has an NPU-only TOPS of 45 and a complete system TOPS of 75. In fact, Meteor Lake’s complete system TOPS is 34, while the Ryzen 8040 is 39—both of which are lower than Qualcomm’s NPU-only performance. While I expect Intel and AMD to downplay the role of the NPU initially and Qualcomm to play it up, it does seem that the landscape may become much more interesting at the end of this year moving into early next year.

Shifting Apps From The Cloud To The NPU​

While the CPU and GPU are still extremely relevant for everyday use in PCs, the NPU has become the center of attention for many in the industry as an area for differentiation. One open question is whether the NPU is relevant enough to justify being a technology focus and, if so, how much performance is enough to deliver an adequate experience? In the bigger picture, I believe that NPUs and their TOPS performance have already become a major battlefield within the PC sector. This is especially true if you consider how many applications might target the NPU simultaneously—and possibly bog it down if there isn’t enough performance headroom.
With so much focus on the NPU inside the AI PC, it makes sense that there must be applications that take advantage of that NPU to justify its existence. Today, most AI applications live in the cloud because that’s where most AI compute resides. As more of these applications shift from the cloud to a hybrid model, there will be an increased dependency on local NPUs to offload AI functions from the cloud. Additionally, there will be applications that require higher levels of security for which IT simply won’t allow data to leave the local machine; these applications will be entirely dependent on local compute. Ironically, I believe that one of those key application areas will be security itself, given that security has traditionally been one of the biggest resource hogs for enterprise systems.
As time progresses, more LLMs and other models will be quantized in ways that will enable them to have a smaller footprint on the local device while also improving accuracy. This will enable more on-device AI that has a much better contextual understanding of the local device’s data, and that performs with lower latency. I also believe that while some AI applications will initially deploy as hybrid apps, there will still be some IT organizations that want to deploy on-device first; the earliest versions of those applications will likely not be as optimized as possible and will likely take up more compute, driving more demand for higher TOPS from AI chips.
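As a back-of-the-envelope sketch of what quantization buys on-device (the 7-billion-parameter model below is an assumed example, not a figure from the article), shrinking weights from 16-bit to 4-bit cuts the weight footprint roughly fourfold:

```python
# Rough weight-memory arithmetic for a hypothetical 7-billion-parameter model.
params = 7_000_000_000

for name, bits in [("fp16", 16), ("int8", 8), ("int4", 4)]:
    gib = params * bits / 8 / 2**30
    print(f"{name}: ~{gib:.1f} GiB of weights")

# fp16: ~13.0 GiB, int8: ~6.5 GiB, int4: ~3.3 GiB (weights only, ignoring
# activations, KV cache and runtime overhead) - which is why lower-precision
# models fit far more comfortably on a laptop and its NPU.
```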

Increasing Momentum​

However, the race for NPU dominance and relevance has only just begun. Qualcomm’s Snapdragon X Elite is expected to be the NPU TOPS leader when the company launches in the middle of this year, but the company will not be alone. AMD has already committed to delivering 40 TOPS of NPU performance in its next-generation Strix Point Ryzen processors due early next year, while at its recent Vision 2024 conference Intel claimed 100 TOPS of platform-level AI performance for the Lunar Lake chips due in Q4 of 2024. (Recall that Qualcomm’s Snapdragon X Elite claims 75 TOPS across the GPU, CPU and NPU.) While it isn’t official, there is an understanding across the PC ecosystem that Microsoft put a requirement on its silicon vendor partners to deliver at least 40 TOPS of NPU AI performance for running Copilot locally.
One item of note is that most companies are apparently not scaling their NPU performance based on product tier; rather, NPU performance is the same across all platforms. This means that developers can target a single NPU per vendor, which is good news for the developers because optimizing for an NPU is still quite an undertaking. Thankfully, there are low-level APIs such as DirectML and frameworks including ONNX that will hopefully help reduce the burden on developers so they don’t have to target every type of NPU on their own. That said, I do believe that each chip vendor will also have its own set of APIs and SDKs that can help developers take even more advantage of the performance and power savings of their NPUs.
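For a sense of what "targeting the NPU through a framework" can look like in practice, here is a minimal, hedged sketch using ONNX Runtime with the DirectML execution provider; the model path and input shape are placeholders, and whether the work actually lands on an NPU, GPU or CPU depends on the installed onnxruntime build and the hardware:

```python
import numpy as np
import onnxruntime as ort

# Prefer the DirectML-backed provider, falling back to CPU if it is unavailable.
session = ort.InferenceSession(
    "model.onnx",  # placeholder path to an exported ONNX model
    providers=["DmlExecutionProvider", "CPUExecutionProvider"],
)
print("Active providers:", session.get_providers())

# Placeholder input: the real name, shape and dtype depend on the exported model.
inp = session.get_inputs()[0]
dummy = np.zeros((1, 3, 224, 224), dtype=np.float32)
outputs = session.run(None, {inp.name: dummy})
print("Output shapes:", [o.shape for o in outputs])
```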

Wrapping Up​

The NPU is quickly becoming the new focus for an industry looking for ways to address the costs and latency that come with cloud-based AI computing. While some companies already have high-performance NPUs, there is a clear and very pressing desire for OEMs to use processors that include NPUs with at least 40 TOPS. There will be an accelerated shift towards on-device AI, which will likely start with hybrid apps and models and in time shift towards mostly on-device computing. This does mean that the NPU’s importance will be less relevant early on for some platforms, but having a less powerful NPU may also translate to not delivering the best possible AI PC experiences.
There are still a lot of unknowns about the complete AI PC vision, especially considering how many different vendors are involved, but I hear that a lot of things will get cleared up at Microsoft’s Build conference in late May. That said, I believe the battle for the AI PC will likely drag on well into 2025 as more chip vendors and OEMs adopt faster and more capable NPUs.

 
  • Like
  • Fire
  • Love
Reactions: 40 users
Rob has also been part of Wavious's staff since April 29, 2024.
  • Job title: Sales and Channel Leadership. GTM and Business Development. Global and Regional Experience

Lies, a lot of lies … by BrainChip 😂
Found the link.




Wavious seems to be a small co too, so the argument that he left for greener pastures doesn't hold water. It doesn't even have a website -- the website link on their LinkedIn page https://www.linkedin.com/company/wavious/about/ does not lead to anything.

From what I can find, it was founded in 2016 and has 11-50 employees, which means it's probably no bigger than BC. According to its entry on Crunchbase https://www.crunchbase.com/organization/wavious there is no record of any fundraising yet.

Also I wasn't aware that Jerome Nadel, who's supposed to be the CMO, is working PART TIME at another company (CMO at ProGlobalEvents) now? Yeah I know inflation's a b***h but still, if you have a job maybe give it your all eh?



I honestly think something is not right with the company.
 
  • Fire
  • Like
  • Wow
Reactions: 9 users