BRN Discussion Ongoing

IloveLamp

Top 20
Wouldn't you just share it? That's what forums are for.

What if every shareholder emailed TD with the same question? BrainChip would have to hire a team of investor relations personnel, costing all of us shareholders more money.

I don't understand your thought process 🤔
That guy needs something to do! He is not worth anywhere near the 250k he is getting paid, IMO, at least from a shareholder's POV. Don't get help for him; get a replacement who is actually interested in answering shareholders' very valid questions... sorry, TRIGGERED.

Carry on.
 
  • Like
  • Love
  • Fire
Reactions: 18 users

TECH

Regular
  • Like
Reactions: 5 users

Boab

I wish I could paint like Vincent
Tiny volume but someone was pretty keen.
1729721719323.png
 
  • Like
  • Fire
  • Thinking
Reactions: 7 users

Baisyet

Regular
  • Like
Reactions: 3 users

TheDrooben

Pretty Pretty Pretty Pretty Good
Not sure if already posted.......

 
  • Like
  • Fire
Reactions: 7 users

Bravo

If ARM was an arm, BRN would be its biceps💪!

Hi Bravo,

As always, I was ready to apply the wet blanket, and I would have too, until Sean disclosed our algorithm product line:

US2022210586A1 AUTOMATIC TINNITUS MASKER FOR AN EAR-WEARABLE ELECTRONIC DEVICE 20220630 Starkey
View attachment 70667

[0075] … the controller comprises, or is operatively coupled to, a processor configured with instructions to classify, via a first neural network, the acoustic environment of the wearer as a specified one of a plurality of disparate acoustic environments, and process one or more of the physiologic sensor signals, the non-physiologic sensor signals, and the contextual factor data, via a second neural network, to adjust the tinnitus masking sound produced by the sound generator using the one or more of the physiologic sensor signals, the non-physiologic sensor signals, and the contextual factor data, and parameter values associated with the specified acoustic environment.

[0080] Example Ex44. The device according to Ex43, wherein the neural network comprises one or more of a deep neural network (DNN), a feedforward neural network (FNN), a recurrent neural network (RNN), a long short-term memory (LSTM), gated recurrent units (GRU), light gated recurrent units (LiGRU), a convolutional neural network (CNN), and a spiking neural network.

[0095] In accordance with any of the embodiments disclosed herein, the controller 120 can include, or be coupled to, a machine learning processor 124 configured to execute computer code or instructions (e.g., firmware, software) including one or more machine learning algorithms 126. The machine learning processor 124 is configured to process one or more of the physiologic sensor signals, non-physiologic sensor signals, microphone signals, and contextual factor data via one or more machine learning algorithms 126 to detect one or more of presence, absence, and severity of tinnitus of the wearer of the hearing device 100. Sensor, contextual factor data, and/or wearer input (e.g., manual overrides) received by the machine learning processor 124 are used to inform and refine one or more machine learning algorithms 126 executable by the machine learning processor 124 to automatically enhance and customize tinnitus detection and mitigation implemented by the hearing device 100 for a particular hearing device wearer.
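If it helps to picture how the [0075] pipeline hangs together, below is a minimal sketch of the two-network arrangement it describes: a first network classifies the acoustic environment, and a second network adjusts the masker from the sensor signals plus the parameter values for that environment. Every name, shape, and the toy linear "networks" here are my own illustrative assumptions, not Starkey's (or BrainChip's) code.

```python
import numpy as np

# Minimal sketch of the two-network pipeline described in [0075].
# All names, shapes, and the stand-in linear "networks" are
# illustrative assumptions, not Starkey's implementation.

rng = np.random.default_rng(0)

ENVIRONMENTS = ["quiet", "speech", "traffic"]         # hypothetical environment classes
W_cls = rng.normal(size=(len(ENVIRONMENTS), 12))      # stand-in for the first network
W_adj = rng.normal(size=(4, 12 + 3))                  # stand-in for the second network
ENV_PARAMS = rng.normal(size=(len(ENVIRONMENTS), 3))  # parameter values per environment

def classify_environment(mic_features):
    """First network: label the wearer's acoustic environment."""
    return int(np.argmax(W_cls @ mic_features))

def adjust_masker(sensor_signals, env_id):
    """Second network: derive tinnitus-masker settings from the sensor
    signals plus the parameter values of the classified environment."""
    x = np.concatenate([sensor_signals, ENV_PARAMS[env_id]])
    return W_adj @ x                                  # e.g. masker gain/spectrum controls

mic = rng.normal(size=12)       # microphone-derived features
sensors = rng.normal(size=12)   # physiologic + non-physiologic + contextual signals
env = classify_environment(mic)
print(ENVIRONMENTS[env], adjust_masker(sensors, env))
```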

This is looking very promising indeed!



Starkey CEO Brandon Sawalich Talks New Edge AI Hearing Aid In New Interview

Steven Aquino

Oct 23, 2024, 02:34pm EDT





The hearing aid market truly is having a banger of a last few weeks.

Earlier this month, hearing aid maker Starkey announced its all-new Edge AI hearing aid. On its website, the company describes it as using “cutting-edge technology [that] mimics the brain’s auditory cortex” in an effort to repair the so-called “broken process” that occurs in the brain’s auditory cortex when someone has hearing loss. According to Starkey, artificial intelligence helps classify complex soundscapes, enhance speech, and reduce noise—all things done in real time. Like the other hearing aids in its fleet, Starkey’s Edge AI hearing aid integrates with the company’s My Starkey companion app on iOS and Android, as well as the newly released version on watchOS for Apple Watch.

In a recent interview with me conducted over email, Starkey president and CEO Brandon Sawalich explained the company is steadfastly committed to “innovating our technology as quickly as science will allow” with the overarching goal of “always [pushing] the edge of what’s possible.” The Edge AI, he told me, is the next generation of “intelligent hearing technology and far surpasses anything else on the market today.” The company’s work in the AI area dates back to 2018, a time when AI wasn’t nearly as en vogue and top of mind as it is today. Since those headier days, Sawalich boasted Starkey has cemented itself as a leader in the industry when it comes to meshing hearing aids with AI.

“Whether someone has a mild loss or a severe one, they are tech-savvy or prefer a hands-off experience, Edge AI provides better hearing for all—with a long list of hearing health features to help anyone live a better, healthier and more full life,” Sawalich said. “With Edge AI, we want people to be the best they can be each and every day.”

In technical terms, Sawalich said Edge AI builds on the success of Starkey’s Genesis AI technology. (I interviewed him and chief hearing officer Dave Fabry about Genesis AI earlier this year.) Sawalich said Starkey’s new Neuro Sound Technology 2.0 includes what he called the Deep Neural Network Enhanced Sound Manager. The company’s AI technology, he added, is “always on” and “30% more accurate” compared to previous versions of the technology at detecting speech. Moreover, he said Starkey has enhanced its Edge Mode+ feature such that it uses the aforementioned neural network to “provide an added boost in any listening situation.” The augmentation prioritizes clearer speech or listening comfort, whichever the wearer prefers. Edge Mode+ automatically scans for, and adapts to, changes in the user’s environment, Sawalich said to me.

Additionally, Sawalich said Starkey also gave a boost to its onboard digital assistant. The feature, he said, enables people to use their voice to interact with the assistant through the hearing aid itself. The software can answer questions on topics such as the day’s weather and more.

All told, Sawalich told me Edge AI has Starkey’s “most advanced processor.” Furthermore, due to the company’s continued development on its dedicated NPU, or neural processing unit, the company can focus on delivering better performance without commensurately compromising on battery life. Starkey, Sawalich said, is really proud to boast an “industry-leading” number of more than 51 hours of battery life.

Sawalich talked about the new Apple Watch app, saying there’s value in having the ability to control things like the hearing aid’s volume right from one’s wrist. Similarly, he said Edge AI users can do computer-y tasks like take calls, stream music and podcasts, and even enjoy real-time translation of 78 languages alongside the My Starkey app.

When asked about feedback, Sawalich said Starkey tapped 560 patients to test Edge AI and give feedback, all in the name of “[confirming] we were delivering the very best hearing care before even one hearing aid went out the door.” People have been “blown away” by the difference Edge AI has made, with Sawalich saying sounds are clearer and crisper. The technology, he added, “allows them to hear more of their surroundings so it’s a full sound experience.” What’s more, Sawalich said some testers have reported being able to hear things they heretofore couldn’t. Plus, the battery life—everybody loves better battery life—and waterproofing are appreciated for everyday usage in different places.
Sawalich is exceedingly proud of his team’s work. He said Edge AI features what he described as “the world’s first and only use of sensors onboard the hearing aid to perform an accurate balance assessment.” The functionality, on which Starkey collaborated with Stanford University to assess the algorithms’ accuracy, is a great addition because, as Sawalich told me, it helps identify and manage balance before falls become recurring for people who are prone to them. Starkey and Stanford were able to conclude through their research that the algorithms are “comparable to that of a trained clinician,” Sawalich said. He went on to say Starkey was the first hearing aid manufacturer to add 3D sensors for health and wellness tracking. That work has been ongoing, as Sawalich told me there are many different types of activities in the tracker system.
“We lead the industry in incorporating overall health and wellness into the hearing aid,” Sawalich said.
 
  • Like
  • Fire
  • Wow
Reactions: 33 users

7für7

Top 20
Holy…. What’s going on here 😂

IMG_7060.jpeg
IMG_7061.jpeg
IMG_7062.jpeg
IMG_7064.jpeg
IMG_7065.jpeg
 

IMG_7063.jpeg
  • Wow
  • Like
  • Haha
Reactions: 16 users

Bravo

If ARM was an arm, BRN would be its biceps💪!

Ericsson and e& team to develop AI-powered autonomous networks

John Tanner | 22 October 2024



Ericsson says it has signed a Memorandum of Understanding (MoU) with UAE-based operator e& International to explore the potential for developing autonomous networks powered by AI over the next three years.
Under the terms of the MoU revealed on Friday, Ericsson and e& International will engage in joint research and experimentation to share knowledge and better understand the capabilities and potential of AI-driven autonomous networks.
Both companies also aim to develop a roadmap to guide the development and evolution of autonomous networks over the next several years. Among other things, Ericsson and e& plan to research how AI can optimize network performance and the customer experience.
The joint research will also explore how AI can contribute to the environmental sustainability of the network via better energy efficiency and reduced carbon emissions. Sabri Yehya, CTO at e& International, said this research will be applied to the company's goal of achieving net-zero carbon emissions across its operations in all markets by 2040, and by 2030 in the UAE.
“Our memorandum of understanding with Ericsson marks a critical step in our journey toward autonomous networks, which will not only transform the way we operate but also enable us to meet our ambitious net-zero targets,” he said in a statement.
The MoU was signed during last week's GITEX Global 2024 event in Dubai, where Ericsson and e& also announced they had successfully implemented a time-critical communications technology – low latency, low loss, scalable throughput (L4S) – in a 5G commercial network that delivers consistent low-latency connectivity and a lag-free experience for live online cloud gamers.
 
  • Like
  • Fire
  • Love
Reactions: 8 users

Bravo

If ARM was an arm, BRN would be its biceps💪!


What point is David Wyatt actually trying to make?

Is he suggesting that BrainChip's technology shouldn't be classified as a spiking neural network (SNN) because, according to him, it's not spiking in nature?

Is he accusing BrainChip of false advertising?

Is he trying to tarnish BrainChip's reputation by stating that "the lack of details and open technical information should be a red flag"?

Honestly, how can this type of denigrating communication on the world's largest professional network be considered appropriate?
 
  • Like
  • Love
  • Fire
Reactions: 28 users

7für7

Top 20
  • Like
  • Fire
Reactions: 7 users

7für7

Top 20
What point is David Wyatt actually trying to make?

Is he suggesting that BrainChip's technology shouldn't be classified as a spiking neural network (SNN) because, according to him, it's not spiking in nature?

Is he accusing BrainChip of false advertising?

Is he trying to tarnish BrainChip's reputation by stating that "the lack of details and open technical information should be a red flag"?

Honestly, how can this type of denigrating communication on the world's largest professional network be considered appropriate?
The point is that he might not be up to date on the latest developments, regarding either BrainChip or Intel, because he no longer works there. I can easily imagine that internally at Intel he was already acting like a know-it-all. He’s just an Intel Loihi fanboy who wants to talk down BrainChip. Additionally, maybe he’s a shorter? Who knows. His profile says “investor”; maybe he is investing in another company and trying to down-ramp openly? DH
 
  • Like
  • Fire
Reactions: 10 users

Guzzi62

Regular
What point is David Wyatt actually trying to make?

Is he suggesting that BrainChip's technology shouldn't be classified as a spiking neural network (SNN) because, according to him, it's not spiking in nature?

Is he accusing BrainChip of false advertising?

Is he trying to tarnish BrainChip's reputation by stating that "the lack of details and open technical information should be a red flag"?

Honestly, how can this type of denigrating communication on the world's largest professional network be considered appropriate?
He is basically calling it a scam, no more spiking than his hair after a rough pillow.

That's an outright lie, and he should be forced to apologize or be kicked off LinkedIn.
 
  • Like
  • Fire
  • Sad
Reactions: 14 users

7für7

Top 20
He is basically calling it a scam, no more spiking than his hair after a rough pillow.

That's an outright lie, and he should be forced to apologize or be kicked off LinkedIn.
And the reactions to his posts, compared to Lewis's comment, show the quality of his behaviour.
 
  • Like
Reactions: 5 users
Sean needs to address his shareholders, as this is just about enough for everyone.
It’s been months since the AGM.
It was meant to happen before year's end, yet next week is November and nothing.
We deserve to be updated, Sean.

Is it just me, or do others feel this way?

I am just feeling disappointed atm.
Hopefully this will pass; I am usually more positive.
 
  • Like
  • Love
  • Fire
Reactions: 32 users
  • Like
Reactions: 2 users
Sean needs to address his shareholders, as this is just about enough for everyone.
It’s been months since the AGM.
It was meant to happen before year's end, yet next week is November and nothing.
We deserve to be updated, Sean.

Is it just me, or do others feel this way?

I am just feeling disappointed atm.
Hopefully this will pass; I am usually more positive.
1729729713497.gif
 
  • Haha
  • Like
Reactions: 7 users

AARONASX

Holding onto what I've got
Sean needs to address his shareholders, as this is just about enough for everyone.
It’s been months since the AGM.
It was meant to happen before year's end, yet next week is November and nothing.
We deserve to be updated, Sean.

Is it just me, or do others feel this way?

I am just feeling disappointed atm.
Hopefully this will pass; I am usually more positive.
All I want is a statement from Sean, or anyone, that says something like this:

"BrainChip is making significant progress. We are approaching the end of the initial phase, and there is a strong likelihood of an important announcement in the coming weeks or months regarding potential future opportunities or partnerships, either positive or negative, that we can confidently announce to the market."

Plain and simple; it doesn't inflate or promote any profit, IMO.
 
  • Like
  • Fire
  • Love
Reactions: 24 users

Diogenese

Top 20
What point is David Wyatt actually trying to make?

Is he suggesting that BrainChip's technology shouldn't be classified as a spiking neural network (SNN) because, according to him, it's not spiking in nature?

Is he accusing BrainChip of false advertising?

Is he trying to tarnish BrainChip's reputation by stating that "the lack of details and open technical information should be a red flag"?

Honestly, how can this type of denigrating communication on the world's largest professional network be considered appropriate?
I don't think Dave knows much about Akida. In fact, he admits as much in complaining about the available information. The BRN web page has lots of information. His knowledge of Akida seems to be derived from a single presentation at a tech expo.

I think he has confused TENNs with Akida. TENNs does use MACs. Maybe he just woke up for that slide?

Tony Lewis, in his response, stresses that Akida uses "events", so I guess this is to differentiate the new 4-bit Akida from the original 1-bit Akida "spikes".
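To make the "spikes" vs "events" distinction concrete, here is a toy sketch (my own illustration, not BrainChip's implementation): in an event-driven layer, a 1-bit spike just adds the weight, a multi-bit event scales it, and zero activations generate no events and therefore no work, which is where the sparsity savings come from.

```python
import numpy as np

# Toy illustration of 1-bit "spikes" vs 4-bit "events" in an event-driven
# layer. My own sketch, not BrainChip's actual implementation.

rng = np.random.default_rng(0)
weights = rng.integers(-8, 8, size=(16, 8))   # 16 inputs feeding 8 neurons

activations = rng.integers(0, 16, size=16)    # 4-bit activations, values 0..15
activations[rng.random(16) < 0.7] = 0         # sparsity: most inputs stay silent

accum = np.zeros(8, dtype=np.int64)
for i, a in enumerate(activations):
    if a == 0:
        continue                              # no event generated, no work done
    elif a == 1:
        accum += weights[i]                   # 1-bit spike: pure accumulate, no multiply
    else:
        accum += a * weights[i]               # 4-bit event: weight scaled by event value

print(accum)
```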
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 26 users