BRN Discussion Ongoing

Steve10

Regular
I watched this and ......

Well, .... for Brainchip stockholders such as myself, the Synaptics interview was unsettling. Starting at the 3 hour 52 minute mark, it is remarkably like what I imagine a Brainchip interview might sound like. Certainly I must be missing something in my understanding of the three-year advantage Brainchip says it has over the competition, which I imagine Synaptics would be considered part of. Synaptics sure looks like a formidable competitor.

I would love to see an interview with Anil Mankar commenting on a review of the Synaptics interview minute by minute.

Anyone else care to comment on the Synaptics segment? Regards, dippY

Synaptics is using DSP Group’s nNet Lite NN processor.

DBM10L

DSP Group’s DBM10L is an ultra-low-power, small-form-factor, cost-effective artificial intelligence (AI) and machine learning (ML) SoC based on a digital signal processor (DSP) and neural network (NN) engine, both optimized for voice and sensor processing. It is suitable for battery-operated devices such as smartphones, tablets, wearables, and hearables, including true wireless stereo (TWS) headsets, as well as smart home devices such as remote controls. The DBM10L can enable AI/ML, voice, and sensor fusion functions that include voice trigger (VT), voice authentication (VA), voice command (VC), noise reduction (NR), acoustic echo cancellation (AEC), sound event detection (SED), proximity and gesture detection, sensor data processing, and equalization.

The DBM10L’s NN engine comprises DSP Group’s nNet Lite NN processor, a standalone hardware engine that is designed to accelerate the execution of NN inferences. nNet Lite provides the DBM10L its ML capability and is optimized for maximum efficiency to ensure ultra-low power consumption for small- to medium-size NNs.

The DBM10L is supported by embedded memory, as well as serial and audio interfaces for communication with other devices in the system, such as an application processor (AP), codecs, microphones, and sensors.



DSP Group Unveils DBM10 Low-Power Edge AI/ML SoC with Dedicated Neural Network Inference Processor

January 7, 2021, 13:00
DSP Group announced the DBM10, a new low-power, cost-effective artificial intelligence (AI) and machine learning (ML) system-on-chip (SoC). This new open platform, with a cost- and power-optimized architecture, enables rapid development of AI and ML applications for mobile, wearables, hearables, and connected devices in general. It provides a complete platform for voice and audio processing without compromising the battery life of new designs, while allowing developers to implement their own differentiating algorithms.

DSP Group is a global provider of wireless and voice-processing chipset solutions with extensive experience in voice implementation and an increasing focus on advanced audio processing for personal audio with hearables (headphones, headsets, earbuds) and wearables (on-body electronics). Its new DBM10 SoC comprises a digital signal processor (DSP) and the company's nNetLite neural network (NN) processor, both optimized for low-power voice and sensor processing in battery-operated devices.

This dual-core architecture offers developers full flexibility in partitioning innovative algorithms between the DSP and the NN processor, and enables fast time to market for integration of voice and sensing algorithms such as noise reduction, AEC, wake-word detection, voice activity detection, and other ML models.

The DBM10 features an open platform approach with a comprehensive software framework. This allows developers to quickly get next-generation designs to market with their own algorithms, or with DSP Group’s comprehensive and proven suite of optimized algorithms for voice, sound event detection (SED), and sensor fusion, as required by applications ranging from true wireless stereo (TWS) headsets to smartphones, tablets, wearables, and connected devices.

"Edge applications for AI are many and diverse, but almost all require the ultimate in terms of low power, small form factor, cost effectiveness, and fast time-to-market, so we are very excited about what the DBM10 brings to current and new customers and partners," says Ofer Elyakim, CEO of DSP Group. "Our team has worked to make the absolute best use of available processing power and memory for low-power AI and ML at the edge — including developing our own patent-pending weight compression scheme —while also emphasizing ease of deployment. We look forward to seeing how creatively developers apply the DBM10 platform."

The DBM10 adds to DSP Group's SmartVoice line of SoCs and algorithms that are deployed globally in devices ranging from smartphones and laptops/PCs to set-top boxes, tablets, remote controls, and smart IoT devices for the home. In 2020, SmartVoice shipments passed the 100 million mark, and the new low-power DBM10 is already supported by an established ecosystem of third-party algorithm providers. Some of these have already begun running their NN algorithms on the nNetLite NN processor at the heart of the DBM10 to achieve maximum performance at the lowest power consumption.



Working alongside a programmable low-power DSP, the nNetLite processor supports all standard deep neural network (DNN) and ML frameworks and employs a comprehensive cross-platform toolchain for model migration and optimization. The SoC is supplied in a highly compact form factor (~4 mm²) and is specified to support ultra-low-power inference at ~500 μW (typical) for voice NN algorithms; as a reference, it can run the Hello Edge 30-word detection model at 1 MHz (of 125 MHz available). The DBM10 also allows porting of large models (tens of megabytes) without significant accuracy loss, using model optimization and compression.
www.dspg.com
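
For anyone wondering what "porting and compressing" a voice model for a microwatt-class NN engine actually involves, below is a minimal, generic sketch using TensorFlow/TFLite post-training quantization. DSP Group's nNetLite toolchain is proprietary, so the model shape, the calibration generator, and every parameter here are illustrative assumptions rather than their actual flow.

```python
# Illustrative only: a generic TensorFlow/TFLite sketch of preparing a small,
# Hello Edge-style keyword-spotting model for a tiny NN accelerator.
# The real nNetLite flow uses DSP Group's own toolchain and patent-pending
# weight compression; shapes and values below are assumptions.
import numpy as np
import tensorflow as tf

NUM_WORDS = 30            # the Hello Edge benchmark uses a ~30-word vocabulary
MFCC_SHAPE = (49, 10, 1)  # 49 frames x 10 MFCC coefficients (assumed)

def build_kws_model() -> tf.keras.Model:
    """A small DS-CNN-like model: a few thousand parameters, fits in on-chip SRAM."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=MFCC_SHAPE),
        tf.keras.layers.Conv2D(16, (3, 3), strides=2, activation="relu"),
        tf.keras.layers.DepthwiseConv2D((3, 3), activation="relu"),
        tf.keras.layers.Conv2D(32, (1, 1), activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(NUM_WORDS, activation="softmax"),
    ])

def representative_mfcc_batches():
    """Stand-in calibration data; a real flow would feed recorded MFCC frames."""
    for _ in range(100):
        yield [np.random.rand(1, *MFCC_SHAPE).astype(np.float32)]

model = build_kws_model()
# (training on a speech-commands dataset would happen here)

# Full-integer quantization: 8-bit weights and activations cut memory roughly
# 4x versus float32 -- the kind of shrinkage needed before a model fits a
# microwatt-class NPU's on-chip memory.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_mfcc_batches
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
print(f"Quantized model size: {len(tflite_model) / 1024:.1f} KiB")
```

Integer quantization alone gives roughly a 4x reduction over float32; vendor-specific schemes such as the patent-pending weight compression mentioned above are what push multi-megabyte models down to something an engine like nNetLite can hold.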



Hisense Selects Synaptics' DBM10L Processor For First AI-Enabled Always-On Voice Remote Control

AMSTERDAM, The Netherlands, Sept. 09, 2022 – Synaptics® Incorporated (Nasdaq: SYNA) today announced that Hisense, a global leader in consumer electronics and home appliances, selected the DBM10L with its dedicated neural processing unit (NPU) to implement the first artificial intelligence (AI)-enabled always-on voice (AOV) remote control unit (RCU), the EFR3B86H. Hisense paired the DBM10L-equipped RCU with its state-of-the-art 65A9H 4K OLED TV, where Synaptics' high-performance edge-AI processing and low power are vital to ensure the ultimate AOV end-user experience.

“Hisense consistently stays ahead of the curve when it comes to enabling innovative and intuitive features,” said Venkat Kodavati, SVP and Chief Product Officer at Synaptics. “With end users’ increasing reliance upon voice and voice assistants such as Alexa, we are very excited to have worked with them to bring that same experience to TV remote controls. Our collaboration on a high-performance AOV implementation creates the opportunity for remotes to now become a more integral and critical user-engagement platform for the smart home.”

While a reliable and responsive AOV experience for remote controls is increasingly desirable, it is challenging to execute in battery-driven applications. “This is particularly true in noisy environments as more noise translates to more power consumption to prevent performance degradation,” said Shay Kamin Braun, Director of Product Marketing at Synaptics.

The DBM10L enables a superior AOV user experience that combines high performance with ultra-low power consumption, allowing devices to operate for extended periods using a single pair of AAA batteries. “Along with upcoming innovations such as biometrics for voice authentication for online purchases, AOV remote controls for TVs and other consumer devices can now provide greater convenience for users and higher attachment rates for equipment and service providers,” said Kamin Braun.

The DBM10L AOV solution
To solve the power consumption challenge while delivering the best performance for AOV applications, Synaptics built an ultra-low-power voice engine around its DBM10L system-on-chip (SoC), which combines the dedicated NPU with a low-power DSP. The solution comprises the DBM10L and proven algorithms for filtering, noise suppression, beamforming, wake word detection, and voice activity detection. Optimizations allow deep neural network (DNN)-based wake-word detection and other edge AI algorithms to run on the DBM10L's NPU, targeting high performance at ultra-low power with low latency, while different voice and audio processing algorithms run optimally on the integrated low-power DSP.
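
The partitioning described above (cheap, always-running signal processing that gates a heavier DNN wake-word check) is the standard way always-on voice pipelines keep average power low. Here is a rough, generic sketch of that control flow; the frame size, threshold, buffer length, and the model stub are assumptions for illustration, not Synaptics' firmware.

```python
# Rough illustration of the DSP/NPU split described above: a cheap, always-on
# energy gate runs on every audio frame, and the DNN wake-word model is only
# invoked when the gate fires. Frame size, threshold, and the model stub are
# illustrative assumptions, not Synaptics' implementation.
import numpy as np

FRAME_SAMPLES = 256       # 16 ms of audio at 16 kHz (assumed)
ENERGY_THRESHOLD = 1e-3   # tuned per microphone and environment (assumed)

def voice_activity_gate(frame: np.ndarray) -> bool:
    """DSP-side stand-in: a trivial energy-based voice-activity check."""
    return float(np.mean(frame ** 2)) > ENERGY_THRESHOLD

def run_wake_word_dnn(frames: list) -> float:
    """NPU-side stand-in: would run the quantized wake-word network."""
    return 0.0  # placeholder confidence score

def always_on_voice_loop(audio_frames, score_threshold: float = 0.9):
    """Wake the expensive inference path only when the cheap gate detects speech."""
    buffered = []
    for frame in audio_frames:
        if not voice_activity_gate(frame):
            buffered.clear()           # stay in the ultra-low-power path
            continue
        buffered.append(frame)
        if len(buffered) >= 60:        # ~1 s of speech buffered (assumed)
            if run_wake_word_dnn(buffered) >= score_threshold:
                yield "wake-word detected"
            buffered.clear()

# Example: feed one second of silence (no wake-word events expected).
silence = (np.zeros(FRAME_SAMPLES) for _ in range(62))
print(list(always_on_voice_loop(silence)))   # -> []
```

The point of the split is duty-cycling: the gate runs continuously at negligible cost, while the power-hungry inference path wakes only for the small fraction of frames that actually contain speech, which is how a remote control gets extended life out of a pair of AAA batteries.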

Availability
The EFR3B86H AOV remote control is shipping now with the Hisense TV model 65A9H. For more information on the DBM10L, visit the DBM10L webpage or contact your local Synaptics sales representative.

For more about the potential of AOV RCUs and how they are changing how we interact with home devices, see “Always-On Voice Makes Content Control Seamless and Intuitive”.

About Hisense
Founded in 1969, Hisense is one of the largest consumer electronics and home appliances companies in the world. Hisense offers a broad range of technology-driven products that are manufactured and distributed across the world, including smart TVs, smartphones, refrigerators, freezers, and air conditioners, among other products. Hisense has a workforce of over 70,000 worldwide, and its flat-panel TV market share in China has been No. 1 for 13 consecutive years. Hisense has several subsidiaries, with sales revenue reaching CNY 100.3 billion in 2016. For more, visit www.hisenseme.com.

 
  • Like
  • Fire
  • Thinking
Reactions: 17 users
(quoting dippY's post above on the Synaptics interview)
So this is the same as BrainChip by the sounds of things
 
  • Like
Reactions: 1 users

mrgds

Regular
Sean has some good mates. A 20-year Nvidia veteran endorsing BRN's ecosystem approach.

Wonder when he will load up on some shares or will Nvidia load up soon?

Could the transformers news be Nvidia related?
Good podcast, .................... enthusiasm was very clear to hear from both Sean & Jeff.

Parting words;
Sean: ................. let's do it again soon.
Jeff: ...................... yes, hope so.

(y)(y)(y)(y)(y)(y)(y)(y)(y)(y)(y)

AKIDA BALLISTA
 
  • Like
  • Fire
Reactions: 22 users

Foxdog

Regular
  • Haha
  • Like
Reactions: 2 users

JDelekto

Regular
(quoting dippY's post above on the Synaptics interview)
I did not find it unsettling at all. The representative noted that it was an AI accelerator, quite different from a neuromorphic processor.

I believe the Syntiant solution only does audio processing, whereas Akida can handle information from many different types of sensor input. The power draw in the milliwatt range is comparable between the two.

They also mention "loading the models in real-time" which is much different than the patented on-chip learning of which Akida is capable.

So it does seem that the Syntiant audio-processing AI accelerator is competition for Akida in specific use cases, but I think that BrainChip's processor is a much more flexible solution.
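
For anyone unfamiliar with the distinction being drawn here, the sketch below shows the general idea of incremental, on-device learning versus reloading a pre-trained model. It is a toy nearest-prototype classifier, not BrainChip's patented on-chip learning; every name and value is an illustrative assumption.

```python
# Toy contrast with "loading models in real time": a nearest-prototype
# classifier can add a new class from a single example on the device itself,
# with no offline retraining and no new model binary to push.
# This is NOT BrainChip's patented mechanism, just the general concept.
import numpy as np

class PrototypeClassifier:
    def __init__(self):
        self.prototypes = {}  # label -> unit-normalized feature vector

    def learn_class(self, label: str, feature_vector: np.ndarray) -> None:
        """Incremental, on-device: one example is enough to add a class."""
        self.prototypes[label] = feature_vector / np.linalg.norm(feature_vector)

    def classify(self, feature_vector: np.ndarray):
        """Return the label whose prototype has the highest cosine similarity."""
        if not self.prototypes:
            return None
        v = feature_vector / np.linalg.norm(feature_vector)
        return max(self.prototypes, key=lambda label: float(self.prototypes[label] @ v))

clf = PrototypeClassifier()
clf.learn_class("door_knock", np.random.rand(64))   # learned in the field
clf.learn_class("glass_break", np.random.rand(64))  # no retraining round trip
print(clf.classify(np.random.rand(64)))
```

Adding a class here is just storing one more vector on the device; the "load a model" approach means retraining offline and shipping a whole new binary whenever the set of things to recognise changes.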
 
  • Like
  • Fire
  • Love
Reactions: 59 users

tjcov87

Member
(LinkedIn screenshot attached)
 
  • Like
  • Fire
Reactions: 42 users

Yak52

Regular
Lithium batteries are causing a lot of trouble for us at work lately (I'm a firefighter); once they crack, it causes a thermal runaway.
Mmm......... interesting, mccabe84. Common event, huh?
I recently had to replace an aircraft battery (lead acid) and looked hard for a suitable lithium replacement to help reduce weight (aerobatic aircraft) and improve performance.
After talking to suppliers and researching the various types of lithium batteries, I decided to stay with a normal lead acid battery, as I could not be convinced by salesmen or marketing that a lithium battery would not suddenly have a "thermal runaway" event while pulling a 5 g maneuver.
One battery salesman even suggested I would be able to use my parachute if such an event happened, to which I did not bother trying to explain that the battery is located behind the pilot's seat, some 8 inches from the "parachute". :rolleyes:

Back to a heavy old lead acid battery yet again! Maybe one day...........

Yak52 :cool:
 
  • Like
  • Fire
Reactions: 13 users
(quoting JDelekto's reply above)
Now sitting in a waiting room, but Teksun is partnered with both BrainChip and Synaptics and clearly differentiates what each provides.

I suspect that on this basis alone they can be discounted, but I will await @Diogenese to throw up a patent or two to confirm what they are actually doing that Teksun requires in addition to BrainChip.

I will say this: all the searching I have done on SNNs has not thrown up any research papers in that area involving them or their employees.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 37 users
People give this a thumbs up, but it creates work, the company isn't going to confirm it, and it potentially creates issues with Teksun. Sometimes just let things be.
I respect what you said, but the information is in the public domain and it's a genuine question.

The Company is well aware of the 1000 eyes and so should their customers and partners.

If they don't, they should be made aware.

It's Tony's job to answer questions like this, it shouldn't be his job though, to have to answer questions from impatient shareholders (you are obviously not one of them).

The Company can easily say, that they can't comment and then maybe Teksun's website is changed.

It would then just remain speculation, as to its meaning.

Better that things like this are thrust into the open, so that hopefully measures are put in place to reduce further occurrences of information that's not supposed to be available 👍
 
  • Like
  • Love
  • Fire
Reactions: 25 users
Why are we censoring links???

Here is the content:

Australian businesses adopting AI tech to bolster revenue, but challenges remain

ASX News, Technology


Louis Allen, Markets Reporter | louis.allen@***************.com.au | 15 March 2023, 05:59 (AEDT)
3 min read
Image source: Jakub Porzycki via Reuters Connect


Artificial intelligence (AI) is transforming the way businesses operate across the globe, with an increasing number of companies adopting the technology to gain a competitive edge in today’s digital landscape.
Companies across an array of sectors are using AI technologies to grow revenue and improve efficiency across their operations.
In Australia, the National AI Centre (NAIC) was set up to further develop the country’s AI and digital ecosystem. It’s funded by the Australian government and coordinated by the national science agency, CSIRO.
Recently, the NAIC released the ‘Australia’s AI ecosystem momentum’ report, which looked into the experiences of 200 information technology and business decision-makers and AI service providers to better understand AI adoption across businesses in Australia.
The report was commissioned by the NAIC and prepared by Forrester Consulting.

Findings of the report

The AI ecosystem momentum report highlighted the growing appetite for AI use across Australian businesses but revealed certain barriers to implementing the solutions.
“Our research shows Australian businesses reported an average revenue growth of $361,315 for each AI-enabled solution that was implemented, regardless of which part of the business these efforts were targeted,” National AI Centre Director Stela Solar said.
“Over 80 per cent of businesses surveyed expected their year-on-year revenue to grow, with technology at the centre of their growth strategies,” she said.
Despite that, the report found that AI strategies were actually quite difficult to deliver.
The results showed that a majority of respondents needed to engage at least four AI technology and service providers to carry out an AI project, while 28 per cent reported working with more than six providers. Just 17 per cent worked on projects with a single provider.
This led the NAIC to believe that the successful implementation of AI strategies was a collaborative effort, and “businesses should be comfortable with the idea of working alongside several providers to ensure they get complete solutions that deliver business outcomes”.

Australia’s significant AI players

Many companies worldwide are making investments in AI initiatives and programs in an effort to revolutionise business operations for the better.
Additionally, universities and research institutions are making significant strides in the field of AI, further fueling the growth of the industry.
In Australia, there are a number of players that stand to benefit from growing AI adoption, including ASX-listed Appen (APX), Nuix (NXL), Megaport (MP1), WiseTech Global (WTC), BrainChip (BRN) and Unith (UNT), among others.
Appen is a leading provider of high-quality training data for machine learning and artificial intelligence algorithms. As more businesses adopt AI, demand for high-quality training data is expected to grow in the new era of tech.
Nuix provides software that helps organisations investigate and analyse large amounts of data quickly and efficiently. The company’s software incorporates artificial intelligence and machine learning technologies.
Brisbane-based Megaport offers software-defined networking services, which enable businesses to connect their IT infrastructure to the cloud. As more businesses adopt AI and cloud computing, the demand for Megaport’s services is expected to increase.
ASX-200 giant WiseTech Global provides software solutions for the logistics industry, including AI-powered tools that help businesses optimise supply chains.
BrainChip is a provider of AI-powered semiconductor technology that can be used in various ways, including surveillance, autonomous vehicles, and industrial automation.
Amsterdam-based Unith is capitalising on the rapidly-growing global AI service market with its innovative Talking Head platform.
The Talking Head transforms the way businesses communicate with customers by allowing for real-time conversations, in multiple languages and through a full-stack platform, all powered by AI.
As AI adoption continues to grow across Australia and the globe, companies are discovering that successful implementation requires a collaborative approach with multiple providers to deliver complete solutions that help enable optimal business outcomes.
This trend is expected to continue as more businesses seek to leverage the benefits of AI to improve business operations.
 
  • Like
  • Love
Reactions: 18 users

stuart888

Regular
Lucky went fishing yesterday & missed the bloodbath.

Heaps of great BRN links & news posted. Takes a while to catch up here. LOL

Looking at markets, US CPI came in at 6% as forecast, down from 6.4% YoY.

The US 2-year bond yield has dropped from 5% to 4.25%. This implies 0.75% of rate cuts by the US Fed; however, the bond market could be wrong. The FedWatch Tool indicates a 20% probability of a rate pause next week and an 80% probability of a 0.25% rate rise.

The AU 2-year bond yield has dropped from 3.7% to 3.2%, implying 0.5% of rate cuts. The RBA futures chart indicates no more RBA rate hikes, and rate cuts in H1 2024.



Fishing! Brainchip is prime bait! I really enjoy my growth stock picks. So fun to follow.

 
  • Like
  • Haha
Reactions: 6 users

wilzy123

Founding Member
(quoting the "Australian businesses adopting AI tech" article post above)
Probably because you're sharing content from a very questionable source
 
  • Like
Reactions: 4 users
Probably because you're sharing content from a very questionable source
Ok, didn't know that The Market Herald was a very questionable source.
 
  • Like
  • Haha
Reactions: 8 users

Shadow59

Regular
  • Like
  • Haha
  • Love
Reactions: 14 users
Ok, didn't know that The Market Herald was a very questionable source.
They link to HC. Enough said.
Regards
FF


AKIDA BALLISTA
 
  • Like
  • Love
  • Haha
Reactions: 21 users

Dhm

Regular
  • Like
  • Fire
  • Love
Reactions: 27 users

Learning

Learning to the Top 🕵‍♂️



Maybe I have selective hearing. Just my thoughts on the podcast.

"Ecosystem" = Essential in the tech industries.

"Believes" = Benefits = Results: AKD1000, AKD1500, AKD 2.0.

"Developers" = Edge Impulse alone has 60,000 developers.

"Focus" = Management is focused on growing the business.

"Partners" = Arm, Intel, SiFive, MegaChips, Socionext and more!

"AI is the most disruptive technology we are seeing in our lifetime."

"Inferencing" at low power at the edge = Akida.

"Technology for today and tomorrow."

"Transformer" = more to come 👏👏👏

"Exciting year"

"Will take time" = Patience will be required.

Learning 🏖
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 64 users

TechGirl

Founding Member
Probably posted before but good to see Micron was looking into us all the way back in 2018


Micron Places $100 Million AI Bet

By Rick Merritt

10.11.2018


SAN FRANCISCO — Micron Technology announced a $100 million venture fund with a focus on AI at a coming-out party for its new chief executive here. Sanjay Mehrotra, former CEO and co-founder of SanDisk, vowed to accelerate new product introductions at the company best known as one of the survivors of consolidation in commodity DRAMs.

It’s a challenging time for the memory and storage giant that trails larger rivals Samsung, SK Hynix, and Toshiba. Breakout products such as the Hybrid Memory Cube and Automata processor failed to gain significant traction and 3DXPoint memories co-designed with Intel hit delays, with revenues for Micron now pushed out to 2020.

On the positive side, the company hopes to sample 3DXPoint devices late next year. It will also enter the market for high-bandwidth memory stacks, initially with HBM2 in 2019.

In its core business, Micron has a DRAM-like product in the lab that it aims to sample in 2021. It has a process technology roadmap with three new nodes ramping into production. In addition, it is privately showing customers software to accelerate NAND storage performance 3x to 5x.

At the event here, executives made the case that the rise of deep learning will help drive demand for memory and storage. Its new fund aims to help it ride the wave with future investments in hardware, software, and services.

“This is not just about the money, but technology partnerships as well,” said Sumit Sadana, Micron’s chief business officer and a former executive at SanDisk, Freescale, and IBM. “We have some emerging memory technologies not yet in production that we will seek partners to help bring to market.”

Micron researchers are exploring the kinds of processor-on-memory architectures that startups such as Mythic and Syntiant hope to pioneer.

They are also bullish on so-called neuromorphic chips that use a mesh of synapse-like cores that companies such as BrainChip are pursuing.

It’s unclear when or how Micron will be able to turn the concepts into products.

“We have a lot of innovative work in memory and future architectures, some of them are focused on deep learning,” said Sadana. “We have grants from the U.S. government to investigate frontiers of processing in memory and deep-learning acceleration, but a lot of the work is extremely sensitive and confidential.”

It’s still early days for machine learning, said speakers from Amazon and Microsoft.

For example, developers are working to enable Amazon’s Alexa to understand context as well as multiple commands in a sentence, said Prem Natarajan, who heads up natural language work for the web giant. Microsoft recently released a service to let companies create their own voice assistants and aims to enable them to someday engage in realistic conversations, said Lili Cheng, who leads Microsoft’s AI research.



AI training could require 7x more DRAM and 2x more NAND in future servers. (Image: Micron)


Micron execs remain bullish on its core DRAM roadmap as well as the outlook for 3DXPoint in both main memory and storage.
The company expects to use 3DXPoint in both dense, fast main memory products and storage that delivers lower latency than Intel’s current Optane drives. Micron’s stated plan of sampling in late 2019 for revenues in 2020 suggests that it is waiting for a second-generation 3DXPoint device that it is co-developing with Intel.
Executives declined to provide any product details or market projections. “Our belief is that [3DXPoint] becomes a meaningful part of the data center hierarchy that will cannibalize a nibble on either side [of DRAM and NAND], but the aggregate market grows quite a bit,” said Tom Eby, who runs Micron’s compute and networking group.
“3DXPoint is still in its infancy compared to DRAM and NAND,” said Jeff VerHeul, who oversees Micron’s development for it and non-volatile memories. “It will evolve and become more efficient.”
In its core DRAM sector, “we see at least three technology nodes beyond the one we are working in now … we have more visibility than we have had in the last decade,” said Scott DeBoer, head of tech development at Micron. That said, “the cost reduction and bit-density increase per node is slowing and is not at the pace of the last 20 years.”
Micron’s DRAM roadmap does not require extreme ultraviolet lithography and is already using double- and quad-patterning. Advanced circuit designs are enabling operation with less charge stored in taller capacitors than used in the past, but otherwise, the overall architecture is a conventional one, added DeBoer.
Micron is already shipping DRAMs made in its sub-20-nm 1x process and expects first revenue for some made in its 1y process next quarter, said Eby. The transition to 16-Gbit chips from today’s mainstream 8-Gbit designs is near, but he would not comment on the viability of a 32G design.
— Rick Merritt, Silicon Valley Bureau Chief, EE Times


And in the screenshot below, from BrainChip's update for the March 2019 quarter, Micron is mentioned:


(screenshot of BrainChip's March 2019 quarterly update attached)
 
  • Like
  • Fire
  • Love
Reactions: 50 users
Can we put competitors to bed with the following extract:

“Currently existing neuromorphic architectures include
• IBM TrueNorth,
• Intel Loihi,
• Tianjic,
• SpiNNaker,
• BrainScaleS,
• NeuronFlow,
• DYNAP, and
• Akida.

Some of the above architectures are fully neuromorphic [31,32], while others remain hybrid, meaning that they use asynchronous circuits together with synchronous processors [33,34].

Despite the field being still in its infancy, the first commercial neuromorphic processor was made available worldwide in August 2021. It is Akida from Australian company BrainChip. Unfortunately, these hardware platforms are very expensive at the time of writing and (apart from Akida) not feasibly available.”


My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 54 users

Evermont

Stealth Mode
Does Teksun use ChatGPT?

No mention of said names during an earlier grilling.

 
  • Like
Reactions: 3 users