BRN Discussion Ongoing

ndefries

Regular
I get the sense that those nefarious entities trying to suppress the SP are the proverbial finger in the dyke.
So many partners starting to promote the BrainChip relationship ... it's seen as a real competitive advantage.
And more to follow soon.


View attachment 8341
Out of left field is the potential one day not for a takeover but a merger with a company like Edge. It would only happen if we could show that together we provide a far easier and more attractive way to serve the immense market need, for the better of both companies, and if the shareholder risk and return measures improved. Of course, we are so agnostic and impartial that it would need to be a like-minded firm.

Not saying it ever has to happen, but critical mass, and being able to offer multiple complementary skillsets, may make this a valuable proposition.
 
  • Like
  • Thinking
  • Wow
Reactions: 8 users
Hi All,

I will start by saying: I see most people here, including me, are fully invested in BrainChip's future, and the future revenues are quite exciting to contemplate. I actually just bought a few more shares today, keen on topping up. I see any price below $3 as very cheap for the long-term play on BrainChip; dare I say one day anything less than $10 will look very cheap.

I guess I do want to discuss the expectations out there around the next quarterly due from BrainChip, as expectations for the last quarterly seemed a bit high for some people around revenue. What I really would not like to see is vocal disappointment, and the flow-on effect of retail selling, because people expected sizeable revenue to appear in the next quarterly.

Prior to the last quarterly, the company said they expected revenue to start stepping up in the last half of calendar year 2022. To me that meant: don't expect much revenue to show up in the next quarterly. That proved true; it was around $250k.

The same goes for this quarter: I am not expecting much revenue yet, as we are not in the second half of the year, and I hope most other people are setting their expectations in a similar way. I expect less than $1 million in revenue again. Any improvement on my view is a bonus.

Tbh, in the July to September 2022 quarterly we most likely will not see any massive breakthrough for squillions of dollars; it could be under $5 million, maybe even $3 million. If there is more, I see it as a bonus. What we want to see is decent growth in revenue from the end of this year, but it is clear there won't be much just yet.

What we know is BrainChip only recently got their fully tested silicon to market, and it takes time to sell it and the IP to the mass market. The process with a new interested partner goes roughly: prove the product to the potential partner; cut the deals; the partner plans their products; the partner builds the integrated products (supply can be an issue) and fully tests them; then the partner has to commercialise and market that integrated product. Only then does BrainChip see that single revenue stream, and that is keeping it pretty simple; in reality it is more complex (packaging, shipping, resellers, etc.). It takes a fair bit of time, years as we know.

Obviously BrainChip have been going at it for a few years with the EAPs. That is obviously where the early revenue is coming from in the second half of this year; how big and how fast the money will be, we don't know. We can see BrainChip are involved with a hell of a lot of partners and potential products. That does not mean immediate revenue, but it's all awesome future potential revenue.

Anyway, I am just keen for there to be a reality-based view of what we should expect from the next quarterly report in July: some awesome progression of partners and planning of products. That the dollars will start to flow later in the year is clear. :)
 
  • Like
  • Love
  • Fire
Reactions: 61 users
Hi @maccareadsalot

“@factfinders 1% is just outrageously silly IMO (no criticism intended FF).”

I agree completely; it is outrageously silly in my opinion as well.

The 1% was born out of the need to counter trolls over at HC, and occasionally here, though here they die a painful death in obscurity thanks to @zeeb0t.

The trolls and manipulators would attack the market cap of Brainchip right through to $2.34 because of an absence of income.

However, just like Peter van der Made and Sean Hehir, I have a reasonable grasp of the size of the addressable market, and as stated by Marc Steimer, every dumb sensor in the World and beyond needs to be made smart.

I have also come to understand, thanks to others and in particular @Diogenese, that BrainChip has a rock-solid wall of patents protecting what even Russian academics have discovered: the ability to one-shot and incrementally learn on device at ludicrously low power.

Knowing these facts, and knowing that a credible assessment puts the semiconductor market for the automotive industry at US$200 billion by 2030, and that BrainChip has been taken on board by Mercedes-Benz, Ford, Valeo, Renesas, MegaChips, ARM, SiFive, Nviso and others who play in this space, it is impossible to credibly argue that being first to market, patent protected and unique will not guarantee one tiny little percent of that US$200 billion automotive semiconductor market.

One percent of US$200 billion is US$2 billion.

TWO BILLION DOLLARS IN REVENUE EASILY JUSTIFIES THE PRESENT SHARE PRICE OF $2.34, WHICH SOME CLAIMED WAS OVERVALUED.
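For anyone who wants to sanity-check that figure, the back-of-the-envelope arithmetic is a one-liner. Both inputs below are the post's own illustrative assumptions (a projected US$200 billion market and a 1% capture rate), not forecasts:

```python
# Back-of-the-envelope check of the "one percent" argument.
# Both inputs are illustrative assumptions from the post, not forecasts.

auto_semi_market_2030 = 200e9  # projected US$200B automotive semiconductor market
capture_rate = 0.01            # the "outrageously silly" one percent

revenue = auto_semi_market_2030 * capture_rate
print(f"1% of US$200B = US${revenue / 1e9:.0f} billion")  # prints: 1% of US$200B = US$2 billion
```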

To this potential Brainchip can add every other market under the sun where currently dumb sensors are utilised plus all those markets still to be discovered.

Again, how can anyone who understands the BrainChip AKIDA technology advances suggest it will not capture an additional one percent of these markets?

You see, the strength of the ONE PERCENT ARGUMENT is that it is outrageously silly, yet even at one percent the revenue stream is unbelievable.

Anyone who understands Brainchip and its technology advantage could never credibly state that it was overvalued at these levels.

As I noted the other day, 29 billion ARM-based semiconductors were shipped in the last 12 months.

Renesas has shipped one billion of just one of its MCUs.

The addressable market is in the trillions of dollars, as Cathie Wood has stated.

If you are unable to understand the success that is knocking at BrainChip's door, and what capturing as little as one percent of the addressable market will mean, then in my opinion you should not be invested here, or you have another agenda that does not align with the interests of genuine investors.

My silly opinion with a purpose only so DYOR
FF

AKIDA BALLISTA
From the new Brainchip website:

"We’re on a mission.


To make every device with a sensor AI-smart.

We’re the worldwide leader in edge AI
on-chip processing and learning."
 
  • Like
  • Love
  • Fire
Reactions: 51 users

Diogenese

Top 20
From the new Brainchip website:

"We’re on a mission.


To make every device with a sensor AI-smart.

We’re the worldwide leader in edge AI
on-chip processing and learning."


 
  • Haha
  • Like
  • Love
Reactions: 26 users

Dolci

Regular
..... looks like BRN might be in June Index Rebalance...



Screenshot 2022-06-02 154451.png


Screenshot 2022-06-02 154530.png

Screenshot 2022-06-02 154631.png
 
  • Like
  • Love
  • Fire
Reactions: 97 users
I'll just say happy to be in an ecosystem with a partner like this.

arm-fy21-rev-info-1118x2048.png
 
  • Like
  • Fire
  • Love
Reactions: 47 users
D

Deleted member 118

Guest
  • Like
  • Haha
  • Love
Reactions: 16 users

Dolci

Regular
  • Like
Reactions: 9 users

mrgds

Regular
..... looks like BRN might be in June Index Rebalance...



View attachment 8346

View attachment 8347
View attachment 8348
Thanks @Dolci
What are people's thoughts on inclusion in the ASX 200?
I have heard conflicting outcomes, some good, some not so good, after inclusion.
Anyone had experiences they'd like to share?

Also, with the continual "pushing down" of the s/p, would there be "entities" wanting the s/p to stay low, so as not to force institutions to have to buy in?
Cheers .................. Akida Ballista
 
  • Like
  • Thinking
  • Love
Reactions: 12 users
A walk down recent memory lane:

ISL Using Neuromorphic Chip-Based Radar Research Solution for Air Force

Information Systems Laboratories, an employee-owned company, is using BrainChip Holdings’ Akida neural network processor to develop an AI-driven radar research solution for the Air Force Research Laboratory.
Akida is an ultra-low-power chip with a neuromorphic architecture, an artificial intelligence design inspired by the biology of the human brain. ISL’s work on the AI-powered solution is sponsored by the Air Force Research Laboratory, BrainChip said.
BrainChip added that its Akida platform is built to provide power consumption improvements, design flexibility and edge-learning capabilities.
Sean Hehir, CEO of BrainChip, said that ISL adopted Akida because of its production-ready status and go-to-market advantages. “We feel that the combination of technologies will help expedite its deployment into the field,” Hehir said.
As a member of BrainChip’s early access program, ISL also has access to boards with the Akida device, software and hardware support and additional engineering resources.
Jamie Bergin, ISL senior vice president and senior manager of research, development and engineering solutions at ISL, added that the early access program has allowed ISL to evaluate first-hand the capabilities provided by Akida.
BrainChip said that Akida is available for licensing, as well as for orders for production release in silicon. The platform will also unlock benefits in health care, smart cities, transportation and smart home applications, the company added.
Headquartered in San Diego, California, ISL is a technology company that provides research, analysis and hardware design services to customers in the defense and energy sectors, according to its LinkedIn profile.
ISL’s specialties include advanced signal processing, space exploration, undersea systems, surveillance, tracking, cybersecurity, advanced radar systems and energy independence, BrainChip said."



AND FROM ISL’S WEBSITE:

Cognitive Systems

ISL literally wrote the first book on Cognitive Radar in 2010, 2nd Edition 2020 (https://us.artechhouse.com/Cognitiv...y-Adaptive-Approach-Second-Edition-P2093.aspx ), and continues to provide cutting edge solutions in advanced cognitive sensing.

What distinguishes a cognitive system from a more traditional adaptive one, is its contextual “awareness” and advanced embedded computational and AI capabilities. In the case of active sensors such as radar, cognitive sensors also employ active “probing” of the environment to better characterize the entire channel (targets, clutter, interference).


Neuromorphic Engineering and Artificial Intelligence
ISL is focused on replicating the analog nature of biological computation and the role of neurons in cognition. ISL’s team of scientists/engineers continue to understand how the morphology of individual neurons, circuits, applications, and overall architectures creates desirable computations. Leveraging this understanding and the newly developed and emerging commercial neuromorphic chips, ISL is developing a new low-power, lightweight detect and avoid (DAA) system for very small UAS platforms that exploits automotive radar hardware, light-weight EO/IR sensors, advanced data fusion algorithms, and neuromorphic computing.

I know many here, if not most, know about the ISL, BrainChip and US Air Force Research Laboratory project, but interestingly, having just trawled through the ISL website and social media, BrainChip and AKIDA are not named anywhere, yet this project is still live. When you read ISL's website it is clear those working there are very much invested in the US Air Force as well as nuclear energy research.

On one occasion only, Rob Telson mentioned nuclear energy, and that was all it was: a mention, with no other details.

Brainchip making every sensor smart even in drones (perhaps???) that are being used to autonomously survey nuclear facilities.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Thinking
Reactions: 29 users

GDJR69

Regular
Hi All,

I will start by saying: I see most people here, including me, are fully invested in BrainChip's future, and the future revenues are quite exciting to contemplate. I actually just bought a few more shares today, keen on topping up. I see any price below $3 as very cheap for the long-term play on BrainChip; dare I say one day anything less than $10 will look very cheap.

I guess I do want to discuss the expectations out there around the next quarterly due from BrainChip, as expectations for the last quarterly seemed a bit high for some people around revenue. What I really would not like to see is vocal disappointment, and the flow-on effect of retail selling, because people expected sizeable revenue to appear in the next quarterly.

Prior to the last quarterly, the company said they expected revenue to start stepping up in the last half of calendar year 2022. To me that meant: don't expect much revenue to show up in the next quarterly. That proved true; it was around $250k.

The same goes for this quarter: I am not expecting much revenue yet, as we are not in the second half of the year, and I hope most other people are setting their expectations in a similar way. I expect less than $1 million in revenue again. Any improvement on my view is a bonus.

Tbh, in the July to September 2022 quarterly we most likely will not see any massive breakthrough for squillions of dollars; it could be under $5 million, maybe even $3 million. If there is more, I see it as a bonus. What we want to see is decent growth in revenue from the end of this year, but it is clear there won't be much just yet.

What we know is BrainChip only recently got their fully tested silicon to market, and it takes time to sell it and the IP to the mass market. The process with a new interested partner goes roughly: prove the product to the potential partner; cut the deals; the partner plans their products; the partner builds the integrated products (supply can be an issue) and fully tests them; then the partner has to commercialise and market that integrated product. Only then does BrainChip see that single revenue stream, and that is keeping it pretty simple; in reality it is more complex (packaging, shipping, resellers, etc.). It takes a fair bit of time, years as we know.

Obviously BrainChip have been going at it for a few years with the EAPs. That is obviously where the early revenue is coming from in the second half of this year; how big and how fast the money will be, we don't know. We can see BrainChip are involved with a hell of a lot of partners and potential products. That does not mean immediate revenue, but it's all awesome future potential revenue.

Anyway, I am just keen for there to be a reality-based view of what we should expect from the next quarterly report in July: some awesome progression of partners and planning of products. That the dollars will start to flow later in the year is clear. :)
I agree, I'm not expecting revenue in the next quarterly. If revenue is predicted to start flowing in the second half of 2022 then the October quarterly is the first one that is likely to show any revenue. None of us should be surprised or discouraged by this.
 
Last edited:
  • Like
Reactions: 37 users

robsmark

Regular
Thanks @Dolci
What are people's thoughts on inclusion in the ASX 200?
I have heard conflicting outcomes, some good, some not so good, after inclusion.
Anyone had experiences they'd like to share?

Also, with the continual "pushing down" of the s/p, would there be "entities" wanting the s/p to stay low, so as not to force institutions to have to buy in?
Cheers .................. Akida Ballista
From my understanding certain funds must hold a portion of the ASX 200 on their books, so I guess inclusion would bring buying. That being said, the ‘no revenue brigade’ will no doubt be trying to short it.

It has to pass through the ASX 200 though if it wants to be at the top of the ASX 100. Let's hope for some validation from the company soon in the form of material contracts and increased revenue, which will justify BrainChip's well-deserved spot on the next rung of the ladder.
 
  • Like
  • Fire
Reactions: 23 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
OMG, has everyone seen this?!


Toyota's New ‘Intelligent Assistant' Learns Voice Commands and Gets Smarter Over Time Using Machine Learning​


【Summary】Toyota’s new in-car “Intelligent Assistant” is a highly advanced voice-activated assistant that can learn a driver’s commands and get smarter over time. It’s available for both the Toyota Audio Multimedia and Lexus Interface, including 2022 models like the Toyota Tundra and Lexus NX. Toyota says that the rollout of these intelligent connected services will transform it from an automotive company to a mobility company.​


87_avatar_middle.jpg
Eric Walz Jun 01, 2022 3:30 PM PT





Toyota's New ‘Intelligent Assistant' Learns Voice Commands and Gets Smarter Over Time Using Machine Learning

Once a driver asks for help, Toyota's Intelligent Assistant listens and responds to natural language conversation.


As vehicles come packed with more technology and features, using some of it on the road can become a distraction. So Toyota developed a voice-activated assistant to lend a hand as part of the "Toyota Connected" vehicle services.
Toyota's new in-car "Intelligent Assistant" is a highly advanced voice-activated assistant that can learn a driver's commands and get smarter over time. It's available for both the Toyota Audio Multimedia and Lexus Interface, including 2022 models like the Toyota Tundra and Lexus NX.
Toyota offers Connected Services trials for each vehicle, which includes the services Drive Connect, which offers Intelligent Assistant, Cloud Navigation and Destination Assist.
Toyota says that the rollout of these intelligent connected services will transform it from an automotive company to a mobility company.
Toyota's Intelligent Assist can be activated with the phrase "Hey, Toyota" and supplements the onboard Voice Assistant that's embedded in Toyota vehicles, according to Ryan Oehler, product owner at Toyota Connected, an independent software and innovation company, who leads the team responsible for Toyota's new Intelligent Assistant.
Once a driver asks for help, the Intelligent Assistant listens and responds to natural language conversation. For example, if a driver says "I would like a coffee", the system will bring up a list of nearby coffee shops on the vehicle's infotainment screen, then ask the driver if they want directions to one of them for a midday break.
Suppose a driver is looking for a nearby Starbucks and the assistant brings up multiple listings; the driver can then say "take me to the third one" and the navigation system will launch turn-by-turn directions.
To make navigation easier, the map can be brought up on the infotainment screen at any time by saying, "Hey, Toyota, show the map." Drivers can also ask the assistant to zoom in on the map for easier reading.
Each request is fulfilled using a complex web of commands and behind-the-scenes processing, both on the vehicle itself and in the cloud.
Oehler said there has been a steady ramp of the Intelligent Assistant to meet both mainstream American customers as well as tech-savvy "power users" with intuitive features for drivers. He also said that even more advanced features are in the pipeline.
Not only does Intelligent Assist listen to a driver's voice using automatic speech recognition (ASR), it also looks at waveforms in the audio. When the driver pushes the talk button in the vehicle, it transcribes that speech to text both in the vehicle's embedded computer and over the cloud.
By analyzing the audio waveforms, "the system is able to understand, phonetically, what that translates to and formulate transcriptions based on that audio input," Oehler said.
The technology is even able to recognize commands for different accents, dialects and even different pitches, to recognize what to do next. Since Intelligent Assistant is available in the U.S., Canada and Mexico, it's designed to recognize commands in English, Spanish and French.
Once a command is transcribed into text, it's processed by Toyota Connected machine learning models in the cloud. The machine learning models can determine the intent of the words being used, from commands for the windshield wipers to finding a five-star-rated restaurant nearby.
From there, it's fed back to the vehicle where notes are compared between the embedded and cloud voice assistants to ensure the car is most accurately executing the command the user actually wants.
"This is where the Toyota Connected magic comes in," Oehler said.
The system first determines "top-line intent", which includes common vehicle controls such as temperature, audio and navigation. The next step is to determine "sub-intent", such as a specific temperature, a song title or frequently visited cafe.
If a driver requests a certain song while already listening to their favorite streaming platform such as Spotify, the head unit will attempt to find the song on that specific platform. If it's not found, the head unit will switch to the next best option, such as satellite radio. Drivers can also ask Intelligent Assist to play specific genres of music by saying "Hey Toyota, play rock music."
The system gets smarter by looking at accuracy and tweaking commands to add more ways to ask for specific functions, according to Toyota. Drivers can also ask the system to send a text message to a stored contact just by saying the person's name then using their voice to send a message.
"Certain audiences expect to interact in completely different ways," Oehler said. "Younger audiences that are more familiar with their modern voice assistants tend to operate in a more fluid manner, whereas people that are less familiar would say, ‘Navigation' and then ‘Texas' in a series of steps," for instance. To help the user, we've added more contextual things, such as prompts, to help guide them through this new user experience. One of the other important things that we're learning is what our customers want to use on a daily basis."
Intelligent Assistant is already available in the Lexus NX, LX, and Toyota Tundra pickup for the 2022 model year, but will be introduced in more Toyota models for 2023 in North America. As it rolls out to more models, the speed at which the machine learning system gets smarter will only accelerate, according to Toyota.
Among the new features planned is asking the Intelligent Assistant to tell a joke. The sense of humor was added to the Virtual Assistant's repertoire after the team of engineers observed customers asking for it.
"There's a lot of opportunity," Oehler said. "There's value with a voice assistant because it adds depth to the experience, is smart, and can and likely will strengthen consumer confidence as more people use it and evolve in future iterations."
Using Intelligent Assistant is also optional on Toyota vehicles. For drivers that would rather not use it, the vehicle's embedded Voice Assistant can be used instead for more basic voice functions, such as adjusting the interior temperature.
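The "top-line intent" then "sub-intent" resolution the article describes is essentially a two-stage classification. Here is a toy, purely hypothetical sketch of that split; Toyota's actual system uses cloud machine-learning models, and every name and keyword below is made up for illustration:

```python
# Toy two-stage intent resolution, loosely mirroring the
# "top-line intent" / "sub-intent" split described in the article.
# Simple keyword matching stands in for the real ML models.

TOP_LINE = {
    "navigation": ["take me", "directions", "navigate", "map"],
    "audio": ["play", "song", "music", "radio"],
    "temperature": ["degrees", "warmer", "cooler", "temperature"],
}

def classify(utterance: str) -> tuple[str, str]:
    """Return (top-line intent, raw detail for sub-intent resolution)."""
    text = utterance.lower()
    for intent, keywords in TOP_LINE.items():
        if any(kw in text for kw in keywords):
            # The rest of the utterance carries the sub-intent:
            # a destination, a song title, a target temperature, etc.
            return intent, text
    return "unknown", text

print(classify("Hey Toyota, play rock music")[0])  # audio
print(classify("take me to the third one")[0])     # navigation
```

In the real system, "comparing notes" between the embedded and cloud assistants would then pick whichever transcription and intent pair the models score as most likely.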


 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 67 users
I agree, I'm not expecting revenue in the next quarterly. If revenue is predicted to start flowing in the second half of 2022 then the October quarterly is the first one that is likely to show any revenue. None of us should be surprised or discouraged by this.
I think in 2023 BRN will hit top gear.
 
  • Like
Reactions: 23 users

KKFoo

Regular
Convinced my boss to invest when we were at 5 cents. He threw 5k at it and has held on strong. Needless to say he is rather happy atm, but he keeps wanting another recommendation from me. He thinks I am a stock guru now. Little does he know I'm a one-trick pony lol
 
  • Like
  • Haha
Reactions: 4 users

Diogenese

Top 20
A walk down recent memory lane:

ISL Using Neuromorphic Chip-Based Radar Research Solution for Air Force

Information Systems Laboratories, an employee-owned company, is using BrainChip Holdings’ Akida neural network processor to develop an AI-driven radar research solution for the Air Force Research Laboratory.
Akida is an ultra-low-power chip with a neuromorphic architecture, an artificial intelligence design inspired by the biology of the human brain. ISL’s work on the AI-powered solution is sponsored by the Air Force Research Laboratory, BrainChip said.
BrainChip added that its Akida platform is built to provide power consumption improvements, design flexibility and edge-learning capabilities.
Sean Hehir, CEO of BrainChip, said that ISL adopted Akida because of its production-ready status and go-to-market advantages. “We feel that the combination of technologies will help expedite its deployment into the field,” Hehir said.
As a member of BrainChip’s early access program, ISL also has access to boards with the Akida device, software and hardware support and additional engineering resources.
Jamie Bergin, ISL senior vice president and senior manager of research, development and engineering solutions at ISL, added that the early access program has allowed ISL to evaluate first-hand the capabilities provided by Akida.
BrainChip said that Akida is available for licensing, as well as for orders for production release in silicon. The platform will also unlock benefits in health care, smart cities, transportation and smart home applications, the company added.
Headquartered in San Diego, California, ISL is a technology company that provides research, analysis and hardware design services to customers in the defense and energy sectors, according to its LinkedIn profile.
ISL’s specialties include advanced signal processing, space exploration, undersea systems, surveillance, tracking, cybersecurity, advanced radar systems and energy independence, BrainChip said."



AND FROM ISL’S WEBSITE:

Cognitive Systems

ISL literally wrote the first book on Cognitive Radar in 2010, 2nd Edition 2020 (https://us.artechhouse.com/Cognitiv...y-Adaptive-Approach-Second-Edition-P2093.aspx ), and continues to provide cutting edge solutions in advanced cognitive sensing.

What distinguishes a cognitive system from a more traditional adaptive one, is its contextual “awareness” and advanced embedded computational and AI capabilities. In the case of active sensors such as radar, cognitive sensors also employ active “probing” of the environment to better characterize the entire channel (targets, clutter, interference).


Neuromorphic Engineering and Artificial Intelligence
ISL is focused on replicating the analog nature of biological computation and the role of neurons in cognition. ISL’s team of scientists/engineers continue to understand how the morphology of individual neurons, circuits, applications, and overall architectures creates desirable computations. Leveraging this understanding and the newly developed and emerging commercial neuromorphic chips, ISL is developing a new low-power, lightweight detect and avoid (DAA) system for very small UAS platforms that exploits automotive radar hardware, light-weight EO/IR sensors, advanced data fusion algorithms, and neuromorphic computing.

I know many here, if not most, know about the ISL, BrainChip and US Air Force Research Laboratory project, but interestingly, having just trawled through the ISL website and social media, BrainChip and AKIDA are not named anywhere, yet this project is still live. When you read ISL's website it is clear those working there are very much invested in the US Air Force as well as nuclear energy research.

On one occasion only, Rob Telson mentioned nuclear energy, and that was all it was: a mention, with no other details.

Brainchip making every sensor smart even in drones (perhaps???) that are being used to autonomously survey nuclear facilities.

My opinion only DYOR
FF

AKIDA BALLISTA
So we've got the 1000 eyes, who know what we know, and the RoW (Rest of World) who are not getting the message.

I cannot conceive that anyone who knows what we know would not be at least ankle deep in BRN shares, but, apart from Germany, there does not seem to be any general awareness.

I know all about the revenue argument, but that just proves to me the inability of the market to even peek over the rim of the box.


When I bought in in early 2018, I was looking for AI shares, and my research happened upon BRN. At the time, the company was preparing to make a presentation in the US within days, and I contacted several friends to encourage them to get on board asap, because I believed it would explode when the US presentation was made. I even bought small parcels for friends who were uncontactable or lacked the knowhow.

So yes, FOMO played a part in my BRN initiation. But it was anticipatory FOMO and based on research. And we've had plenty of ups and downs since then, and weathered a couple of confidence-shattering dips and come out the other side.

So it looks like we are now only seeing the match approach the blue touch paper.

My question is: if the true believers were capable of seeing the true value of Akida, why are there still under 50k shareholders when, in America alone, there must be over 100 times that number of people with the knowledge to understand the capabilities of Akida and that the AI tsunami is gathering power?

Maybe if Ford had brought out a concept vehicle capable of 10 mpg, with a 60 gallon tank, and featuring "Hey Edsel!"?
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 55 users

Ahboy

Regular
  • Like
  • Fire
  • Sad
Reactions: 16 users
OMG, has everyone seen this?!


Toyota's New ‘Intelligent Assistant' Learns Voice Commands and Gets Smarter Over Time Using Machine Learning​


【Summary】Toyota’s new in-car “Intelligent Assistant” is a highly advanced voice-activated assistant that can learn a driver’s commands and get smarter over time. It’s available for both the Toyota Audio Multimedia and Lexus Interface, including 2022 models like the Toyota Tundra and Lexus NX. Toyota says that the rollout of these intelligent connected services will transform it from an automotive company to a mobility company.​


87_avatar_middle.jpg
Eric Walz Jun 01, 2022 3:30 PM PT





Toyota's New ‘Intelligent Assistant' Learns Voice Commands and Gets Smarter Over Time Using Machine Learning

Once a driver asks for help, Toyota's Intelligent Assistant listens and responds to natural language conversation.


As vehicles come packed with more technology and features, using some of it on the road can become a distraction. So Toyota developed a voice-activated assistant to lend a hand as part of the "Toyota Connected" vehicle services.
Toyota's new in-car "Intelligent Assistant" is a highly advanced voice-activated assistant that can learn a driver's commands and get smarter over time. It's available for both the Toyota Audio Multimedia and Lexus Interface, including 2022 models like the Toyota Tundra and Lexus NX.
Toyota offers Connected Services trials for each vehicle, which includes the services Drive Connect, which offers Intelligent Assistant, Cloud Navigation and Destination Assist.
Toyota says that the rollout of these intelligent connected services will transform it from an automotive company to a mobility company.
Toyota's Intelligent Assist can be activated with the phrase "Hey, Toyota" and supplements the onboard Voice Assistant that's embedded in Toyota vehicles, according to Ryan Oehler, product owner at Toyota Connected, an independent software and innovation company, who leads the team responsible for Toyota's new Intelligent Assistant.
Once a driver asks for help, the Intelligent Assistant listens and responds to natural language conversation. For example, if a driver says "I would like a coffee", the system will bring up a list of nearby coffee shops on the vehicle's infotainment screen, then ask the driver if they want directions to one of them for a midday break.
Suppose a driver is looking for a nearby Starbucks and the assistant brings up multiple listings; the driver can then say "take me to the third one" and the navigation system will launch the turn-by-turn directions.
To make navigation easier, the map can be brought up on the infotainment screen at any time by saying, "Hey, Toyota, show the map." Drivers can also ask the assistant to zoom in on the map for easier reading.
Each request is fulfilled through a complex web of commands and behind-the-scenes processing, both on the vehicle itself and in the cloud.
Oehler said there has been a steady ramp of the Intelligent Assistant to meet both mainstream American customers as well as tech-savvy "power users" with intuitive features for drivers. He also said that even more advanced features are in the pipeline.
Not only does Intelligent Assist listen to a driver's voice using automatic speech recognition (ASR), it also looks at waveforms in the audio. When the driver pushes the talk button in the vehicle, it transcribes that speech to text both in the vehicle's embedded computer and over the cloud.
By analyzing the audio waveforms, "the system is able to understand, phonetically, what that translates to and formulate transcriptions based on that audio input," Oehler said.
The technology can even recognize commands across different accents, dialects and pitches to determine what to do next. Since Intelligent Assistant is available in the U.S., Canada and Mexico, it's designed to recognize commands in English, Spanish and French.
Once a command is transcribed into text, it's processed by Toyota Connected machine learning models in the cloud. The machine learning models can determine the intent of the words being used, from audio commands to windshield wipers or even finding a five-star-rated restaurant nearby.
From there, it's fed back to the vehicle where notes are compared between the embedded and cloud voice assistants to ensure the car is most accurately executing the command the user actually wants.
"This is where the Toyota Connected magic comes in," Oehler said.
The system first determines "top-line intent", which includes common vehicle controls such as temperature, audio and navigation. The next step is to determine "sub-intent", such as a specific temperature, a song title or frequently visited cafe.
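The two-stage flow described above, resolving a broad "top-line intent" before drilling into a "sub-intent", can be sketched roughly as follows. This is an illustrative mock-up only; the trigger phrases, intent names and matching logic are assumptions for the example, not Toyota Connected's actual implementation.

```python
# Hypothetical two-stage intent resolution, loosely modeled on the
# top-line intent / sub-intent split described in the article.
# Trigger phrases and intent labels are illustrative assumptions.

TOP_LINE_INTENTS = {
    "set temperature": "climate",
    "play": "audio",
    "take me to": "navigation",
}

def resolve_intent(utterance: str) -> tuple[str, str]:
    """Return (top_line_intent, sub_intent) for a transcribed command."""
    text = utterance.lower()
    for phrase, intent in TOP_LINE_INTENTS.items():
        if phrase in text:
            # Everything after the trigger phrase becomes the sub-intent slot,
            # e.g. a specific temperature, song title or destination.
            sub_intent = text.split(phrase, 1)[1].strip()
            return intent, sub_intent
    return "unknown", text

print(resolve_intent("Set temperature to 72 degrees"))  # ('climate', 'to 72 degrees')
print(resolve_intent("Take me to the third one"))       # ('navigation', 'the third one')
```

A production system would use the cloud machine-learning models the article mentions rather than substring matching, but the two-level decision structure is the same.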
If a driver requests a certain song while already listening to their favorite streaming platform such as Spotify, the head unit will attempt to find the song on that specific platform. If it's not found, the head unit will switch to the next best option, such as satellite radio. Drivers can also ask Intelligent Assist to play specific genres of music by saying "Hey Toyota, play rock music."
The system gets smarter by looking at accuracy and tweaking commands to add more ways to ask for specific functions, according to Toyota. Drivers can also ask the system to send a text message to a stored contact just by saying the person's name then using their voice to send a message.
"Certain audiences expect to interact in completely different ways," Oehler said. "Younger audiences that are more familiar with their modern voice assistants tend to operate in a more fluid manner, whereas people that are less familiar would say, ‘Navigation' and then ‘Texas' in a series of steps, for instance. To help the user, we've added more contextual things, such as prompts, to help guide them through this new user experience. One of the other important things that we're learning is what our customers want to use on a daily basis."
Intelligent Assistant is already available in the Lexus NX, LX, and Toyota Tundra pickup for the 2022 model year, but will be introduced in more Toyota models for 2023 in North America. As it rolls out to more models, the speed at which the machine learning system gets smarter will only accelerate, according to Toyota.
Among the new features planned is asking the Intelligent Assistant to tell a joke. The sense of humor was added to the assistant's repertoire after the team of engineers observed customers asking for it.
"There's a lot of opportunity," Oehler said. "There's value with a voice assistant because it adds depth to the experience, is smart, and can and likely will strengthen consumer confidence as more people use it and evolve in future iterations."
Using Intelligent Assistant is also optional on Toyota vehicles. For drivers that would rather not use it, the vehicle's embedded Voice Assistant can be used instead for more basic voice functions, such as adjusting the interior temperature.


New to me and so is this
Hey BMW article
Hey BMW Youtube
The latter is dated 2018. WTH... how did they let Mercedes beat them to proclaiming "we are using Akida" first?
 
Last edited:
  • Like
  • Haha
Reactions: 13 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
OMG, has everyone seen this?!






What do you think @MC🐠? Do you think this might all be interrelated?🤞





Screen Shot 2022-06-02 at 5.13.08 pm.png
 
  • Like
  • Fire
  • Love
Reactions: 59 users