BRN Discussion Ongoing

M_C

Founding Member
Do we know if we have any patents coming soon?
According to PvdM (or maybe Sean?) we have 100 this year, I believe? (Same patents in different countries was my impression.)
 
  • Like
  • Fire
  • Love
Reactions: 17 users
It makes me wonder why someone like Renesas would use anything other than Akida for endpoint AI with spiking AI accelerators when they own Akida IP. Does it have anything to do with the extent of the IP licensing agreement, such that they can only use Akida for simpler applications?
I think the answer is to be found in the one word “TRUST”.

Rob Lincourt of Dell Technologies spoke about creating trust with consumers around AI to facilitate its uptake in his BrainChip podcast.

Renesas produce billions of MCUs that are not smart (no AI), and their customer base "trusts" them and has used them for decades.

Renesas are offering something revolutionary to these customers and are looking to build "trust" in, and understanding of, the advantages that an AKIDA-enabled smart MCU and smart sensors will bestow on their products in the marketplace.

Assuming they build this “trust” then they will look to build on that “trust” with bigger and better more widespread application of the AKIDA IP.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Thinking
Reactions: 29 users
According to PvdM (or maybe Sean?) we have 100 this year I believe ? (same patents in different countries was my impression)
We have yet to see the second generation AKIDA LSTM patent/s.

FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 21 users

Violin1

Regular
According to PvdM (or maybe Sean?) we have 100 this year I believe ? (same patents in different countries was my impression)
MC - I think it was 50 to be lodged this year across different countries. Still a good number though; they are aggressively pursuing patents. When you combine that with the lead we have and the price we are charging, it seems inevitable that we'll be ubiquitous.
 
  • Like
  • Fire
Reactions: 17 users

Diogenese

Top 20
Morning all. About to go see the new Maverick movie, but just before I go I thought this quite coincidental and interesting. Paravision have published a white paper,
"Understanding Edge AI and its impact on face recognition"
and within its pages (page 8 of 12) is the exact same picture that is on BrainChip's website. View attachment 8542
View attachment 8543 And there is a video that says "more with less", which is also probably a coincidence...
The BrainChip photo is more recent.
 
  • Like
  • Love
Reactions: 6 users

Bravo

If ARM was an arm, BRN would be its biceps💪!


Hi MrNick,

This is all part of Honda Motor Co.'s very grand plan to develop and build a "collision free" system which uses ADAS sensors and cameras to recognize potential risks in the vehicle’s surroundings and to "connect all road users, both people and mobility products, through telecommunications, making it possible to predict potential risks and help people avoid such risks before collisions actually occur". The timeline is to establish underlying technologies during the first half of the 2020s, then launch practical applications and, with an aim to STANDARDIZE the technology, do so in the second half of the 2020s.

What I really like about this article is that it stipulates that the plan requires AI to detect driving risks and behaviours on a REAL-TIME basis and... and... also that the "Cognitive Assist" communicates risk with visual, tactile and auditory sensations. Now, if that doesn't reek of AKIDA, then I'll be a monkey's uncle. He-he-he!

A lot of the technologies that will be rolled out under this plan are still in R&D phase, but I have no doubt that AKIDA is at the top of the VIP list!

B x🥳





Honda Unveils the World Premiere of Advanced Future Safety Technologies toward the Realization of its Goal for Zero Traffic Collision Fatalities by 2050



November 25, 2021 — TOKYO, Japan


Honda Motor Co., Ltd. today held the world premiere of advanced future safety technologies Honda is currently developing for the realization of a society where everyone sharing the road will be liberated from the risk of traffic collisions and enjoy freedom of mobility with total peace of mind.
Honda will strive to attain its goal of realizing "zero traffic collision fatalities involving Honda motorcycles and automobiles globally by 2050" utilizing two key technologies. One is the world’s first artificial intelligence (AI)-powered "Intelligent Driver-Assistive Technology" providing assistance that is suited to the ability and situation of each individual to reduce driving errors and risks, helping the driver achieve safe and sound driving. The other is the "Safe and Sound Network Technology" which connects all road users, both people and mobility products, through telecommunications, making it possible to predict potential risks and help people avoid such risks before collisions actually occur.


  • Realization of "zero traffic collision fatalities by 2050"
Striving for a collision-free society for everyone sharing the road, represented by the global safety slogan "Safety for Everyone," Honda has been pursuing the research and development of safety technologies from the perspective of both hardware and software.
For the pursuit of a collision-free society, Honda will expand the introduction of Honda SENSING 360, a recently announced omnidirectional safety and driver-assistive system, to all models to go on sale in all major markets by 2030. Moreover, Honda will continue working to expand application of a motorcycle detection function and further enhance functions of its ADAS (advanced driver-assistance system).
Furthermore, Honda also will continue to make progress in expanding application of motorcycle safety technologies and offering of safety education technologies (Honda Safety EdTech). Through these initiatives, Honda will strive to reduce global traffic collision fatalities involving Honda motorcycles and automobiles by half*1 by 2030.
Beyond that, Honda will strive to realize its ambitious goal of "zero traffic collision fatalities by 2050" through establishment of future safety technologies at the earliest possible timing.
  • 1) Safety suited to each individual:
Aiming for "zero human error" in driving with the "Intelligent Driver-Assistive Technology"
  • Honda has unraveled the factors behind human errors through its original fMRI*2-based study of the human brain and analysis of risk-taking behaviors.
  • The system presumes predictors of driving errors based on information obtained through a driver monitoring camera and pattern of the driving operations.
  • This technology is being developed to enable each individual driver to mitigate driving errors and enjoy mobility without any sense of anxiety.
  • Honda will strive for establishment of underlying technologies during the first half of the 2020s, with practical application during the second half of the 2020s.
With the goal to unravel underlying causes of driving errors that make the driver feel anxious, Honda has been conducting research and development of "technologies to understand people" with an original method that utilizes fMRI*2.
In addition to technologies to understand human behavior and conditions, which Honda has amassed to date, the "Intelligent Driver-Assistive Technology" unveiled today, the world’s first such technology, uses ADAS sensors and cameras to recognize potential risks in the vehicle’s surroundings, which enables AI to detect driving risks. At the same time, AI will determine optimal driving behavior on a real-time basis and offer assistance suited to the cognitive state and traffic situations of each individual driver.
With the next-generation driver-assistive functions currently under research and development, Honda will strive to offer the new value of "error-free" safety and peace of mind which are suited to the driving behavior and situation of each individual driver and keep them away from any potential risks.
Three values Honda will offer with its next-generation driver assist technology
  1. No driving operation errors (Operational assist):
    Vehicle offers AI-based assist to reduce drifting and prevent a delay in operations.
  2. No oversight/No prediction errors (Cognitive assist): Vehicle communicates risks with visual, tactile and auditory sensations.
    Technologies in R&D phase: Risk indicator, seatbelt control and 3D audio

  3. No errors due to daydreaming and careless driving (Attentiveness assist): Vehicle helps reduce driver fatigue/drowsiness
    Technologies in R&D phase: Bio feedback/vibration stimulus through the seatback


From here onward, Honda will further advance the "Intelligent Driver-Assistive Technology" unveiled today and continue making progress in development with the goal to establish underlying technologies during the first half of the 2020s, then launch practical applications during the second half of the 2020s.
With this technology, Honda will advance the conventional driver assist which helps the driver avoid risk while it is occurring to the AI-powered driver assist which will keep the driver away from the risks and strive to completely eliminate human errors, which are the cause of over 90% of traffic collisions*3.
  • 2) Safe coexistence of all road users:
Establishment of the "Safe and Sound Network Technology" which connects all road users through telecommunication
  • System understands/recognizes the situation and surrounding environment of each driver and road user
  • Through the communication network, information about potential risks in the traffic environment will be aggregated in the server, and risks are predicted using the reproduction of the traffic environment in the virtual space.
  • System derives the most appropriate support information, communicates it to each road user and encourages them to take actions to avoid potential risk before it actually happens.
  • Honda will accelerate industrywide and public-private collaboration with an aim to standardize the technology in the second half of the 2020s.


To realize a "collision-free" mobility society for all road users, Honda is striving to realize a "cooperative safety society" where utilization of telecommunication technologies will enable everyone sharing the road to be connected and coexist.
With the "Safe and Sound Network Technology," information about potential risks in the traffic environment, which are detected based on information obtained from roadside cameras, on-board cameras and smartphones, will be aggregated in the server to reproduce that traffic environment in the virtual space. In that virtual space, in consideration of the conditions and characteristics of each individual road user, the system predicts/simulates the behaviors of road users at high risk of a collision. Then, the system derives the most appropriate support information to help the road users avoid risks.
Such support information will be communicated intuitively to automobile drivers, motorcycle riders and pedestrians through "cooperative risk HMI (human-machine interface)," which will make it possible for the system to encourage road users to take action to avoid a collision before it happens.
Aiming for real-world implementation of this technology after 2030, Honda will build the system and complete verification of effectiveness in the first half of the 2020s, then accelerate industrywide and public-private collaboration with an aim to standardize the technology in the second half of the 2020s.
  • Comments by Keiji Ohtsu, President and Representative Director of Honda R&D Co., Ltd.:
"Striving to completely eliminate mobility risks for everyone sharing the road, Honda will offer safety and peace of mind of each and every road user as a new value. Applying our future safety technologies which will embody such new value, Honda will work toward the realization of ‘zero traffic collision fatalities’ involving Honda motorcycles and automobiles globally by 2050. For the realization of a collision-free society where all road users care for each other and the freedom of mobility becomes possible, we will further accelerate our industry-wide and public-private initiatives."
*1 Reduce traffic collision fatalities involving Honda motorcycles and automobiles per 10,000 units sold by 50% by 2030 compared to 2020.
*2 Functional magnetic resonance imaging (fMRI): one of the methods to obtain images of the brain’s functioning areas based on changes in blood flow.
*3 Source: "Number of Fatalities in Traffic Accidents By Type of Violations of Laws," White Paper on Traffic Safety in Japan 2017
About Honda
Honda offers a full line of clean, safe, fun and connected vehicles sold through more than 1,000 independent U.S. Honda dealers. Honda has the highest fleet average fuel economy and lowest CO2 emissions of any major full-line automaker in America, according to the latest data from the U.S. Environmental Protection Agency (EPA). The Honda lineup includes the Civic, Insight, Accord and Clarity series passenger cars, along with the HR-V, CR-V, Passport and Pilot sport utility vehicles, the Ridgeline pickup and the Odyssey minivan.
Honda has been producing automobiles in America for 38 years and currently operates 18 major manufacturing facilities in North America. In 2020, more than 95 percent of all Honda vehicles sold in the U.S. were made in North America, using domestic and globally sourced parts.
More information about Honda is available in the Digital Fact Book.
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 38 users

Diogenese

Top 20
It makes me wonder why someone like Renesas would use anything other than Akida for endpoint AI with spiking AI accelerators when they own Akida IP. Does it have anything to do with the extent of the IP licensing agreement, such that they can only use Akida for simpler applications?
Hi TFM,

Renesas do have their own in-house brand of AI called DRP-AI (Dynamically Reconfigurable Processor AI), which they have worked on for a decade.

The recent Renesas announcement did disclose that they were using Akida in very low power products (battery operated?), but it seems they are set on using their DRP-AI wherever they can, even though Akida would give better power consumption and speed.

That's why I don't see Renesas using Akida with a 64-bit processor any time soon, even though it could be used to advantage as an image data accelerator as an input for the 64-bit von Neumann processor. However, competitive pressures may alter that.
 
  • Like
  • Love
  • Fire
Reactions: 30 users
[Quoting Bravo's Honda post above in full]





Think Small (Nanoscale)


Creating a nano-ribbon. (Image: Honda)

While on the subject of things you may not know. . .

Honda Research Institute USA (HRI-US) scientists have synthesized ribbon-shaped two-dimensional materials that are on the order of 7 to 8 nanometers wide.

While it probably isn’t too helpful in terms of understanding what that means:

A nanometer is one-billionth of a meter.

Apparently, the approach the HRI-US scientists--who collaborated with those from Columbia and Rice Universities and the Oak Ridge National Laboratory--results in a narrower structure than can be obtained by more commonly used methods (e.g., nanolithography).

The point?

Quantum electronics.

The use?

“The potential applications are extremely broad,” said Dr. Avetik Harutyunyan, senior chief scientist at HRI-US. “We see immediate opportunities for applications in high-speed, low-energy-consumption electronics, spintronics, quantum sensing, and quantum and neuromorphic computing.”

Neuromorphic computing?

According to the folks at Intel, who are developing neuromorphic chips, “Guided by the principles of biological neural computation, neuromorphic computing uses new algorithmic approaches that emulate how the human brain interacts with the world to deliver capabilities closer to human cognition.”

Of course.

All of which is to say that there’s a whole lot going on at Honda besides developing Accords.
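For anyone curious what "emulating biological neural computation" looks like in practice, below is a minimal, illustrative leaky integrate-and-fire neuron in Python: the basic spiking model that neuromorphic hardware is generally built around. All parameters here are arbitrary and chosen for readability; this is a sketch of the concept, not any vendor's implementation.

```python
# Minimal leaky integrate-and-fire (LIF) neuron, for illustration only.
# The neuron accumulates input current, "leaks" part of its membrane
# potential each step, and emits a spike when the threshold is crossed.

def lif_spikes(inputs, leak=0.9, threshold=1.0):
    """Return a 0/1 spike train for a sequence of input currents."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current   # integrate with leak
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0                      # reset after firing
        else:
            spikes.append(0)
    return spikes

# A steady sub-threshold input still fires once enough charge accumulates:
print(lif_spikes([0.4] * 6))  # → [0, 0, 1, 0, 0, 1]
```

A neuron that stays silent until accumulated input crosses a threshold, then fires and resets, is the event-driven behaviour that lets neuromorphic chips spend power only when spikes actually occur, rather than on every clock cycle.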
 
  • Like
  • Fire
Reactions: 13 users

MrNick

Regular
Hi MrNick,

This is all part of Honda Motor Co's very grand plan to develop and build a "collision free" system which uses ADAS sensors and cameras to recognize potential risks in the vehicle’s surroundings and to "connect all road users, both people and mobility products, through telecommunications, making it possible to predict potential risks and help people avoid such risks before collisions actually occur". They have a timeline to establish underlying technologies during the first half of the 2020s, then launch practical applications during the second half of the 2020s with an aim to STANDARDIZE the technology in the second half of the 2020s.

What I really like about this article is that it stipulates that the plan requires AI to detect driving risks and behaviours on a REAL-TIME basis and...and...and..also, that the "Cognitive Assist" communicates risk with visual, tactile and auditory sensation. Now, if that doesn't reek of AKIDA, then I'll be a monkey's uncle. He-he-he!

A lot of the technologies that will be rolled out under this plan are still in R&D phase, but I have no doubt that AKIDA is at the top of the VIP list!

B x🥳





Honda Unveils the World Premiere of Advanced Future Safety Technologies toward the Realization of its Goal for Zero Traffic Collision Fatalities by 2050​



November 25, 2021 — TOKYO, Japan


Honda Motor Co., Ltd. today held the world premiere of advanced future safety technologies Honda is currently developing for the realization of a society where everyone sharing the road will be liberated from the risk of traffic collisions and enjoy freedom of mobility with total peace of mind.
Honda will strive to attain its goal of realizing "zero traffic collision fatalities involving Honda motorcycles and automobiles globally by 2050" utilizing two key technologies. One is the world’s first artificial intelligence (AI)-powered "Intelligent Driver-Assistive Technology" providing assistance that is suited to the ability and situation of each individual to reduce driving errors and risks, helping the driver achieve safe and sound driving. The other is the "Safe and Sound Network Technology" which connects all road users, both people and mobility products, through telecommunications, making it possible to predict potential risks and help people avoid such risks before collisions actually occur.
b8c9cf888fb56589ed18617e70a864770a34c0c2


  • Realization of "zero traffic collision fatalities by 2050"
Striving for a collision-free society for everyone sharing the road, represented by the global safety slogan "Safety for Everyone," Honda has been pursuing the research and development of safety technologies from the perspective of both hardware and software.
For the pursuit of a collision-free society, Honda will expand the introduction of Honda SENSING 360, a recently announced omnidirectional safety and driver-assistive system, to all models to go on sale in all major markets by 2030. Moreover, Honda will continue working to expand application of a motorcycle detection function and further enhance functions of its ADAS (advanced driver-assistance system).
Furthermore, Honda also will continue to make progress in expanding application of motorcycle safety technologies and offering of safety education technologies (Honda Safety EdTech). Through these initiatives, Honda will strive to reduce global traffic collision fatalities involving Honda motorcycles and automobiles by half*1 by 2030.
Beyond that, Honda will strive to realize its ambitious goal of "zero traffic collision fatalities by 2050" through establishment of future safety technologies at the earliest possible timing.
  • 1) Safety suited to each individual:
Aiming for "zero human error" in driving with the "Intelligent Driver-Assistive Technology"
  • Honda has unraveled the factors behind human errors through its original fMRI*2-based study of the human brain and analysis of risk-taking behaviors.
  • The system presumes predictors of driving errors based on information obtained through a driver monitoring camera and pattern of the driving operations.
  • This technology is being developed to enable each individual driver to mitigate driving errors and enjoy mobility without any sense of anxiety.
  • Honda will strive for establishment of underlying technologies during the first half of the 2020s, with practical application during the second half of the 2020s.
With the goal to unravel underlying causes of driving errors that make the driver feel anxious, Honda has been conducting research and development of "technologies to understand people" with an original method that utilizes fMRI*2.
In addition to technologies to understand human behavior and conditions, which Honda has amassed to date, the "Intelligent Driver-Assistive Technology" unveiled today, the world’s first such technology, uses ADAS sensors and cameras to recognize potential risks in the vehicle’s surroundings, which enables AI to detect driving risks. At the same time, AI will determine optimal driving behavior on a real-time basis and offer assistance suited to the cognitive state and traffic situations of each individual driver.
With the next-generation driver-assistive functions currently under research and development, Honda will strive to offer the new value of "error-free" safety and peace of mind which are suited to the driving behavior and situation of each individual driver and keep them away from any potential risks.
Three values Honda will offer with its next-generation driver assist technology
  1. No driving operation errors (Operational assist):
    Vehicle offers AI-based assist to reduce drifting and prevent a delay in operations.
  2. No oversight/No prediction errors (Cognitive assist): Vehicle communicates risks with visual, tactile and auditory sensations.
    Technologies in R&D phase: Risk indicator, seatbelt control and 3D audio

  3. No errors due to daydreaming and careless driving (Attentiveness assist): Vehicle helps reduce driver fatigue/drowsiness
    Technologies in R&D phase: Bio feedback/vibration stimulus through the seatback


From here onward, Honda will further advance the "Intelligent Driver-Assistive Technology" unveiled today and continue making progress in development with the goal to establish underlying technologies during the first half of the 2020s, then launch practical applications during the second half of the 2020s.
With this technology, Honda will advance from the conventional driver assist, which helps the driver avoid a risk while it is occurring, to AI-powered driver assist, which will keep the driver away from risks, and strive to completely eliminate human errors, which are the cause of over 90% of traffic collisions*3.
  • 2) Safe coexistence of all road users:
Establishment of the "Safe and Sound Network Technology" which connects all road users through telecommunication
  • System understands/recognizes the situation and surrounding environment of each driver and road user
  • Through the communication network, information about potential risks in the traffic environment will be aggregated in the server, and risks are predicted using the reproduction of the traffic environment in the virtual space.
  • System derives the most appropriate support information, communicates it to each road user and encourages them to take actions to avoid potential risk before it actually happens.
  • Honda will accelerate industrywide and public-private collaboration with an aim to standardize the technology in the second half of the 2020s.


To realize a "collision-free" mobility society for all road users, Honda is striving to realize a "cooperative safety society" where utilization of telecommunication technologies will enable everyone sharing the road to be connected and coexist.
With the "Safe and Sound Network Technology," information about potential risks in the traffic environment, which are detected based on information obtained from roadside cameras, on-board cameras and smartphones, will be aggregated in the server to reproduce that traffic environment in the virtual space. In that virtual space, in consideration of the conditions and characteristics of each individual road user, the system predicts/simulates the behaviors of road users at high risk of a collision. Then, the system derives the most appropriate support information to help the road users avoid risks.
Such support information will be communicated intuitively to automobile drivers, motorcycle riders and pedestrians through "cooperative risk HMI (human-machine interface)," which will make it possible for the system to encourage road users to take action to avoid a collision before it happens.
Aiming for real-world implementation of this technology after 2030, Honda will build the system and complete verification of effectiveness in the first half of the 2020s, then accelerate industrywide and public-private collaboration with an aim to standardize the technology in the second half of the 2020s.
  • Comments by Keiji Ohtsu, President and Representative Director of Honda R&D Co., Ltd.:
"Striving to completely eliminate mobility risks for everyone sharing the road, Honda will offer safety and peace of mind to each and every road user as a new value. Applying our future safety technologies which will embody such new value, Honda will work toward the realization of ‘zero traffic collision fatalities’ involving Honda motorcycles and automobiles globally by 2050. For the realization of a collision-free society where all road users care for each other and the freedom of mobility becomes possible, we will further accelerate our industry-wide and public-private initiatives."
*1 Reduce traffic collision fatalities involving Honda motorcycles and automobiles per 10,000 units sold by 50% by 2030 compared to 2020.
*2 fMRI (functional magnetic resonance imaging): one of the methods to obtain images of the brain’s functioning areas based on changes in blood flow.
*3 Source: "Number of Fatalities in Traffic Accidents By Type of Violations of Laws," White Paper on Traffic Safety in Japan 2017
About Honda
Honda offers a full line of clean, safe, fun and connected vehicles sold through more than 1,000 independent U.S. Honda dealers. Honda has the highest fleet average fuel economy and lowest CO2 emissions of any major full-line automaker in America, according to the latest data from the U.S. Environmental Protection Agency (EPA). The Honda lineup includes the Civic, Insight, Accord and Clarity series passenger cars, along with the HR-V, CR-V, Passport and Pilot sport utility vehicles, the Ridgeline pickup and the Odyssey minivan.
Honda has been producing automobiles in America for 38 years and currently operates 18 major manufacturing facilities in North America. In 2020, more than 95 percent of all Honda vehicles sold in the U.S. were made in North America, using domestic and globally sourced parts.
More information about Honda is available in the Digital Fact Book.
That’s some Sunday reading I can make time for 👍🏻
 
  • Like
Reactions: 4 users

Diogenese

Top 20
I’m not actually sure what you’re implying here, other than calling me an Ogre for asking for your valued opinion?

And if you were implying that Akida is most definitely not a part of this chip because they are using an Andes RISC-V processor, then you will have to elaborate, as I don’t think that excludes Akida at all.

I’m not saying that it has to be Akida; I’m simply stating it would be a very good fit, especially considering the connections around the chip.

Wasn’t it Sean Hehir who recently stated how important it is to have partnerships like Arm and SiFive to give confidence to the industry that Akida works very well with all these standard processors?

I get they haven’t mentioned Akida by name, but if they own the IP I guess they don’t really have to either?

Anyways, if you think it’s such a rubbish suggestion I will take your word for it. Oh mighty tech lord of the tsx
Hi TFM,

The origin of the ogre was that after I had rebutted several suggestions that "Akida is here" over at the other place, @MC and @Fact Finder got into a huddle in a corner muttering darkly about my relentless negativity, and I drew the conclusion that they viewed me as the Akida-eating ogre. So really the ogre is a self-parody.
 
  • Like
  • Haha
  • Love
Reactions: 50 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
"Soaring Otter"... Who comes up with these names? 🤭





Posted Feb 19 2021, 11:11 am

Soaring Otter will be a one step, Closed BAA to advance, evaluate and mature Air Force autonomy capabilities, leveraging the latest advancements in both the fundamental science of autonomy and Machine Learning (ML) and the most modern computing technologies designed to support them.

The Air Force (AF) is increasingly employing the science of autonomy to solve complex problems related to global persistent awareness, resilient information sharing, and rapid decision making. Current autonomy approaches include a growing spectrum of techniques ranging from the well-understood to the truly novel: ML, Artificial Intelligence (AI), many varieties of Neural Networks, Neuromorphic Computing, Data Exploitation and others. Together with the rapid progress of autonomy algorithms and methodologies is the equally rapid progress of hardware and software designed to support the efficient execution of autonomy. These computing solutions bring new capabilities, but also new challenges, including how best to develop applications with them, and how to integrate them into larger systems. The application space for autonomy is rapidly growing, with critical technologies like target identification and recognition, Positioning, Navigation and Timing (PNT) and Unmanned Aircraft Systems (UAS) route planning. Finally, how best to integrate and test these new solutions within reasonable constraints of cost and risk is still not well understood, and there is need for a well-defined progression from lab prototype, through realistic System Integration Lab (SIL) testing, finally through field and flight testing for Technology Readiness Level (TRL) increase.

The scope includes the following seven main topic areas:

Autonomy Development and Testing: Develop novel approaches to solving autonomy problems using the latest techniques in ML, neural networks, AI and other fields. Constantly seek to leverage the newest developments from both government and industry; mature existing approaches toward greater levels of robustness and determine early what is required for the eventual successful transition of these autonomy technologies to the warfighter.

Evaluation of Autonomy Capabilities: Provide neutral 3rd party evaluation of algorithms from Government, Academia and Industry. Provide unbiased analysis of alternatives for algorithms being produced by the Government, Industry and Academia to provide actionable information to AFRL about which algorithms are performing best against objective criteria, as well as determine which solutions are most ready for maturation and integration into systems. Design and perform trade studies to identify best-of-breed solutions and make recommendations to the Government for their application and further maturation.

Novel Computing Approaches: This area will focus on compact computing solutions that push processing to the edge for real or near real time solutions to support the warfighter. Assess the latest emerging computing architectures from government and industry, together with the latest approaches to efficiently developing applications using these technologies.

New Application Spaces: Evaluate emerging Air Force priorities and user requirements, to determine where autonomy can bring the greatest benefit, focusing on Intelligence, Surveillance, and Reconnaissance (ISR).

Open System Architectures for Autonomy: Assess existing and emerging Open System Architectures (OSAs) as fundamental elements of future autonomous systems.

Autonomy Technology Integration and Testing: Plan and execute paths by which new autonomy technologies can be rapidly integrated into larger systems for lab, SIL and field/flight testing.

Maturing System Support: Plan and execute technology transition and system transition activities for operational partners. System deployment support and participation, system integration, testing, and assessment support activities.

Additionally, this BAA will address the issues both individually and collectively. No R&D conducted under this program will be done in isolation, but rather in full consideration of how the new technologies can progress toward full integration with large, complex systems, ready to transition to support of the warfighter.

 
  • Like
  • Fire
  • Haha
Reactions: 16 users
Sounds like Bosch are doing the sensors in the carpark and they are demonstrating interoperability of their two different systems.

It makes sense that carparks will have a system that is vehicle manufacturer agnostic.

My opinion only DYOR
FF

AKIDA BALLISTA

Just saw @MrNick’s LinkedIn post based upon your call to arms post yesterday @Fact Finder

Milind Joshi liked the post. For those of you not aware of Joshi, he is a patent lawyer and BrainChip’s Intellectual Property Officer.

Another post liked by Joshi is ParkON.ai - a car parking solution start up in India

Most likely it is just a mate supporting his old Samsung India colleague; however, there have been a few car park-related posts recently.

ParkON seems a bit low-tech but could be one to keep an eye on 🤷🏼‍♂️

 
  • Like
  • Fire
  • Love
Reactions: 22 users

Diogenese

Top 20
"Soaring Otter"... Who comes up with these names? 🤭

Pigs Might Fly!
 
  • Haha
  • Like
Reactions: 15 users
That’s some Sunday reading I can make time for 👍🏻

More Honda Research Institute reading @MrNick and @Diogenese if you’re interested

Link


Chiho Choi, Joon Hee Choi, Jiachen Li, Srikanth Malla

Shared Cross-Modal Trajectory Prediction for Autonomous Driving​


IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (Oral Presentation) 2021
June, 2021 - Abstract
Ran Tian, Liting Sun, Masayoshi Tomizuka, David Isele.

Anytime Game-Theoretic Planning with Safe and Active Information Gathering on Humans’ Latent States for Human-Centered Robots​


International Conference on Robotics and Automation (ICRA) 2021
May, 2021 - Abstract
Srikanth Malla, Chiho Choi, Behzad Dariush

Social-STAGE: Spatio-Temporal Multi-Modal Future Trajectory Forecast​


IEEE International Conference on Robotics and Automation (ICRA) 2021
May, 2021 - Abstract
Xiaobai Ma, Jiachen Li, Mykel J. Kochenderfer, David Isele, Kikuo Fujimura

Reinforcement Learning for Autonomous Driving with Latent State Inference and Spatial-Temporal Relationships​


International Conference on Robotics and Automation (ICRA) 2021
May, 2021 - Abstract
Pete Trautman, Karankumar Patel.

Real-Time Crowd Navigation from First Principles of Probability Theory​


Conference on Automated Planning and Scheduling (ICAPS) 2020
December, 2020 - Abstract
Jiachen Li, Fan Yang, Masayoshi Tomizuka, Chiho Choi

EvolveGraph: Multi-Agent Trajectory Prediction with Dynamic Relational Reasoning​


Advances in Neural Information Processing Systems (NeurIPS) 2020
December, 2020 - Abstract
Dhruv Mauria Saxena, Sangjae Bae, Alireza Nakhaei, Kikuo Fujimura, Maxim Likhachev

Driving in Dense Traffic with Model-Free Reinforcement Learning​


International Conference on Robotics and Automation (ICRA) 2020
November, 2020 - Abstract
Chiho Choi, Srikanth Malla, Abhishek Patil, Joon Hee Choi

DROGON: A Trajectory Prediction Model based on Intention-Conditioned Behavior Reasoning​


Conference on Robot Learning (CoRL) 2020
November, 2020 - Abstract
Karankumar Patel, Soshi Iba, Nawid Jamali

Deep Tactile Experience: Estimating Tactile Sensor Output from Depth Sensor Data​


IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2020
October, 2020 - Abstract
Daniel Seita, Aditya Ganapathi, Ryan Hoque, Minho Hwang, Edward Cen, Ajay Kumar Tanwani, Ashwin Balakrishna, Brijen Thananjeyan, Jeffrey Ichnowski, Nawid Jamali, Katsu Yamane, Soshi Iba, John Canny, Ken Goldberg

Deep Imitation Learning of Sequential Fabric Smoothing From an Algorithmic Supervisor​


IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2020
October, 2020 - Abstract
Xingwei Wu, Coleman Merenda, Teruhisa Misu, Kyle Tanous, Chihiro Suga, and Joseph L. Gabbard.

Drivers' Attitudes and Perceptions Towards A Driving Automation System with Augmented Reality Human-Machine Interfaces​


IEEE Intelligent Vehicles Symposium (IV) 2020
October, 2020 - Abstract
Hyungil Kim, Sujitha Martin, Ashish Tawari, Teruhisa Misu , Joseph L. Gabbard

Toward Real-Time Estimation of Driver Situation Awareness: An Eye-tracking Approach based on Moving Objects of Interest​


IEEE Intelligent Vehicles Symposium (IV) 2020 pp. 1978-1983
October, 2020 - Abstract
 
  • Like
  • Fire
Reactions: 9 users

Terroni2105

Founding Member
Ok, I never paid attention to the ogre posts that he posts on occasion. I realise now that I have acted like an ogre and fully deserve my own definition of the meme 🤣
No longer an ogre virgin 😉 believe me you are in good company. Please keep up your great contributions TFM.
 
  • Like
  • Haha
  • Love
Reactions: 21 users
Hi TFM,

The origin of the ogre was that after I had rebutted several suggestions that "Akida is here" over at the other place, @MC and @Fact Finder got into a huddle in a corner muttering darkly about my relentless negativity, and I drew the conclusion that they viewed me as the Akida-eating ogre. So really the ogre is a self-parody.
You’re wrong, I like my definition better. I even have one where the Ogre represents repressed homosexuality, which I share with Professors of English Literature. 😂🤣😎

FF



AKIDA BALLISTA
 
  • Haha
  • Wow
  • Like
Reactions: 13 users

Diogenese

Top 20
More Honda Research Institute reading @MrNick and @Diogenese if you’re interested

Link
Thanks tls,

Could I've a hint?

Usual clues are:
digital spiking neural network;
JAST;
STDP with JAST;
on-chip learning;
Akida;
BrainChip.
 
  • Like
  • Love
  • Haha
Reactions: 10 users
Thanks tls,

Could I've a hint?

Usual clues are:
digital spiking neural network;
JAST;
STDP with JAST;
on-chip learning;
Akida;
BrainChip.

I’ll review and narrow the search
 
  • Like
  • Fire
  • Haha
Reactions: 5 users

Diogenese

Top 20
Your wrong I like my definition better I even have one where the Ogre represents repressed homosexuality which I share with Professors of English Literature. 😂🤣😎

FF



AKIDA BALLISTA
I should hasten to point out that, while it may be a reasonable facsimile, it in no way accords with my self-image.
 
  • Haha
  • Like
Reactions: 10 users

Diogenese

Top 20
"Soaring Otter"... Who comes up with these names? 🤭

I'm sure @Yak52 will correct me, but there was a de Havilland Twin Otter propeller aircraft used by regional airlines 40 years or so ago.
 
  • Like
  • Love
  • Fire
Reactions: 5 users