BrainChip and Tata Consultancy Services (TCS) Jointly Present a Demonstration of the Akida Neuromorphic Technology Platform at NeurIPS 2019
Akida is available as a licensable IP technology that can be integrated into ASIC devices and will also be available as an integrated SoC. Both are suitable for applications such as surveillance, advanced driver assistance systems (ADAS), autonomous vehicles (AV), vision-guided robotics, drones, augmented and virtual reality (AR/VR), acoustic analysis, and the Industrial Internet of Things (IoT). Akida performs neural processing and memory accesses at the edge, which vastly reduces the computing resources required of the system host CPU. This efficiency not only delivers faster results but also consumes only a tiny fraction of the power of traditional AI processing. Functions like training, learning, and inferencing are orders of magnitude more efficient with Akida.
Incorporating Akida at the sensors for sight, hearing, and touch makes a lot of sense, as does having Akida learn on the fly, customized to each user's situation and needs.
Microsoft Research
@MSFTResearch
Inspired by biological vision, Microsoft researchers paired neuromorphic cameras and representation learning to improve how autonomous systems can reason from visual data. Learn how this leads to better visuomotor policies for drones, deployed in AirSim:
https://aka.ms/AAbfzdy
The Microsoft link above is from March 2021. It discusses spiking neural networks (SNNs), but notes that they still need specialized hardware to reach their full potential.
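To make the SNN idea concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the basic event-driven unit spiking networks are built from. This is an illustrative toy, not BrainChip's or Microsoft's implementation, and all parameter values are invented:

```python
# Toy leaky integrate-and-fire (LIF) neuron: the membrane potential leaks
# toward zero each step, integrates incoming current, and emits a spike
# (then resets) when it crosses a threshold. Neuromorphic chips implement
# this event-driven behavior in hardware; parameters here are arbitrary.
def lif_simulate(input_current, threshold=1.0, leak=0.9):
    """Return a 0/1 spike train for a stream of input currents."""
    v = 0.0                      # membrane potential
    spikes = []
    for i in input_current:
        v = leak * v + i         # leaky integration of the input
        if v >= threshold:       # fire when the potential crosses threshold
            spikes.append(1)
            v = 0.0              # reset after a spike
        else:
            spikes.append(0)
    return spikes

print(lif_simulate([0.3, 0.3, 0.3, 0.3, 0.0, 0.9, 0.9]))
# -> [0, 0, 0, 1, 0, 0, 1]
```

Because neurons only emit events when inputs accumulate past a threshold, downstream computation is sparse, which is the property SNN hardware exploits for low power draw.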
Nine Ways We Use AR and VR on the International Space Station
1. VR control of robots
Pilote, an investigation from ESA (European Space Agency) and France’s National Center for Space Studies (CNES), tests remote operation of robotic arms and space vehicles using VR with interfaces based on haptics, or simulated touch and motion. Results could help optimize the ergonomics of workstations on the space station and future spacecraft for missions to the Moon and Mars.
Pilote compares existing and new technologies, including those recently developed for teleoperation and others used to pilot the Canadarm2 and Soyuz spacecraft. The investigation also compares astronaut performance on the ground and during long-duration space missions. Designs from Earth-based testing use ergonomic principles that do not fit microgravity conditions, which is why testing is performed in space.
3. AR maintenance assists
T2 AR tests using AR to help crew members inspect and maintain the space station's T2 Treadmill. Astronaut Soichi Noguchi of the Japan Aerospace Exploration Agency (JAXA) kicked off the first of a series of tests in April. On future space missions, crew members need to be ready to perform this type of task without assistance from the ground due to significant time delays in communications. AR guidance on complex spacecraft maintenance and repair activities also reduces the time astronauts spend training for and completing such tasks. Acting as a smart assistant, AR applications run on tablets or headsets, interpreting what the camera sees and what a crew member does and suggesting the next step to perform. Crew members can operate the applications by speaking or gesturing as well.
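The "suggest the next step" behavior described above boils down to tracking position in an ordered procedure. Here is a hypothetical sketch of that pattern; the step names and logic are invented for illustration and are not the actual T2 AR application:

```python
# Hypothetical maintenance procedure tracker: given the step a crew member
# just completed, suggest the next one. Step names are invented examples,
# not the real T2 Treadmill procedure.
T2_INSPECTION_STEPS = [
    "power down treadmill",
    "remove access panel",
    "inspect belt tension",
    "replace access panel",
    "power up and run self-test",
]

def next_step(completed_step):
    """Return the step to suggest after the given completed step."""
    idx = T2_INSPECTION_STEPS.index(completed_step)
    if idx + 1 < len(T2_INSPECTION_STEPS):
        return T2_INSPECTION_STEPS[idx + 1]
    return "procedure complete"

print(next_step("remove access panel"))  # -> inspect belt tension
```

In a real AR assistant the `completed_step` input would come from the vision and speech/gesture recognition layer rather than a typed string, but the suggestion logic sits on top of a procedure model like this.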
4. A cool upgrade
NASA’s
Cold Atom Lab (CAL) is the first quantum science laboratory in Earth orbit, hosting experiments that explore the fundamental behaviors and properties of atoms. About the size of a minifridge, it was designed to enable in-flight hardware upgrades. In July, the Cold Atom Lab team successfully demonstrated using an AR headset to assist astronauts with upgrade activities.
5. Time travel
Astronauts need to accurately perceive time and speed of objects in their environment in order to reliably perform tasks. Research shows that our perceptions of time and space overlap, and the speed of the body’s movement may affect time perception. Other factors of spaceflight that can affect time perception include disrupted sleep and circadian rhythms and stress.
6. Just like being there
The ISS Experience is an immersive VR series filmed over multiple months to document different crew activities, from science conducted aboard the station to a spacewalk.
7. Get a better grip
The way humans grip and manipulate an object evolved in the presence of gravity. Microgravity changes the cues we use to control these activities. An ESA investigation, GRIP, studies how spaceflight affects the gripping and manipulation of objects.
8. Controlling movement in microgravity
9. An astronaut’s reach should not exceed their grasp
The study showed a decreased psychosocial stress response and lower levels of anxiety after the virtual training, comparable to what happens after real exercise. (Source: www.nationalheraldindia.com)
Source: www.nasa.gov