“For example, terrestrial neuromorphic processors such as Intel Corporation's Loihi™, BrainChip's Akida™, and Google Inc.'s Tensor Processing Unit (TPU™) require full host processors for integration with their software development kit (SDK) that are power hungry or limit throughput. This by itself is inhibiting the use of neuromorphic processors for low-SWaP space missions.”

Hi D,
Thanks, I must have missed that SBIR. That explains why there's been talk about not needing a processor.
The timing of the SBIR and the announcement of our 22nm AKD1500 GF reference chip does align very nicely.
For anyone else that missed it:
Release Date: January 10, 2023
Open Date: January 10, 2023
Application Due Date: March 13, 2023
Close Date: March 13, 2023 (closing in 29 days)
The preference is for a prototype processor fabricated in a technology node suitable for the space environment, such as 22-nm FDSOI, which has become increasingly affordable.
Neuromorphic and deep neural net software for point applications has become widespread and is state of the art. Integrated solutions that achieve space-relevant mission capabilities with high throughput and energy efficiency are a critical gap. For example, terrestrial neuromorphic processors such as Intel Corporation's Loihi™, BrainChip's Akida™, and Google Inc.'s Tensor Processing Unit (TPU™) require full host processors for integration with their software development kit (SDK) that are power hungry or limit throughput. This by itself is inhibiting the use of neuromorphic processors for low-SWaP space missions.
Australian Space TV, he is a Neural-Spiker! I believe the first published (non-secret) SNN application in space started here.
Good video, yeah Australia!
Same thing, you don't need big data to train SNN edge AI on Industrial patterns, ultra-low power.
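To make the "no big data" point concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron that learns a pattern from a single example via a one-shot weight update. This is illustrative only; it is not BrainChip's actual edge-learning algorithm, and all names here are made up for the sketch.

```python
# Toy LIF neuron with one-shot learning: an SNN-style unit can lock onto a
# pattern from a single example rather than a large training set.
# Purely illustrative -- NOT BrainChip's Akida algorithm.

class LIFNeuron:
    def __init__(self, n_inputs, threshold=1.0, leak=0.9):
        self.w = [0.0] * n_inputs  # synaptic weights
        self.v = 0.0               # membrane potential
        self.threshold = threshold
        self.leak = leak           # potential decay per timestep

    def step(self, spikes):
        """Integrate one timestep of binary input spikes; True means 'fired'."""
        self.v = self.leak * self.v + sum(w * s for w, s in zip(self.w, spikes))
        if self.v >= self.threshold:
            self.v = 0.0           # reset after a spike
            return True
        return False

    def learn_once(self, pattern):
        """One-shot update: potentiate exactly the synapses the pattern hit."""
        active = max(sum(pattern), 1)
        self.w = [p / active for p in pattern]  # matching input reaches threshold

neuron = LIFNeuron(n_inputs=8)
pattern = [1, 0, 1, 1, 0, 0, 1, 0]
neuron.learn_once(pattern)                      # a single example, no big data

fired = neuron.step(pattern)                    # matching pattern -> spike
neuron.v = 0.0                                  # clear potential between trials
missed = neuron.step([0, 1, 0, 0, 1, 1, 0, 1])  # disjoint pattern -> silent
print(fired, missed)                            # True False
```

Event-driven designs like this only do work when spikes arrive, which is the intuition behind the ultra-low-power claim.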
“For example, terrestrial neuromorphic processors such as Intel Corporation's Loihi™, BrainChip's Akida™, and Google Inc.'s Tensor Processing Unit (TPU™) require full host processors for integration with their software development kit (SDK) that are power hungry or limit throughput. This by itself is inhibiting the use of neuromorphic processors for low-SWaP space missions.”
Am I reading this wrong? Are they saying that Akida needs a full host processor for integration with its SDK? I thought one of the main selling points of Akida is that it doesn’t need a host processor and can do all the computation alone?
Sitting in a tree, watching the Super Bowl. Babyface, the old music group, just did a bit live. Fantastic.

Hi Funky, what about SiFive + BrainChip sitting in a tree, K I S S I N G?
Extract 1
NASA, Microchip, and SiFive Announce Partnership for RISC-V Spaceflight Computing Platform
Designed to replace existing systems still using a processor design from 1997, the RISC-V-powered chip will offer 100 times the performance.
www.hackster.io
Extract 2
BrainChip and SiFive partner on next-generation edge AI chips
BrainChip and SiFive have combined the former's Akida architecture with the latter's RISC-V processors to bring AI and ML to the edge.
www.edgecomputing-news.com
How about some of the huge transactions on the market so far.
Is that distant drums I hear?
Yeah, they appear to have lumped all the chips listed into the same basket. Akida only needs an external pre-processor to load the initial conditions, i.e. the trained weights and other configuration. Then all processing is done within the Akida Neural Fabric.

“For example, terrestrial neuromorphic processors such as Intel Corporation's Loihi™, BrainChip's Akida™, and Google Inc.'s Tensor Processing Unit (TPU™) require full host processors for integration with their software development kit (SDK) that are power hungry or limit throughput. This by itself is inhibiting the use of neuromorphic processors for low-SWaP space missions.”
Am I reading this wrong? Are they saying that Akida needs a full host processor for integration with its SDK? I thought one of the main selling points of Akida is that it doesn’t need a host processor and can do all the computation alone?
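The distinction being drawn here is between a host that sits in the inference loop and a host that is only needed once, for setup. A minimal sketch of that "configure once, then run standalone" flow, with hypothetical stand-in names (this is not the real Akida SDK/MetaTF API):

```python
# Toy sketch of the deployment flow: a host or simple pre-processor touches
# the device once, to load trained weights and configuration into the neural
# fabric; every inference after that runs with no host in the loop.
# Class and method names are hypothetical stand-ins, not the Akida SDK API.

class NeuralFabric:
    """Stands in for the on-chip fabric: configured once, then self-contained."""
    def __init__(self):
        self.weights = None

    def load_configuration(self, weights):
        # The only step that needs the external host / pre-processor.
        self.weights = weights

    def infer(self, x):
        # Pure on-fabric computation: no host involvement per inference.
        if self.weights is None:
            raise RuntimeError("fabric must be configured before inference")
        scores = [sum(w * xi for w, xi in zip(row, x)) for row in self.weights]
        return scores.index(max(scores))

# --- host side, runs once (e.g. at boot) ---
fabric = NeuralFabric()
fabric.load_configuration([[0.9, 0.1],   # weights produced offline by the SDK
                           [0.2, 0.8]])

# --- from here on the host could idle; the fabric answers on its own ---
print(fabric.infer([1.0, 0.0]))  # 0
print(fabric.infer([0.0, 1.0]))  # 1
```

Under that reading, the SBIR text's "require full host processors" objection applies to the development/configuration toolchain, not to each inference.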
Sounds promising!

FEBRUARY 2, 2023
Next-gen Siri: The future of personal assistant AI
What advancements and features could make Siri a more powerful personal assistant in the future?
With the rapid advancements in artificial intelligence, it’s no surprise that many users are looking forward to what the next generation of Siri personal assistance will have to offer. From improved emotional recognition to autonomous management, the possibilities are endless. But what exactly are people looking for in the next-gen Siri?
One of the most requested features is improved contextual language capabilities. At the moment, a lack of such capabilities can make it difficult for some users to have a smooth conversation with their assistants. By incorporating more advanced voice recognition technologies, Siri could better understand contexts and intentions.
Another highly requested feature is the ability to multitask. Currently, Siri can only handle one task at a time, which can be frustrating for users who want to accomplish multiple things at once. The incorporation of multitasking could enable the assistant to handle complex requests simultaneously, thereby improving efficiency.
Contextual language
Many users are asking for Siri to have improved natural language processing capabilities. This would allow for more seamless conversations with the AI, as it would be able to understand more complex and nuanced requests. This would also make it easier for users to ask for specific information, as Siri would be able to understand more context.
Machine learning (ML)
Users have expressed a wish for Siri to become more proactive. With the increasing popularity of smart home devices, users also request that Siri integrates with daily habits and routines, improving awareness of the different spaces and rooms using machine learning (ML).
This could include autonomous actions like sending reminders, providing updates, asking for deliveries in perfect timing, optimizing e-vehicle charging, watering gardens only when needed, and even making suggestions based on the user’s behavior, or external data like weather forecasts and traffic conditions. This would make Siri a more helpful personal assistant that could anticipate needs, making the home itself more proactive.
Top breakthroughs in AI: what to expect
- Natural Language Processing (NLP): the ability to understand and interpret human language, allowing for more accurate and natural dialogue;
- Emotion detection: the ability to detect and respond to human emotions, allowing for more personalized and empathetic interactions;
- Machine learning (ML): a method of teaching AI through data and experience, allowing it to remember, adapt, and improve over time;
- Contextual understanding: the ability to understand and respond to the context of a conversation or request, providing more accurate and relevant results, answers, and actions;
- Explainable AI: the ability to analyze complex data and scenarios, providing clear explanations and the best options for decision-making processes, increasing transparency and trust;
- Autonomous awareness: the ability to connect and control multiple devices directly, creating a seamless awareness environment;
- Predictive analytics: in the future, Siri will be able to analyze data and predict future events, allowing for proactive problem-solving over the “Internet of Things” (IoT) without human interference;
- Computer vision: the ability to interpret and understand visual data, such as images or video, to improve image recognition and object detection, acting accordingly;
- Autonomous services: the integration with robotics, or automated systems (drone delivery, lawn mowing, vacuum cleaning, pool maintenance, etc) and third-party services to improve the home’s efficiency.
The next generation of Siri has the potential to revolutionize the way we interact with AI with advancements in integration capabilities. Siri could definitely become part of the family.
Stay tuned to AppleMagazine for more updates in relation to the latest advancements in personal assistants and artificial intelligence.
Next-gen Siri: The Future Of Personal Assistant AI - AppleMagazine
applemagazine.com
You would hope that BRN management have been banging down their door showing them what Akida could do to improve Siri.
I'll switch from Google to Apple if they implement Akida. Unless Google decides they need us too.
Almost reads like NASA had trouble comprehending the autonomous nature of the Akida NN in performing inference and ML.

Yeah, they appear to have lumped all the chips listed into the same basket. Akida only needs an external pre-processor to load the initial conditions, i.e. the trained weights and other configuration. Then all processing is done within the Akida Neural Fabric.