BRN Discussion Ongoing

Bloodsy

Regular
Prepare yourselves for a rabbit hole :ROFLMAO: I think I found the first consumer product that will have Akida onboard o_O:cool:

[image attachment]


This is a photo from the NVISO stand at the AI trade show in Japan yesterday!

Notice the robot in the top right? That is Kiki, the robot from Zoetic AI.

My guess is that Zoetic AI is a customer of NVISO; otherwise, why else would they be showcasing it?


[image attachment]


Almost all of the above features are possible with Akida, none more so than real-time onboard intelligence!
"All of Kiki's processes occur on-board, so you can rest easy knowing that your data is secure and private"



Now that I have your attention, the nice bow to tie it all up, and the thing that gives me 90% confidence that Akida is onboard Kiki, is this: "estimated delivery Q3 2022". That lines up with PVDM stating that revenue would show up in the second half of 2022.

Anyone object???

[image attachment]
 
  • Like
  • Fire
  • Wow
Reactions: 97 users

alwaysgreen

Top 20
  • Haha
  • Like
  • Wow
Reactions: 17 users

Terroni2105

Founding Member
Prepare yourselves for a rabbit hole :ROFLMAO: I think I found the first consumer product that will have Akida onboard o_O:cool:

View attachment 6355

This is a photo from the NVISO stand at the AI trade show in Japan yesterday!

Notice the robot in the top right? That is Kiki, the robot from Zoetic AI.

My guess is that Zoetic AI is a customer of NVISO; otherwise, why else would they be showcasing it?


View attachment 6356

Almost all of the above features are possible with Akida, none more so than real-time onboard intelligence!
"All of Kiki's processes occur on-board, so you can rest easy knowing that your data is secure and private"



Now that I have your attention, the nice bow to tie it all up, and the thing that gives me 90% confidence that Akida is onboard Kiki, is this: "estimated delivery Q3 2022". That lines up with PVDM stating that revenue would show up in the second half of 2022.

Anyone object???

View attachment 6357

wow, good sleuthing Bloodsy.

I looked at the YouTube video on their website and found that Kiki is referred to as a 'sidekick'. Wasn't 'sidekick' Rob Telson's word when he talked about finding a sidekick for Ken the robot?


[image attachment]
 
  • Like
  • Fire
  • Love
Reactions: 51 users
When are we expecting an update on AKD2000?
 
  • Like
Reactions: 7 users
Really appreciate the amazing work done by many on this site! Thank you!
Am I missing something, or has the connection between ARM and BRN disappeared from the BRN page?
I believe FF verified with BRN that there was an actual connection, so I'm wondering why it's not visible anymore?
Cheers
 
  • Like
Reactions: 4 users

Moonshot

Regular
Translation and word spotting in an edge device? Akida use case?

 
  • Like
  • Fire
  • Thinking
Reactions: 3 users
ARM are still listed on the partner page.

 
  • Like
  • Fire
  • Love
Reactions: 14 users

alwaysgreen

Top 20
Really appreciate the amazing work done by many on this site! Thank you!
Am I missing something, or has the connection between ARM and BRN disappeared from the BRN page?
I believe FF verified with BRN that there was an actual connection, so I'm wondering why it's not visible anymore?
Cheers
Still there mate

[image attachment]
 
  • Like
  • Fire
  • Love
Reactions: 13 users

Bloodsy

Regular
wow, good sleuthing Bloodsy.

I looked at the YouTube video on their website and found that Kiki is referred to as a 'sidekick'. Wasn't 'sidekick' Rob Telson's word when he talked about finding a sidekick for Ken the robot?


View attachment 6361

Cheers Terroni, yep, I'm pretty sure that was Rob's wording!

That's a great video of Kiki, and knowing that Akida might be onboard makes it so much better!!

Fingers crossed!!
 
  • Like
  • Fire
  • Love
Reactions: 17 users

Deleted member 118

Guest
PS, between you, me and the other 1000 eyes, I think Blind Freddie is a fake. He has 20/20 vision.
Nothing wrong with my eyesight, especially when I drive

 
  • Haha
  • Like
  • Love
Reactions: 36 users
Really appreciate the amazing work done by many on this site! Thank you!
Am I missing something, or has the connection between ARM and BRN disappeared from the BRN page?
I believe FF verified with BRN that there was an actual connection, so I'm wondering why it's not visible anymore?
Cheers

Nope still there under partners.

[image attachment]
 
  • Like
  • Fire
  • Love
Reactions: 12 users
wow, good sleuthing Bloodsy.

I looked at the YouTube video on their website and found that Kiki is referred to as a 'sidekick'. Wasn't 'sidekick' Rob Telson's word when he talked about finding a sidekick for Ken the robot?


View attachment 6361
Just looked: all their people, from the founder down, are software people, so they are using someone else's hardware.

The founders are ex-Google.

Given that Kiki processes multiple senses including touch, sound, motion and vision, all on device and unconnected, and that these are the very capabilities highlighted on Nviso's platform, it seems a very small fairy step of faith to believe this is AKIDA.

If you go back to Nviso's earliest statements about why teaming up with Brainchip was essential to providing the very things Kiki is claimed to do, and how Brainchip's AKIDA was unique in this regard, you are left to ask who it could be if not Brainchip.

My opinion & speculation only so DYOR but well done @Bloodsy
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 43 users

Moonshot

Regular
Translation and word spotting in an edge device? Akida use case?

Google have been researching in this space..

https://medium.com/ai³-theory-practice-business/google-ai-temporal-coding-in-spiking-neural-networks-df8a6c87a049
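The linked piece concerns temporal coding, where information rides on *when* a neuron spikes rather than on its firing rate. A minimal latency-coding sketch (my own illustration, not code from the article or from any Akida SDK): stronger inputs fire earlier, weaker inputs fire later or not at all.

```python
def latency_encode(intensities, t_max=100.0, threshold=0.05):
    """Map input intensities in [0, 1] to spike times in [0, t_max].

    Returns a list of spike times; None means the input never crossed
    threshold, i.e. no spike within the coding window.
    """
    spikes = []
    for x in intensities:
        if x < threshold:
            spikes.append(None)               # too weak to spike at all
        else:
            spikes.append(t_max * (1.0 - x))  # stronger input -> earlier spike
    return spikes

# A bright pixel (0.9) spikes well before a dim one (0.2); a near-zero
# input produces no spike.
times = latency_encode([0.9, 0.2, 0.01])
```

Downstream spiking neurons can then react to the earliest spikes first, which is one reason temporal coding is attractive for low-latency, low-power edge inference.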
 
  • Like
  • Fire
Reactions: 5 users

VictorG

Member
Really appreciate the amazing work done by many on this site! Thank you!
Am I missing something or has the connection between ARM and BRN disappeared from the BRN page?
I believe FF verified with BRN that were was an actual connection so wondering why it’s not visible anymore?
Cheers
BRN Partners.PNG

Your computer obviously has Intel inside
 
  • Haha
  • Like
  • Love
Reactions: 20 users

Deleted member 118

Guest

Panasonic's focus on "prosperity of the mind" and some of the innovative products designed to provide this are emerging from the company's new Aug Lab research and development facility
Panasonic is already a global leader in the field of home appliances and other devices that make our daily lives more comfortable and convenient. A home packed with these is a traditional sign of prosperity, the focus of which has tended to be people's material or financial wealth.
But across many developed countries, there is increasing attention on people's mental or social well-being as an indicator of prosperity - prosperity of the mind, if you like. This trend was already evident prior to COVID; however, the pandemic has served to remind many of the importance of human relations, of community and, for some, of nature.
In response to this trend, Panasonic is focusing development effort and resources in the field of what it calls "augmentation," which, the company says, is a more subtle use of technology to improve people's happiness and sense of well-being. This includes a new generation of robots and robotic devices.
Robots have traditionally been deployed to automate physical processes, for example in manufacturing. More recently they have been utilized to carry out work remotely on behalf of human operators - for example during surgical procedures - or in difficult to access or dangerous environments.
In Japan in particular, which is facing the challenges of a declining population and a shrinking and aging workforce, robots are increasingly being deployed to provide physical assistance and care for the elderly. However, Panasonic recognizes that robots have the potential to do more than just automate processes and replace human beings; they are also able to complement them and provide emotional support.
The company is therefore developing new types of robots that can facilitate communication and provide mental support, or as Panasonic calls it, "self-enrichment." These are being conceived and developed in the company's new "Aug Lab", a virtual research and development facility set up in 2019 which leverages the expertise and imagination of a wide range of people inside and outside Panasonic.
Aug Lab's activities are not limited to engineering research or robotics; they incorporate the input of designers and creators and specialists in other non-engineering fields. The aim is to explore fresh perspectives, such as "What is it that provides a sense of well-being?" and "What makes people's minds tick?"
An example of an early development by Panasonic in the field of self-enrichment is the human companion robot "Nicobo", a cuddly pet that communicates with its owner in various verbal and non-verbal ways.*
Among the prototype products that the Aug Lab is currently working on in this field is a set of three small, cute robots designed to deepen the connection between babies or small children and their parents. These robots sing and make sounds, interacting with and delighting children and babies in a relaxed and non-intrusive manner, capturing their smiles and daily movements, particularly when their parents are not in the room.
Photo: A set of new companion robots
All parents want to capture pictures of their children growing up. However, not all of them have a camera or smartphone to hand at all times, and many might not be comfortable installing a camera in a child's room. In the case of these robots, the camera comes disguised as a friend, one that makes its young subjects smile or laugh. A typical situation is one where the robots snap a photo of a child in an unguarded moment and, unprompted, sends it to the parent's smartphone.
Photo: Takeshi Ando, Director of Aug Lab, Panasonic Corporation
Explains Takeshi Ando, Director of the Aug Lab, "When people think about the use of robots for communication, they tend to focus on communication between humans and robots. But what we are trying to do with baby papa is create opportunities for communication between humans; in this case, between parents and their children, with the aim of deepening their relationship."
Another new development coming out of the Aug Lab is UMOZ - a miniature robot inspired by humble green moss. Moss grows extensively in Japan, with its humid and sub-tropical climate, often carpeting gardens, temple grounds and forests. There are more than 1,700 varieties, each with its own characteristics and environmental preferences.
Photo: Examples of UMOZ robots
Resembling a hermit crab and containing real moss, the UMOZ robot has built-in optical and humidity sensors. These perceive the intensity and direction of light and the degree of humidity in its immediate environment, and the UMOZ moves in response.
Some are programmed to avoid light and will move away if their owner introduces a light source. Others are programmed to seek out light and will move closer in the same situation. Some seek moisture while others avoid it.
Explains Ando, "Most of us think of moss as something inanimate that you cannot interact with, but actually it's a living thing which adapts itself to its surrounding environment. The miniature UMOZ robots mimic this relationship, or dependency. Their behavior algorithms are also programmed differently for each individual. Some will move towards sources of light or moisture; others will move away from these. The aim is to stimulate their owners' perception of their surroundings and their awareness of nature." He adds, "I think that well-being is not only about the relationships people have with each other; it's also about how people can live together in harmony with nature."
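The per-individual behaviour Ando describes can be sketched as a simple stimulus-following controller (entirely my own illustration; the names, values, and control scheme are hypothetical, not Panasonic's):

```python
def umoz_step(light_direction, humidity_direction, prefers_light, prefers_moisture):
    """Return a movement vector (x, y) for one control step.

    light_direction / humidity_direction: (x, y) vectors pointing toward
    the stimulus as perceived by the sensors. The per-individual preference
    flags flip approach into avoidance, mimicking the differently
    programmed behaviour algorithms the article mentions.
    """
    lx, ly = light_direction
    hx, hy = humidity_direction
    light_sign = 1.0 if prefers_light else -1.0
    moisture_sign = 1.0 if prefers_moisture else -1.0
    return (light_sign * lx + moisture_sign * hx,
            light_sign * ly + moisture_sign * hy)

# A light-avoiding, moisture-seeking individual moves away from a lamp
# on its right (+x) and toward damp ground ahead (+y).
move = umoz_step((1.0, 0.0), (0.0, 1.0),
                 prefers_light=False, prefers_moisture=True)
# move == (-1.0, 1.0)
```

Flipping the two preference flags per individual is enough to produce the four distinct "personalities" (light-seeking or light-avoiding, moisture-seeking or moisture-avoiding) described in the article.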
Photo: Takeshi Ando, who champions the enhancement of well-being through augmentation
As the Aug Lab continues its work, Panasonic is seeking new research partners to help accelerate its future innovation, and the company is looking forward to announcing new developments soon.
Concludes Ando, "Data concerning life satisfaction is collected all over the world, and if you look at the trend, it is basically flat. Although we have contributed to economic development, we have not really been focused on contributing to people's happiness. New technologies are providing us with a means to do this."
 
  • Like
  • Love
  • Fire
Reactions: 20 users

Deleted member 118

Guest


Service Robotics Market Is Likely to Experience Tremendous Growth in the Near Future | Panasonic Corporation, Robert Bosch GmbH, Samsung Electronics Co., Ltd.



By Report Ocean
2022/04/07 05:28

The global service robotics market size was US$ 26.51 billion in 2021. The global service robotics market size is forecast to reach US$ 163.7 billion by 2030, growing at a compound annual growth rate (CAGR) of 22.2% during the forecast period from 2022 to 2030.


Service robots are semi-automated or fully automated robots that perform tasks that are beneficial to humans, including potentially hazardous ones. Humans benefit from these robots in a variety of fields, including medical and healthcare, construction, automation, household, and entertainment. These robots are controlled by an internal control system, with the option to override operation manually. Service robots remove the possibility of human error, save time, and increase production by lowering staff and labor workload. Because of benefits like the delivery of accurate and high-quality services, reduced operational expenses and human errors, and enhanced usability and dependability, service robotics has achieved widespread adoption in a number of professional and personal applications.

Factors Influencing Market Growth


  • Growing awareness about the advantages of robots, a surge in R&D investments, and rising demand for automation in the personal & professional sectors are forecast to drive the global market growth.
  • Rising prices of raw materials and semiconductor chips may slow down overall market growth.
  • Swarm robotics is a system that helps in coordinating with multiple robots as a system. Swarm robotics has several benefits, including scalability, flexibility, and robustness. Thus, the rising adoption of swarm intelligence technology, allowing robots to accomplish various complex tasks with comfort, is forecast to offer lucrative opportunities for the global market during the forecast period.
Impact Analysis of COVID-19


The COVID-19 pandemic had a minimal impact on global market growth. Robotics helped the healthcare industry deal with major workforce shortages in healthcare, manufacturing, and supply chains. In addition, robotics supported social distancing as well as diagnosis and treatment. Service robotics is critical in the healthcare industry: robots reduce human intervention at every stage of the process, from patient evaluation to patient care and drug distribution.

Insights

Europe dominated the market in 2021 and is forecast to remain dominant during the forecast period. Rising government investment in the R&D sector and the growing need for service robots from the retail, medical, defense, and logistics sectors fuel robotics R&D activity across the region, boosting the global market.

The Asia Pacific region is forecast to witness lucrative growth during the forecast period, driven by the region's rapidly developing economies. In addition, countries in the region are adopting robotics solutions thanks to technological advances and the advent of new enterprise models, such as the growing e-commerce and retail sectors.

Leading Competitors

The leading prominent companies profiled in the global service robotics market are:


  • iRobot Corporation
  • Intuitive Surgical Incorporated
  • Honda Motor Co., Limited
  • Panasonic Corporation
  • Aethon Incorporated
  • Yujin Robot Co. Limited
  • Samsung Electronics Co., Ltd.
  • DeLaval
  • Robert Bosch GmbH
  • AB Electrolux
  • Other Prominent Players
Scope of the Report

The global service robotics market segmentation focuses on Application, Type, and Region.

Segmentation based on Application

  • Healthcare
  • Defense
  • Field
  • Logistics
  • Construction
  • Domestic
  • Entertainment
  • Others
Segmentation based on Type

  • Professional service robotics
  • Personal service robotics
 
  • Like
  • Fire
  • Love
Reactions: 21 users
Google have been researching in this space..

https://medium.com/ai³-theory-practice-business/google-ai-temporal-coding-in-spiking-neural-networks-df8a6c87a049
Hi @Moonshot
I just found the following news release from Google, which is ten hours old. They are clearly working in this area, but Kiki seems much more advanced than the point Google is proudly announcing it has reached. Google is also careful not to say it is a connected device, instead talking about it being secure because you have to opt in to each of the features, which do not include touch or vibration. They say video is not shared, but do not mention that voice commands are still processed off device (by not saying they are handled on device, as they do for video):

"Currently, Google home devices require you to activate the assistant by saying ‘Hey Google’ or ‘OK Google’. But soon, you will be able to just… stare at your device and the Google Assistant will start listening to you.

This creepy new feature is called ‘Look and Talk’ and it’s first rolling out to Google’s Nest Hub Max.

Look and Talk​

It was announced during Google I/O this morning and touted by the search giant as “making it much easier for users to initiate a conversation with a Google Assistant”.




From a user perspective, there’s not much more to it: Look and Talk allows you to communicate with the assistant by looking and talking. Google reckons this is a lot more natural, because IRL when you want to strike up a conversation with someone, you simply look at them and start talking.

“Achieving this was no easy task, we needed to build an assistant that could distinguish things like intentional eye contact and simply a passing glance,” Google Assistant product manager Jaclyn Konzelmann said during a press briefing.

“This required six machine learning models, to process over 100 signals from proximity to height orientation and gaze direction, all in real time.”

If you’re thinking, What if I accidentally glance at my Google device while I’m having a very private conversation? Well, in addition to the constant reassurance that our devices aren’t listening to us, Google reckons Look and Talk is wrapped in the Nest Hub Max’s “camera sensing safeguards”.

Look and Talk uses face match and voice match to recognise that it's you. It also takes into consideration your proximity, head orientation and gaze direction. Google says the video interaction is processed completely on-device (which means it isn't shared with Google or any third parties), and the feature is also opt-in (you have to opt in to each part, which might seem tedious, but privacy, people).
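Google describes fusing face match, voice match, proximity, head orientation and gaze direction before the assistant starts listening. A toy sketch of that kind of multi-signal gate (entirely my own illustration; Google's actual models, signals, and thresholds are not public):

```python
def should_listen(face_matched, voice_matched, proximity_m, gaze_score):
    """Toy activation gate: only listen when an enrolled, opted-in user is
    close enough and clearly looking at the device.

    gaze_score: 0..1 confidence that this is intentional eye contact
    rather than a passing glance (the distinction Google highlights).
    """
    if not (face_matched and voice_matched):
        return False              # must be a recognised, opted-in user
    if proximity_m > 1.5:
        return False              # too far away to be addressing the device
    return gaze_score >= 0.8      # sustained, intentional gaze only
```

In the real system each of these booleans and scores would come from its own ML model running on-device; the point of the sketch is just that activation is a conjunction of several independent signals, which is what keeps a passing glance from waking the assistant.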

Look and Talk starts rolling out on Android this week, with iOS to follow ‘soon’.

Quick Phrases

Already available on Pixel phones, ‘Quick Phrases’ is also heading to Google’s smart home devices.

If you aren’t familiar, this feature allows you to skip that whole ‘Hey Google’ thing for common daily tasks. The idea behind Quick Phrases is straightforward: allowing people to respond to common situations with short voice commands while skipping wake words entirely.

You will be able to choose which Quick Phrases you want available from a list Google provides, including setting timers, turning lights on and asking for the weather. It will also be able to handle the different ways people ask for the weather, for example (it’s also voice-matched to each person in your household).

Powerful speech and language models

From early 2023, the Google Assistant will be able to start having more ‘natural’ conversations with you, with the ability to understand the nuances of your voice (such as if you pause mid-sentence) and not chime in with what it thinks you were searching for. Konzelmann said this mimics more ‘human’ conversations.

Answering questions on Look and Talk, Konzelmann was asked if the idea was to eliminate the need for ‘Hey Google’ altogether. She said no, rather it’s to expand the ways in which you can start talking to the assistant that mirror real-life.

“I really like to think of ‘Hey Google’ as being similar to the assistant’s name … similar to how all of us have names and they’re very useful, sometimes people call my name before they want to talk to me, but other times they don’t. They might simply look towards me, they might tap me on the shoulder, or if it’s just the two of us sitting in a room, they might simply ask their question,” she said.

So what you’re saying is next my Google Nest Hub Max will reach out and tap me on the shoulder if it wants my attention?
"

My opinion only but it would seem unlikely that Kiki is powered by Google chips.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Wow
Reactions: 18 users

SERA2g

Founding Member
Just looked: all their people, from the founder down, are software people, so they are using someone else's hardware.

The founders are ex-Google.

Given that Kiki processes multiple senses including touch, sound, motion and vision, all on device and unconnected, and that these are the very capabilities highlighted on Nviso's platform, it seems a very small fairy step of faith to believe this is AKIDA.

If you go back to Nviso's earliest statements about why teaming up with Brainchip was essential to providing the very things Kiki is claimed to do, and how Brainchip's AKIDA was unique in this regard, you are left to ask who it could be if not Brainchip.

My opinion & speculation only so DYOR but well done @Bloodsy
FF

AKIDA BALLISTA
Yep. The BRN / Nviso partnership statement also says that Nviso’s initial projects will relate to “social robots” and autonomous vehicles.

Kiki (Zoetic AI/Nviso) together with Nicobo (Panasonic/Nviso) are two social robots that Nviso have assisted with developing.

I think we can take a step even shorter than a fairy's and still arrive at the same outcome - Akida.
 
  • Like
  • Fire
  • Love
Reactions: 46 users
  • Like
Reactions: 3 users
  • Love
  • Like
Reactions: 2 users