BRN Discussion Ongoing

Taproot

Regular
Installation of Electric Harnesses (Aircraft Panel)
Diota (June 2021)
 
  • Like
  • Fire
Reactions: 4 users

We know Safran is linked to Brainchip:

Safran Electrical & Power improves efficiency of cable assembly in Airbus A350’s structures​

In Customer Story · Posted June 15, 2020

How do you accelerate, and make reliable, the assembly of the tens of kilometers of cables that run through the fuselage of an airliner such as the Airbus A350 XWB? For Safran Electrical & Power, the answer lies in implementing Diota’s solution.

Engaged alongside Airbus in the production of its latest widebody, Safran Electrical & Power is responsible for producing 75% of the electrical harnesses that equip the fuselage of the A350 XWB. At its plant in Temara, Morocco, more than 400 people work on manufacturing this complex wiring, which supplies all the electrical equipment in the cabin. The volumes are significant: since the delivery of the first harness for the A350 in 2012, the equipment manufacturer has already produced more than 100,000 of them, which must then be assembled accurately within the fuselage of the A350.

Assembling electrical harnesses is a complex operation that leaves no room for chance. Cable lengths are calculated as close as possible to the actual distances provided by engineering, and each harness must follow a precise path along the structural elements of the aircraft. The assembly is traditionally done by hand with the help of a technical drawing of the aircraft.

From the paper plan to the digital model​

The technician must therefore first find his way among the different sections of the fuselage, then identify each of the crossing points before laying his harnesses. The process is all the more tedious as it varies from one aircraft to another, depending on the cabin configuration selected by the end customer. It tolerates no error, either in the connection of the harnesses or in the routing of the cables.

In this context, Safran wanted to offer its technicians a tool adapted to their needs, capable of improving the efficiency of assembly while also allowing each operator to intuitively check the conformity of his assembly. Diota responded to this dual objective by adapting its digitally assisted operator solution, built around DiotaPlayer, to the operational constraints of harness assembly.

It allows Safran teams to visualize the exact routing of each harness in augmented reality via a tablet. Rather than following a theoretical route on a paper plan, the operator can constantly compare his assembly with what is expected by the digital model of the aircraft. The tracking system developed by Diota provides the level of reliability needed to accurately determine the path a cable must take and which connections to make. Tracking is effective even in the least well-lit sections.

A solution integrated into a simple tablet

To successfully deploy augmented reality in an environment as constrained as the fuselage of an aircraft, Diota developed a dedicated device. Equipped with a black-and-white camera to optimize contrast detection, it is paired with an inertial unit that complements the visual tracking module, allowing it to be very precise across large volumes. The whole system has been integrated into a simple tablet so as not to hinder the technician’s movements and to ease access to tight spaces. Finally, the solution offers an accessible interface to ease adoption by operators who are not necessarily used to augmented reality.
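The article does not say how Diota actually fuses the camera and the inertial unit, but a minimal sketch of the general idea (blending low-rate visual pose fixes with high-rate IMU dead-reckoning via a complementary filter) is shown below. Every name and number in it is an illustrative assumption, not Diota's implementation.

```python
# Minimal sketch: fuse high-rate gyro yaw with occasional camera-derived yaw
# via a complementary filter. Purely illustrative; not Diota's algorithm.

def complementary_filter(yaw, gyro_rate, dt, camera_yaw=None, alpha=0.98):
    """Propagate yaw with the gyro; correct it when a camera fix is available.

    yaw        -- current yaw estimate in radians
    gyro_rate  -- angular rate from the inertial unit (rad/s)
    dt         -- time step in seconds
    camera_yaw -- absolute yaw from visual tracking, or None if no fix
    alpha      -- trust placed in the gyro propagation (0..1)
    """
    predicted = yaw + gyro_rate * dt           # dead-reckoning from the IMU
    if camera_yaw is None:
        return predicted                       # no visual fix this step
    return alpha * predicted + (1.0 - alpha) * camera_yaw


# Toy usage: gyro at 100 Hz, camera fix every 10th sample.
yaw = 0.0
for step in range(100):
    cam = 0.05 * step * 0.01 if step % 10 == 0 else None  # fake camera yaw
    yaw = complementary_filter(yaw, gyro_rate=0.05, dt=0.01, camera_yaw=cam)
print(f"fused yaw estimate: {yaw:.4f} rad")
```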

The results obtained in the production stages now point to new outlets for the solution developed by Diota, particularly in the maintenance phases.
 
  • Like
  • Fire
Reactions: 17 users


Consider the additional features???:​

Deep Learning & Augmented Reality​

In News · Posted May 14, 2021
Diota Deep Learning option for AR tracking

Diota accelerates the deployment of Augmented Reality solutions with the integration of a Deep Learning option that allows automatic initialization of the tracking model and is more robust to changes in the environment

Initialization of the tracking model is a required step before augmented reality can be used to display work instructions in the field. Different methods are available to perform this initialization. Standard learning techniques make it possible to set up a reference model that provides rapid and efficient initialization. During deployment, however, when systems must be operational in varying conditions (day, night, natural lighting, artificial lighting, or across numerous workstations with different assembly configurations and changing backgrounds), standard learning procedures must be enriched to take these variations into account.

To meet these challenges, a novel approach has been integrated into our Diota 4X solution, using deep learning technology to ensure easy initialization of tracking and robustness to changing environments.
The Deep Learning option offers an alternative to our standard learning approach, providing even more flexibility when deploying use cases in factories. The Deep Learning approach allows a learning base to be created from a small volume of data, using a few video recordings in combination with the digital mockup, to offer a generic initialization method that adapts to a broad variety of conditions. The customer therefore spends significantly less time on initialization and saves time on deployments, while keeping flexibility in the choice of workstations.

The Deep Learning option is also particularly relevant if you have already implemented our solution and want to quickly replicate and deploy the same use case at another industrial site with a different work environment.
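Diota does not publish how the learning base is built, but as a purely hypothetical sketch of the idea described above (a small network trained on a handful of real video frames mixed with synthetic renders of the digital mockup to regress an initial pose), something like the following could be assembled. The network, data shapes and training loop are stand-ins for illustration only.

```python
# Hypothetical sketch only: train a tiny CNN to regress an initial 6-DoF pose
# from images, mixing a few real video frames with synthetic mockup renders.
import torch
import torch.nn as nn

class PoseInitNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 6)  # x, y, z, roll, pitch, yaw

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

# Stand-ins for the real data: a few annotated video frames plus many
# synthetic renders generated from the digital mockup under varied lighting.
real_frames = torch.rand(8, 1, 128, 128)
real_poses = torch.rand(8, 6)
synthetic_frames = torch.rand(256, 1, 128, 128)
synthetic_poses = torch.rand(256, 6)

images = torch.cat([real_frames, synthetic_frames])
poses = torch.cat([real_poses, synthetic_poses])

model = PoseInitNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(images), poses)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```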
 
  • Like
  • Fire
Reactions: 10 users

Hi @Diogenese.
We need your patent research skills to see how they achieve this on-device training from a few videos.

Many thanks in advance.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
Reactions: 5 users
To celebrate the French election result which means continued support for Ukraine I just had to translate and post a French article:


BrainChip and the Swiss nViso team up around human behavioural analysis in the automotive industry

Posted on 22-04-2022 by Pierrick Arlot


BrainChip, which has developed a neuromorphic processor for the network edge under the name Akida, has entered into a collaboration with the Swiss firm nViso, which specialises in human behavioural analysis reinforced by artificial intelligence (AI). This partnership, which focuses on battery-powered equipment in the automotive, new-mobility and robotics markets, aims to meet the needs of applications that require high AI performance while relying on ultra-low-energy technologies.

The two partners will initially focus on the implementation of nViso AI solutions for social robots and automotive cabin monitoring systems on Akida processors.

Created in 2009 and based in the innovation park of the École Polytechnique Fédérale de Lausanne (EPFL), nViso says it provides robust embedded software solutions capable of detecting and understanding human behaviour (facial expressions, emotions, head positions, gaze, gestures, etc.) and of generating actions accordingly, in real environments deployed at the far edge of the network (deep edge).

These solutions are based on real-time perception and observation of people and objects in contextual situations, combined with a semantics of human behaviour grounded in proven scientific research, with the objective of helping autonomous machines understand human behaviour and reasoning so that they are safe, secure and adapted to the people in their environment.

In this context, BrainChip brings, with its Akida processors, the ability to process artificial intelligence (AI) and machine learning (ML) algorithms locally, with levels of performance and power consumption that other approaches cannot achieve (read our articles here and here).

"NViso's AI-enhanced human behaviour analysis systems offer fascinating possibilities in homes, cars, buildings, hospitals, and we will support these functions with the performance and energy efficiency of our chips," says Sean Hehir, CEO of BrainChip. It is not only a collaboration between two companies, it is about advancing the state of the art in the field of AI with platforms for edge equipment capable of interpreting human behaviour, in order to improve product performance and user experience. ”

You can also follow our news on L'Embarqué's LinkedIn showcase dedicated to embedded artificial intelligence: Embedded-IA
 
  • Like
  • Fire
Reactions: 21 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Who is BrainChip's customer(s) - Mercedes, LG, Cerence or all of them????...

This morning I've been delving into LG's newest premium in-vehicle infotainment (IVI) system. Back in 2020 Cerence signed a memorandum of understanding with LG Electronics to develop software that integrates the infotainment system with Cerence’s AI Reference Kit (ARK).

Here's an excerpt from an article about the memorandum which talks about their aim to offer "a wider range of AI-powered experiences for both manufacturers and auto customers".



Extract Only
The companies say they aim to create and advance a more convenient in-car experience through voice commands, allowing drivers and passengers to have far more control over car functions, navigation and multimedia content.

The partnership is designed to deliver more efficient management of resources to integrate Cerence ARK with webOS Auto, giving OEMs and tier-one manufacturers a pre-packaged, full-stack IVI software system that will shorten the time to market, bring an immersive voice experience and rich content ecosystem to connected cars and improve the user experience inside the vehicle.

“We look forward to this collaboration with Cerence to develop a turnkey voice solution for today’s auto and component makers to accelerate the arrival of the connected car,” said IP Park, president and CTO of LG Electronics. “We will continue to evolve webOS Auto by offering a wider range of AI-powered experiences for both manufacturers and auto customers.”


At the Mercedes EQXX reveal, Mercedes chief technology officer Markus Schäfer noted that the concept was a running and driving prototype which uses "a version of the existing MBUX infotainment system". I'm going to assume that by "a version" Markus means the one now being powered by AKIDA.

So, wouldn't Mercedes need to get permission from LG/Cerence to embed AKIDA in the existing infotainment system, or would it be more likely that LG and Cerence were customers already?

😵‍💫


 
  • Like
  • Fire
  • Thinking
Reactions: 21 users
I think these guys are on to something here. Someone needs to step up with a solution; ideally it will be ultra low powered (at least 5 to 10 times more so) and capable of functioning regardless of any connection issues. It would also be cool if it could learn new classes at the edge:

Edge Applications Bring New App Performance Challenges​

by TutsHost
3 min read
Edge Applications Bring New App Performance Challenges



Modern monitoring solutions provide visibility into the inner workings of complex, advanced applications and help employees make data-driven decisions when investigating latency issues.

Application performance requirements are quite strict these days. Yet the already demanding service-level expectations for web and mobile apps look lenient next to the performance constraints imposed by edge apps. This is why companies need modern monitoring solutions to better understand and eliminate performance issues in such applications.
Let's put the issue into perspective. The expectations of today's employees, customers and other users are very demanding. They have no tolerance for delays: 40 percent of users will abandon a website that takes more than three seconds to load, and 53 percent of users will abandon a mobile app that fails to load in three seconds.
When it comes to the edge, the tolerances are tighter. IoT devices and applications typically require latencies of no more than 50 ms, and sometimes as low as 10 ms.
Even worse, the consequences can be far more serious if an edge application suffers even a slight degradation in performance.
Take a common scenario. An online user is about to make a purchase, but when they go to pay, the transaction stalls. The problem may be that a third-party payment-processing gateway is performing poorly. If the customer gets frustrated, they will abandon the purchase. The company may lose not only that sale; if the customer finds the same item on another site, it may lose their business forever.
Such is the impact of a poorly performing application. But the problem may be life-threatening if a delay occurs in an edge application. Just imagine if the pedestrian detection system of a self-driving car malfunctioned.

With edge applications, toInattention is the devil​

Many new technologies, such as connected vehicles, AR/VR, and industrial automation, are imposing new requirements on latency. Many applications require single-digit millisecond latency. Complicating matters is the fact that data traversing multiple networks between a data center and an edge device can take tens of milliseconds or more.
There are several ways to reduce latency. One of the most important things is to properly design an edge app. For example, one could use a hub-and-spoke model with latency-sensitive components at the edge.
Another is to gain insight into how well the edge application is performing. However, with the complexity of distributed applications, it can be difficult to access real-time telemetry, which hinders troubleshooting and slows root-cause analysis.
What we need is a modern observability solution that provides insight into the workings of complex edge applications. Such a solution can help employees make data-driven decisions and reduce the time required to investigate operational issues.
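The article stays at the level of principles. As a minimal, generic sketch of the latency-budget idea it describes (not tied to any monitoring product), an edge component could time each operation and flag anything that blows the 10–50 ms budgets quoted above; the function names below are invented for illustration.

```python
# Minimal, generic sketch: time each operation and flag budget violations.
# The 50 ms / 10 ms figures mirror the budgets mentioned in the article;
# the function names are illustrative, not from any specific product.
import time

LATENCY_BUDGET_S = 0.050   # 50 ms typical edge budget
STRICT_BUDGET_S = 0.010    # 10 ms for the most latency-sensitive paths

def timed(operation, *args, strict=False, **kwargs):
    """Run an operation, measure wall-clock latency and report overruns."""
    budget = STRICT_BUDGET_S if strict else LATENCY_BUDGET_S
    start = time.perf_counter()
    result = operation(*args, **kwargs)
    elapsed = time.perf_counter() - start
    if elapsed > budget:
        print(f"WARNING: {operation.__name__} took {elapsed*1000:.1f} ms "
              f"(budget {budget*1000:.0f} ms)")
    return result

# Toy usage: a stand-in for a latency-sensitive edge task.
def detect_pedestrian(frame):
    time.sleep(0.02)       # simulate 20 ms of inference work
    return "clear"

timed(detect_pedestrian, frame=None, strict=True)   # exceeds the 10 ms budget
timed(detect_pedestrian, frame=None)                # within the 50 ms budget
```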
 
  • Like
  • Fire
  • Haha
Reactions: 8 users

Bravo

If ARM was an arm, BRN would be its biceps💪!


Hi FF,

I just got stuck at the part where it said "toInattention".🥴
 
  • Like
  • Haha
Reactions: 4 users

Taproot

Regular

Yes,
I have noticed that the Safran + Diota relationship continues to evolve, but everything is very quiet as far as Brainchip is concerned?
From 2017
It was developed by the Diota company in Massy-Palaiseau, in which Safran Corporate Ventures took a stake in 2016. The system also uses an image-processing application, produced by Spikenet Technology (BrainChip) in Toulouse, to check that the cable is correctly plugged in. "We want to validate the system by the summer in order to deploy an industrial solution in 2018 at all the wiring sites that make complicated plug-ins," says Valentin Safir.


Safran as a customer is an absolute octopus for Brainchip. Just wish there was something a little more concrete to chew on.
Any current supporting linkages between Diota and Brainchip would do the trick, given Safran have a financial interest etc.
 
  • Like
  • Fire
  • Love
Reactions: 8 users
I wait patiently for @Diogenese to emerge from the barrel on the mount to hand down his patent wisdom to the assembled masses.

@Diogenese is the one true hope.

FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 12 users
Hopefully Kevin is chewing his Marvell colleagues' ears off about his old employer BrainChip, whom he still seems to rate.

(seven image attachments)
 
  • Like
  • Love
  • Fire
Reactions: 29 users

Veritone Launches Veriverse, a Portfolio of Integrated AI Solutions for Content IP Owners and Individuals Leveraging the Metaverse, NFTs and Blockchain​

AI/ML | Analytics
By Business Wire On Apr 19, 2022


Veriverse to empower customers, rights holders and influencers with the ability to leverage metaverse-ready solutions to create new revenue channels at scale

Veritone, Inc., creator of aiWARE™, a hyper-expansive enterprise AI platform, today announced the launch of a series of proven AI solutions to open up new distribution, commerce, and revenue opportunities for today’s content IP owners and individuals across multiple digital communities, including the metaverse. Veriverse™ offers a path forward in both traditional media channels and immersive environments to securely create and activate synthetic media, protect and manage identity and assets, maintain brand continuity across channels and safely create and sell NFTs.
Citi is the latest banking giant to call the metaverse and Web3 a multi-trillion dollar opportunity. In its March 2022 report, “Metaverse and Money: Decrypting the Future,” Citi said the metaverse economy could be an $8 to $13 trillion total addressable market by 2030. In research published in December 2021, Goldman Sachs put a $12.5 trillion number on the space, in a bullish outlook that assumed one-third of the digital economy shifts into virtual worlds and then expands by 25 percent. Furthermore, a Gartner® report states, “the metaverse will provide economic opportunities using new kinds of digital business assets and value exchange models. Executive leaders should augment their digital transformation strategy by exploring virtual world product development, brand placement, customer engagement, and financial flows.” (Gartner, “How to Increase Customer Engagement and Drive Revenue in the Metaverse”, David Furlonger, Christophe Uzureau, Rajesh Kandaswamy, Feb 8, 2022).
With specific solutions that operate in traditional channels and the metaverse, Veriverse includes:
  • Veritone Voice
  • Veritone Avatar
  • Veritone NFT
  • Veritone Verify
  • Veritone Metaverse Migration Services
“Disparate metaverses already exist, signaling the emergence of Web3, an extremely fast-growing yet decentralized and disruptive opportunity, and content IP owners and influencers need to be fully enabled to make that transition without sacrificing their brand security, personality and attributes. Veritone is a trusted partner of the largest rights holders in the world and has unmatched expertise in artificial intelligence––including synthetic media, media workflows, advertising and content licensing. We are well positioned to help our current and future customers make the move to, and excel within, the metaverse,” said Ryan Steelberg, President and co-founder of Veritone.
Veritone Voice™ is an AI-enabled custom synthetic voice cloning solution that allows users to securely and ethically create and monetize synthetic voices in real or digitally immersive worlds by acquiring the necessary training data to create a custom voice that can be transformed into different languages, dialects, accents, genders and more.
“Veritone Voice will streamline the way we work with talent in film and TV production, while still creating authentic experiences for audiences,” said Amani Martin, Emmy award-winning director and producer and Managing Partner of Vin Scully Digital.
With Veritone Avatar™, users can create avatars for a broad range of use cases, including advertising, recruiting, patient care, eLearning and customer service.
“What audiences crave in these metaverse experiences is, ironically, human connection,” says Beau Romero, YouTube Influencer, Us Always member. “Bringing such realistic and lifelike avatars to immersive experiences is something I am very passionate about for the next evolution of my brand.”
Veritone NFT™ is a smart contract-enabled NFT minting and marketplace solution for audio, images, and video, including a proprietary minting service exclusive for Veritone clients. The Veritone NFT Marketplace enables approved users to create, buy and sell NFTs.
Veritone Verify™ is a powerful digital security system that provides content owners a way to manage and protect their identity and IP in both traditional channels and digitally immersive worlds. The solution also enables brand continuity and ensures their assets will be licensed and monetized within a centralized ecosystem.
“Individuals who create and own media have a critical need for cohesive solutions that combine content management, IP protection and monetization across real and digital immersive worlds,” said Brett Kollmann, YouTube Influencer, Creator and Host of The Film Room. “I’m most excited about ensuring brand continuity while protecting my identity and assets as we create and share content in different ways, including in the metaverse.”
With Veritone Metaverse Migration Services, our industry experts guide companies of all sizes to assess their readiness, map a plan to enter the metaverse, ensure proper licensing of owned content in a metaverse world, establish activation for virtual shopping experiences, VIP events, exclusive engagements, product promotions, or new digital NFTs, and test and scale their migration.
“We are active users of the Veritone Digital Media Hub solution to manage our deep bench of 80+ years of video assets on and off the field,” said Paul Hodges, VP of content and entertainment, San Francisco Giants Productions. “When we partnered with Veritone back in 2020, we liked that the AI solution was future-proofed by being cloud and model agnostic. Today, we are seeing a path forward to engaging our fans in the metaverse because of our legacy investment and partnership with Veritone.”

Some interesting language used that raises a possibility of a continuing relationship.

My opinion and speculation only so DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Wow
Reactions: 18 users
  • Haha
  • Like
  • Sad
Reactions: 10 users

TheFunkMachine

seeds have the potential to become trees.
Careful not to bump your head. Read the Mercedes thread WHERE Brainchip, Mercedes and Nvidia are working side by side, and then add the Rob Telson interview from a few weeks ago, where he was asked about competing with Nvidia and said they saw Nvidia as more of a partner moving forward.

My opinion only DYOR
FF

AKIDA BALLISTA
What the heck! That is huge! I will have to do some more reading on that. Thank you:)
 
  • Like
  • Love
  • Fire
Reactions: 9 users

TheFunkMachine

seeds have the potential to become trees.
https://venturebeat-com.cdn.ampproj...ner-to-bring-ai-and-ml-to-edge-computing/amp/

I think this article has been shared before, but I read something today that I didn't really think too much of last time. Maybe someone else has an idea of what is meant here.

In the interview, SiFive is asked whether Brainchip has any Arm IP in their chip. He goes on to say that Arm IP has been mentioned by Brainchip.

I wasn't aware that they used any ARM IP in Akida. It's another connection between Brainchip and ARM?

I thought they used Arm Core of some sort but maybe that’s me not understanding it properly.

And I also find the engagement between Brainchip and SiFive interesting when you consider that Arm is supposed to be SiFive's biggest competitor, and all the ex-Arm employees jumping ship.

Is Brainchip the key to taking on Arm?
 

  • Like
Reactions: 14 users
The AKD1000 chip uses an Arm Cortex-M4 processor for the peripherals. It has been stated on many occasions that AKIDA technology is processor agnostic, and the clearest statement of how this works, in my opinion, is found in Anil Mankar's presentation to the 2021 AI Field Day. Now, bearing in mind that I am not a techie but a lawyer, I would think that if you were using AKIDA IP with SiFive's RISC-V, the RISC-V would perform the function of the Cortex-M4 and you would not need any additional architecture, as the IP is the neural network.

If you were putting SiFive's RISC-V next to an AKD1000, the AKD1000 would still be using the Cortex-M4, because it is on the chip as part of the design fabric.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
Reactions: 16 users

Taproot

Regular

I was literally about to post the same article.
The whole AR / VR / Web3 space is something I'm really excited about. The amount of money pouring into these applications over the next 10 years will be mind-blowing.

Citi is the latest banking giant to call the metaverse and Web3 a multi-trillion dollar opportunity. In its March 2022 report, “Metaverse and Money: Decrypting the Future,” Citi said the metaverse economy could be an $8 to $13 trillion total addressable market by 2030. In research published in December 2021, Goldman Sachs put a $12.5 trillion number on the space, in a bullish outlook that assumed one-third of the digital economy shifts into virtual worlds and then expands by 25 percent. Furthermore, a Gartner® report states, “the metaverse will provide economic opportunities using new kinds of digital business assets and value exchange models. Executive leaders should augment their digital transformation strategy by exploring virtual world product development, brand placement, customer engagement, and financial flows.”

If Brainchip can pick up 1% of total spending in this space, that's worth 130 billion dollars!!!
Even if Brainchip makes a complete dog's breakfast of all their engagements and only manages to pick up 0.1% of the total spend, it's still worth 13 billion dollars.
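For what it's worth, here is the back-of-the-envelope arithmetic behind those figures, using the USD 13 trillion upper end of Citi's 2030 estimate quoted above:

```python
# Back-of-the-envelope check of the market-share figures above.
metaverse_tam = 13_000_000_000_000   # USD, upper end of Citi's 2030 estimate

for share in (0.01, 0.001):          # 1% and 0.1% of total spend
    revenue = metaverse_tam * share
    print(f"{share:.2%} of TAM = ${revenue / 1e9:,.0f} billion")
# 1.00% of TAM = $130 billion
# 0.10% of TAM = $13 billion
```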
 
  • Like
  • Fire
  • Love
Reactions: 30 users

Taproot

Regular
This is where it's all heading,
 
  • Like
  • Fire
  • Love
Reactions: 13 users