BRN Discussion Ongoing

Tothemoon24

Top 20

AI 'brain' created from core materials for OLED TVs​

March 24, 2023
Structure of the proposed AI synaptic device. Two oxide semiconductor transistors are connected; one for writing and the other for reading. Credit: POSTECH
ChatGPT's impact extends beyond the education sector and is causing significant changes in other areas. The AI language model is recognized for its ability to perform various tasks, including paper writing, translation, coding, and more, all through question-and-answer-based interactions.



The AI system relies on deep learning, which requires extensive training to minimize errors and therefore frequent data transfers between memory and processors. However, the von Neumann architecture of traditional digital computers separates the storage and computation of information, increasing power consumption and causing significant delays in AI computations. Researchers have been developing semiconductor technologies suited to AI applications to address this challenge.

A research team at POSTECH, led by Professor Yoonyoung Chung (Department of Electrical Engineering, Department of Semiconductor Engineering), Professor Seyoung Kim (Department of Materials Science and Engineering, Department of Semiconductor Engineering), and Ph.D. candidate Seongmin Park (Department of Electrical Engineering), has developed a high-performance AI semiconductor device using indium gallium zinc oxide (IGZO), an oxide semiconductor widely used in OLED displays.

The new device has proven to be excellent in terms of performance and power efficiency.

AI synaptic device operation using oxide semiconductors. The conductance of the read transistor can be controlled by charging/discharging the storage node through the write transistor. Once weight updates are complete, the programmed weights are maintained as the write transistor is turned off. Credit: POSTECH
Efficient AI operations, such as those of ChatGPT, require computations to occur within the memory responsible for storing information. Unfortunately, previous AI semiconductor technologies struggled to meet all of the requirements needed to improve AI accuracy, such as linear and symmetric weight programming and device uniformity.

The research team identified IGZO as a key material for AI computation that could be mass-produced and provide uniformity, durability, and computing accuracy. The compound comprises indium, gallium, zinc, and oxygen in a fixed ratio, and its excellent electron mobility and leakage-current properties have made it a standard backplane material for OLED displays.

Using this material, the researchers developed a novel synaptic device composed of two transistors interconnected through a storage node. Precise control of the node’s charging and discharging speed enables the AI semiconductor to meet the diverse performance metrics required for high-level performance.

Input voltage-dependent potentiation-depression characteristics of the proposed artificial synapse devices. The brain-inspired synapse device based on an oxide semiconductor exhibits excellent linearity and symmetry during the potentiation and depression, while the output current linearly varies with the number of input pulses. The output current is nearly proportional to the input voltage, making it possible to perform AI training and inference tasks with high accuracy and extremely low power consumption. Credit: POSTECH
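
For readers who want to see the mechanism rather than just read it, below is a minimal sketch of a two-transistor (2T) synaptic cell, assuming a simple linear charge-storage model rather than anything taken from the POSTECH paper: each write pulse adds or removes one fixed conductance step on the storage node, and the read current scales with the stored conductance. The class, constants and pulse counts are illustrative only.

```python
# Minimal, illustrative model of a 2T synaptic cell (not the POSTECH device model).
# The write transistor charges/discharges a storage node in fixed steps per pulse;
# the read transistor's conductance (the stored "weight") tracks that charge linearly.

class TwoTransistorSynapse:
    def __init__(self, g_min=0.0, g_max=1.0, n_levels=64):
        self.g_min = g_min                        # minimum read-transistor conductance
        self.g_max = g_max                        # maximum read-transistor conductance
        self.step = (g_max - g_min) / n_levels    # conductance change per write pulse
        self.g = g_min                            # current conductance (synaptic weight)

    def potentiate(self, pulses=1):
        """Charge the storage node: conductance rises linearly with pulse count."""
        self.g = min(self.g_max, self.g + pulses * self.step)

    def depress(self, pulses=1):
        """Discharge the storage node by the same step size (symmetric update)."""
        self.g = max(self.g_min, self.g - pulses * self.step)

    def read(self, v_read=0.1):
        """Read current, assumed proportional to conductance times read voltage."""
        return self.g * v_read


if __name__ == "__main__":
    syn = TwoTransistorSynapse()
    syn.potentiate(pulses=32)        # 32 write pulses -> half-scale weight
    print(round(syn.read(), 4))      # 0.05 with the illustrative constants above
    syn.depress(pulses=16)
    print(round(syn.read(), 4))      # 0.025
```

Under this kind of update rule, the stored weight after N potentiation pulses is simply N equal steps above the minimum, which is the linearity and symmetry the caption above describes.
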
Furthermore, applying synaptic devices in a large-scale AI system requires the output current of each device to be minimized. The researchers confirmed that the ultra-thin film insulators inside the transistors can be used to control this current, making the devices suitable for large-scale AI.

The researchers used the newly developed synaptic device to train and classify handwritten data, achieving a high accuracy of over 98%, which verifies its potential application in high-accuracy AI systems in the future.

Simulation results of MNIST classification using IGZO 2T synaptic devices. High linear and symmetric weight programming characteristics of AI synaptic devices based on oxide semiconductors enable accurate training processes that minimize errors in artificial intelligence systems. In a simulation of random handwriting recognition, these devices achieved an inference accuracy of approximately 98%, which is nearly identical to the accuracy of ideal mathematical calculations. Credit: POSTECH
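
As a rough, hedged illustration of why such linear, symmetric updates help training (this is not the authors' simulator), the sketch below trains a one-layer softmax classifier on scikit-learn's small digits dataset while quantising every weight update to whole "pulses" of a fixed conductance step. The dataset, learning rate and step size are arbitrary choices for illustration.

```python
# Illustrative only: softmax regression on sklearn digits with pulse-quantised updates,
# mimicking a synapse whose conductance changes by the same fixed amount per pulse.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X, y = load_digits(return_X_y=True)
X = X / 16.0                                        # scale pixel values to [0, 1]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

n_in, n_out = X.shape[1], 10
W = np.zeros((n_in, n_out))                         # "synaptic" weight matrix
step = 0.005                                        # conductance change per pulse (illustrative)
lr = 0.1                                            # learning rate (illustrative)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

for epoch in range(30):
    for i in rng.permutation(len(X_tr)):
        x = X_tr[i:i + 1]                           # one sample, shape (1, 64)
        t = np.zeros((1, n_out))
        t[0, y_tr[i]] = 1.0                         # one-hot target
        p = softmax(x @ W)
        grad = x.T @ (p - t)                        # ideal analogue update direction
        pulses = np.round(-lr * grad / step)        # quantise to whole potentiation/depression pulses
        W += pulses * step                          # linear, symmetric update per pulse

pred = softmax(X_te @ W).argmax(axis=1)
print("test accuracy:", round(float((pred == y_te).mean()), 3))  # typically above 0.9 here
```

Even with updates restricted to discrete, equal-sized steps, the classifier usually lands well above 90% on this toy dataset; the qualitative point, which the POSTECH simulation makes at much larger scale, is that accuracy survives when the available weight updates are linear and symmetric.
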
Professor Chung explained, "The significance of my research team's achievement is that we overcame the limitations of conventional AI semiconductor technologies that focused solely on material development. To do this, we utilized materials already in mass production. Furthermore, linear and symmetrical programming characteristics were obtained through a new structure using two transistors as one synaptic device. Thus, our successful development and application of this new AI semiconductor technology show great potential to improve the efficiency and accuracy of AI."
 
  • Like
Reactions: 5 users

Diogenese

Top 20
Is hand gesture technology new?



Mercedes-AMG EQS 53 4Matic Plus Tech Review: Future wrapped in luxury
March 25, 2023 11:38 am | Updated 07:44 pm IST

The Mercedes-AMG EQS 53 4Matic Plus embraces the future with enough tech features to satiate a hardcore technocrat​

NABEEL AHMED


Performance​

Extending from A-Pillar to A-Pillar, the MBUX Hyperscreen in the EQS claims to be the biggest infotainment screen ever mounted in a series-built car. The 56-inch screen comes with 8 CPU cores, 24 GB of RAM with 46.4 GB/s of memory bandwidth, 7 individual profiles, and a 2,432.11 cm² viewing area in a zero-layer concept, which ensures that the most important functions are always on display.


The MBUX Hyperscreen in the EQS extends from A-Pillar to A-Pillar. | Photo Credit: Nabeel Ahmed
The display homogeneously adapts to lighting conditions and permanently displays climate controls for the driver and front passenger. The screen comes with haptic feedback, achieved with the help of eight actuators in the central display and four in the front passenger display. The MBUX also comes with adaptive software offering personalised suggestions to individual driver profiles, which can be created via the Mercedes me app on a smartphone.

Driver display and steering-mounted controls​

The 12.3-inch driver display sits behind the wheel and shows information based on the selected driver profile. We found the display bright and crisp, and its beginner profile settings were especially useful when manoeuvring through narrow urban stretches.

A driver can switch profiles based on the vehicle's performance setting (which includes Sport and Sport+) and change the overall lighting and settings within the cabin, including the fake engine sound pumped through the speakers.


Steering mounted controls allow drivers to switch easily between driving modes. | Photo Credit: Nabeel Ahmed
Additionally, the driver display presents prompts for ADAS features, including active lane-keep assist and adaptive distance keeping from the traffic ahead. The driver display settings can be adjusted using the steering-mounted controls, which come with haptic feedback. The controls are easy to use and provide ample feedback, ensuring that the driver does not need to take their eyes off the road while switching between modes. The side mirrors also present warnings every time a vehicle approaches for an overtake.

The steering also comes with paddle shifters and controls for drive modes.

Central display​

This 17.7-inch touchscreen display sits flush with the dashboard of the vehicle. It comes with access to all the driving, comfort, and infotainment features. The display is bright and comes with adaptive software, which is handy when scrolling through the menu, changing volume settings, and accessing infotainment features while driving.

This software is intuitive and responds well to user inputs based on past usage. We also found the hand gesture recognition feature helpful when opening the sunroof and switching on cabin lights on the go.


The central display also offers a host of luxury features. | Photo Credit: Nabeel Ahmed
The central display also offers a host of luxury features, including seat ventilation and massage features along with interior mood light controls. Seat ventilation settings are also available on the door panels for ease of access.

A welcome addition is the ability to download games onto the screen; these can be accessed when the vehicle is stationary and provide ample entertainment while the vehicle is charging.

Passenger display​

The passenger display, a 12.3-inch touchscreen unit, and the features baked into the central display ensure that front passengers have plenty to keep them engaged.

While it omits the ability to switch between driving modes, the unit lets the passenger input a destination and control the infotainment, seat ventilation and massage functions. The passenger display thus ensures that the driver is not disturbed when changes are made to the passenger seat.

Another useful feature is the dimming of the passenger display when the vehicle detects the driver looking at the passenger screen. The screen can detect the presence of a front passenger and will only switch on if there is one.

Rear seat tablet​

As an AMG automobile, the Mercedes EQS focuses more on the driver than the passengers, but it does not leave rear passengers empty-handed: they get their own tablet in a dedicated spot in the armrest.


The rear seat of the AMG EQS gets a fully functional Android tablet. | Photo Credit: Nabeel Ahmed
The fully functional Android tablet comes with controls for rear seat reclining and ventilation, along with integration with the MBUX system. The tablet can also be used for content consumption and has all the functionality users expect from an Android tablet.

Ease of life features​

Along with multiple screens, hand gestures, adaptive software and ADAS, the Mercedes EQS also comes with an inbuilt air purifier with a HEPA filter to remove airborne pollutants. During our review, the vehicle was able to start the purifier and climate control when it detected the key in its vicinity, ensuring clean and comfortable breathing upon entry. The vehicle was even able to keep AQI readings within reasonable limits when driven with an open sunroof.

Somewhere in this forest of posts (try the nViso thread), there is a branch which discusses Nviso having an AI App which is built on MetaTF and which nViso said was in use.

Is this too big a stretch ...

"The MBUX also comes with adaptive software offering personalised suggestions to individual driver profiles, which can be created via using the Mercedes me app on a smartphone"

... and how do they recognize hand gestures?

"This software is intuitive and responds well to user inputs based on past usage. We also found the hand gesture recognition feature helpful when opening the sunroof and switching on cabin lights on the go."

... not to mention lane keeping etc.
 
  • Like
  • Love
  • Fire
Reactions: 22 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Hey Gang!

Remember back on the 7th Feb 2023 NVISO announced the following:

NVISO Japan and Privately SA join forces to bring advanced age verification technology to digital kiosks. The solution offers secure, safe and engaging experiences while maintaining privacy and enhancing interactivity for digital kiosks and interactive avatars.​


Well, I just noticed that David Miles (Director of Safety Policy at Meta) "liked" Deepak Tewari's (Privately SA) LinkedIn post about the announcement!

That is Meta as in formerly known as Facebook!

B 💋


Returning to this whole Meta connection (as above), I think it's quite interesting and makes a lot of sense when you think about Meta's plans for the Metaverse through VR, AR, XR and the use of digital avatars. Cameras inside Meta's VR headsets track eye and face movements, but they also raise new privacy questions in terms of how much personal data is being harvested from the device to the cloud.

Here's an extract from an article that explains the privacy issue really well.

[Attached screenshots of the article extract: Screen Shot 2023-03-26 at 3.14.28 pm.png, Screen Shot 2023-03-26 at 3.28.14 pm.png]

Back on the 24th Jan 2023 NVISO also signed a digital avatar agreement with Unith. The announcement stated: "The integrations will be designed to deliver interactive digital avatars capable of understanding various behavioural and emotional facets of users through eye tracking and facial detection".

So thinking about the big picture in all of this, I imagine that NVISO, Privately SA and Unith are all teaming up (along with BrainChip's help through NVISO) to overcome the privacy issues thanks to AKIDA, by preventing personal data from eye tracking etc. from leaving the device. So no wonder the Director of Safety Policy at Meta was interested in the collaboration announcement between NVISO and Privately SA, as privacy concerns would be under his remit IMO.

IMO. DYOR.


 

  • Like
  • Fire
  • Wow
Reactions: 11 users

zeeb0t

Administrator
Staff member
cool idea and tool

I tested a few questions and they came up as expected.

I also tested a few questions that had me asking more questions.
One Q I asked related to 'who are BRN customers and who are they partnering with'. The results of this came up a little short IMO, so I asked one more question re one company that seemed to be missing.
I'd be very concerned if Megachips had pulled out lol

Early days I know
View attachment 32986

What did you search for? I searched for "Is Megachips partnered with Brainchip?" and it answered with: "Yes, Megachips is partnered with Brainchip. They are incorporating the Akida technology into their ASIC solutions service, enabling the development and support required to design and manufacture integrated circuits and systems on chips with intelligence that will drive AI into the next generation of edge-based devices."

I also asked: "What relationship does Megachips and Brainchip have together?" and it answered with: "Megachips is incorporating Brainchip's Akida technology into its ASIC solutions service, enabling the development and support required to design and manufacture integrated circuits and systems on chips with intelligence that will drive AI into the next generation of edge-based devices. Therefore, Megachips and Brainchip have a partnership where Megachips utilizes Brainchip's technology through designing the Akida IP into system-on-chip (SoC) products that will be ready-to-use systems or implementing Akida silicon into ready-to-use modules."

Maybe I am just 'asking the right questions'... can you share your 'wrong' question with me? It's cut off in the picture!
 
  • Like
Reactions: 1 users

zeeb0t

Administrator
Staff member
Hi Zeeb0t.

Even though the NASA logo is on the website's scrolling banner, when I ask if NASA is associated with Brainchip the app doesn't recognise it.
Perhaps because it's a logo, or embedded in the scrolling mechanism?

Brainchip Inc​

Question to ask:


Getting an answer for you now, one moment...

There is no information on the website that suggests a direct association between Brainchip and NASA. The website only mentions Brainchip's mission to make every device with a sensor AI-smart and its leadership in edge AI. It also provides information about the company's products, technology, markets, partners, and team members, including its board of directors and scientific advisory board.

To answer your question - sadly, they've provided no alt text indicating what the NASA logo means, or its relevance at all. The alt text for the image is just the image file name.

If that were updated, or, I equipped it with image recognition software (out of scope for time being, but I wouldn't say never!) then it would understand. For now, they simply provide no information other than a visual 'clue' in the code of the site.
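
To make that concrete, here is a tiny sketch of what a text-only crawler "sees" when a logo's alt attribute is just the file name. The HTML below is made up, not BrainChip's actual page source; it only mirrors the situation described, so the extracted text carries no mention of NASA at all.

```python
# Hypothetical markup, not BrainChip's real HTML: one logo with a filename-style alt,
# one with a descriptive alt. Only the latter yields text a Q&A model can answer from.
from bs4 import BeautifulSoup

html = """
<div class="partner-banner">
  <img src="/img/logo-03.png" alt="logo-03.png">
  <img src="/img/megachips.png" alt="MegaChips - BrainChip ASIC partner">
</div>
"""

soup = BeautifulSoup(html, "html.parser")
for img in soup.find_all("img"):
    alt = img.get("alt", "")
    # A filename-style alt gives the question-answering layer nothing to index;
    # descriptive alt text becomes retrievable evidence.
    useful = bool(alt) and not alt.lower().endswith((".png", ".jpg", ".svg"))
    print(f"{alt!r:45} -> usable text: {useful}")
```

If the site added descriptive alt text to the scrolling banner, the association would show up in the extracted text and become answerable without any image recognition.
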
 
  • Fire
  • Love
  • Like
Reactions: 3 users

Boab

I wish I could paint like Vincent
Somewhere in this forest of posts (try the nViso thread), there is a branch which discusses Nviso having an AI App which is built on MetaTF and which nViso said was in use.

Is this too big a stretch ...

"The MBUX also comes with adaptive software offering personalised suggestions to individual driver profiles, which can be created via using the Mercedes me app on a smartphone"

... and how do they recognize hand gestures?

"This software is intuitive and responds well to user inputs based on past usage. We also found the hand gesture recognition feature helpful when opening the sunroof and switching on cabin lights on the go."

... not to mention lane keeping etc.
Not sure if this is relevant? I had it on file. Does it dismiss our chances of being part of this? NB: from Dec 2022
Gesture.jpg
 
  • Thinking
Reactions: 2 users

Diogenese

Top 20

Boab

I wish I could paint like Vincent
  • Haha
Reactions: 8 users

Tothemoon24

Top 20

NATO preps tech competition to solve real-world security issues​

By Vivienne Machi
Mar 25 at 01:42 AM

(liuzishan/Getty Images)
STUTTGART, Germany — NATO’s nascent defense technology accelerator is preparing to launch the first several competition-style programs, meant to help the alliance find solutions to emerging technology problems.
NATO plans to begin the initial “challenges” under the Defence Innovation Accelerator for the North Atlantic, or DIANA, initiative this fall, the alliance announced March 22. The events will pit competitors against each other to solve real-world security issues, such as operating in a GPS-denied environment, officials previously explained.

A call for participants to apply for the challenges is to take place in mid-2023, and up to 30 participants are to be selected for each challenge. The number of challenges and awardees is expected to grow annually until the program is fully operational by 2025, according to NATO.
The first challenges will take place at the following five accelerator sites, pending the conclusion of contractual arrangements:
  • Tehnopol in Tallinn, Estonia.
  • Officine Grandi Riparazioni in Turin, Italy.
  • BioInnovation Institute in Copenhagen, Denmark.
  • MassChallenge in the U.S. city of Boston, Massachusetts.
  • Pacific Northwest Mission Acceleration Center in the U.S. city of Seattle, Washington.

NATO members formally agreed to launch the DIANA initiative in April 2022, along with an initial €1 billion (U.S. $1.1 billion) investment in an innovation fund. The goal is for DIANA to help the alliance speed up trans-Atlantic cooperation on technologies such as artificial intelligence, big-data processing, energy and propulsion, autonomy, space, biotechnology, hypersonics, and quantum advancements.
DIANA will also build a network meant to support defense- and national security-focused tech startups that are developing NATO-relevant efforts through grant programs.
In December 2022, the DIANA board of directors agreed that energy resilience, secure information sharing, as well as sensing and surveillance would be the initiative’s technology priorities for 2023. Barbara McQuiston, deputy chief technology officer for science and technology at the U.S. Defense Department, is chair of the board. Imre Porkoláb, ministerial commissioner for defense innovation at Hungary’s Defence Ministry, serves as vice chair.
The board also recently agreed to add 28 new “deep-tech test centers” to the entity’s footprint, alongside two new accelerator sites in North America. That brings the total number of test centers to 91, with 11 accelerator sites, the alliance said March 22. That footprint is expected to continue to grow, according to NATO officials.

The Europe-based accelerator sites are currently located in London, England; Copenhagen; Brussels and Wallonia in Belgium; Tallinn; Turin; Prague, Czech Republic; Ankara, Turkey; Lisbon, Portugal; as well as Athens and Heraklion in Greece. In the United States, Boston and Seattle each host an accelerator site.
France intends to give DIANA access to national technology accelerators drawn from across its domestic innovation sector, according to NATO.
Meanwhile, the following countries have offered to link up existing and new national test centers to DIANA’s network: Belgium, Bulgaria, Denmark, Estonia, France, Germany, Greece, Hungary, Italy, Latvia, Lithuania, the Netherlands, Norway, Poland, Portugal, Romania, Slovakia, Slovenia, Spain, Turkey, the United Kingdom and the United States.
 
  • Like
  • Fire
Reactions: 7 users

Diogenese

Top 20
No, it's just a screen shot and my memory is also.....shot
Thanks Boab,

It just seems odd that it would deny Akida's capability for gesture recognition when that has been one of the earliest Akida demos.

Maybe it's not in the MetaTF version of Akida incorporated in the nViso app?
 
  • Like
Reactions: 6 users

Boab

I wish I could paint like Vincent
Thanks Boab,

It just seems odd that it would deny Akida's capability for gesture recognition when that has been one of the earliest Akida demos.

Maybe it's not in the MetaTF version of Akida incorporated in the nViso app?
Found it. Page 3
 

Attachments

  • NVISO_December_2022_SDK_News_Release (1).pdf
  • Like
Reactions: 4 users

zeeb0t

Administrator
Staff member
  • Like
  • Fire
  • Love
Reactions: 25 users

Violin1

Regular
Indeed. However, that all being said, this is kind of the point - it uses the company website as its source :) now if they aren’t factual, that’s a whole other thing.
Hi. So can I just substitute Toshiba for Brainchip and off we go?
 
  • Like
  • Haha
Reactions: 3 users

zeeb0t

Administrator
Staff member

Violin1

Regular
Sorry what exactly do you mean?
Sorry, got excited. If I want to try to use it to crawl other company websites now, can I do that? I don't want to start playing and misuse it before you give the ok.
 
  • Like
  • Haha
Reactions: 2 users
Indeed. However, that all being said, this is kind of the point - it uses the company website as its source :) now if they aren’t factual, that’s a whole other thing.

Aaarh, aren’t I the foolish one, talking before knowing the subject matter fully.

I’m sure Dio will have a quote similar to “Better to stay silent and have people think you’re stupid rather than open your mouth and prove it!” 😂

Sounds promising then @zeebot

I’ve been at work all day and haven’t had time to use it yet so I look forward to giving it a workout.

It’ll be nice to have a tool to help sort fact from fiction!

Cheers!
 
  • Like
  • Fire
  • Love
Reactions: 8 users

zeeb0t

Administrator
Staff member
Sorry, got excited. If I want to try to use it to crawl other company websites now, can I do that? I don't want to start playing and misuse it before you give the ok.

Oh, I see what you mean now. No, you can't do that yourself - this example is tuned only to the Brainchip website. However, it is very easy for me to turn on additional websites. I have a few others in the building sector (typically they have terrible websites, so it's a great test-case) which work extremely well. Message me if you'd like the links.

The idea behind this is companies would pay me to have access to the tool via their website, online help / chat bot, email responders, etc.
 
  • Like
  • Fire
  • Love
Reactions: 19 users

Learning

Learning to the Top 🕵‍♂️
Thanks Perhaps,
Post in thread 'NVISO / BRN' https://thestockexchange.com.au/threads/nviso-brn.30719/post-267244

Post in thread 'NVISO / BRN' https://thestockexchange.com.au/threads/nviso-brn.30719/post-267347

Post in thread 'BRN Discussion Ongoing' https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-267468

I highly recommend everyone re-read NVISO's November 30, 2022 presentation.

As Falling Knife has pointed out.
Screenshot_20230326_170848_Chrome.jpg

So NVISO is porting 4 NVISO AI Apps to Brainchip's Akida in 3 months, and the presentation was on 30 November 2022.

Today is 26 March 2023, so they should be available already.

Screenshot_20230326_170152_Samsung Notes.jpg

So the neuromorphic module is priced at $80-$300 (available 2023).
Should Brainchip expect some revenue from NVISO soon? Asking for a friend 😎.

"Exiting times"

Learning 🏖
 
  • Like
  • Fire
  • Love
Reactions: 51 users

Esq.111

Fascinatingly Intuitive.
Good morning Esq 111
Hope all is well mate, any chance of a Sunday tune?
Something to keep me rolling while exercising

🤘
Afternoon Tothemoon24,

Apologies for the delayed return.

-2GTHR - Kololova - Buenos Aires ( Enero 2022 )
11th September 2022.
Club Morocco.
2:58:27

* Sorry, I've forgotten how to post a direct link.

Enjoy.

Regards,
Esq.
 
  • Like
  • Fire
  • Love
Reactions: 6 users

Learning

Learning to the Top 🕵‍♂️
Afternoon Tothemoon24,

Apologies for the delayed return.

-2GTHR - Kololova - Buenos Aires ( Enero 2022 )
11th September 2022.
Club Morocco.
2:58:27

* Sorry, I've forgotten how to post a direct link.

Enjoy.

Regards,
Esq.


Learning 🏖
 
  • Fire
  • Love
  • Like
Reactions: 6 users