BRN Discussion Ongoing

The Pope

Regular
Just my thoughts with the SP going down on many companies. I also get a gut feeling that a number of investors across many stocks have unfortunately had to sell due to rising hardship from the rapid rise in interest rates, and possibly margin lending on shares, with banks calling it in and forcing investors to either top up loans or sell shares. Yeah, we still have shorters and manipulators going at it hammer and tong as well.
Also, 800,000-plus mortgages are coming off fixed rates over the next six months, facing 10 interest rate rises in one hit. Interesting times; who knows what people may do if there are any more interest rate rises, job losses, etc.
I recall someone posted yesterday or today that they had to sell 500,000 BRN shares, possibly for the reasons I have noted above.
All the best to all BRN retail investors, and I hope we are all rewarded for holding our BRN shares through these uncertain times with the world economy.
 
  • Like
  • Love
Reactions: 23 users

HopalongPetrovski

I'm Spartacus!
"with SP going down on many companies"

Thank you for your lovely thoughts there, Mr Pope, and even though I'll turn 66 years old later this year, if I survive that long, there still lives within me a sniggering 12-year-old. Is it just me, or do some of us just never grow up? 🤣
 
  • Like
  • Haha
  • Love
Reactions: 30 users

The Pope

Regular
All good. Yep always amusing even when we get older. Lmfao
 
  • Like
  • Haha
Reactions: 9 users

cosors

👀
Last edited:
  • Haha
  • Like
Reactions: 16 users

Deadpool

hyper-efficient Ai
Bloody typical - you couldn't trust Australians to make a tin boat let alone a submarine, and now we can't even get disparity right.
Hi Dodge, I believe you are a very accomplished comedian in a parallel universe, and it's coming through in this one. Jocularity is the spice of life, in my opinion.
 
  • Like
  • Haha
  • Love
Reactions: 13 users

Slade

Top 20
“Get ready to unveil our AI/ML services as they are set to revolutionize the industry. Our range of products and services can make it easy for you to realize your goals.”

View attachment 32904

Very interesting in light of BrainChip’s recent announcement.

BrainChip expands its Ecosystem with Teksun to Bring the Akida Processor to Next-Generation AIoT Devices​

ISC West is a massive Security trade event.



More from Teksun:
"It's time to prepare for the leading comprehensive & converged security trade event! While we have buckled ourselves up to present you with a few cutting-edge technological solutions in the field of embedded hardware.
With the potential to transform the many verticals like Drones & Robotics, Cybersecurity & Connected IoT, Smart homes, and more find out which solutions we have that can transform your business for the better. Mark your calendars from 28 - 31 March 2023 for ISC West 2023. Let's catch up and explore all the exciting new technologies."
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 20 users

Learning

Learning to the Top 🕵‍♂️
Thanks latnrollah,

At market launch of the new model series, the following options will be available:

1/Templates: With the help of these templates, customers can experiment with the basic types of support that are possible with the help of artificial intelligence. Examples here include "Cold days" (switching on the seat heating at a certain outside temperature, ambient lighting changes to warm orange) or "Date night" (romantic music is played via Bluetooth audio, ambient lighting turns pink). The templates are mostly stored in the backend. Some are stored locally in the vehicle, so they can be used even if there happens to be no connectivity. The templates are operated via the vehicle's central display and are arranged in carousel-style on the screen.


AI-generated routines[17]:
In the future, the aim is for the E-Class to fully automate recurring routine tasks if desired. Artificial intelligence (AI) will make this possible. The vehicle's ability to learn and evolve with the customer will represent a new level of intelligence. In the first stage of expansion, Mercedes-Benz will offer AI-generated routines for the driver and the seating system (ventilation, heating, massage). Other interior systems are to be successively integrated and further routines made possible.


Then what is [17]???

[17]The functions described represent visions of the future, some of which are not yet available and will depend on the respective vehicle model, the individual configuration and the particular market.

From the Mercedes Feb 22 release, the Routines function seems like an added option with Akida. What is everyone's take on this?

Learning 🏖
Now let's revisit the AI-generated routines soon to be in the new Mercedes.

Screenshot_20230324_220534_LinkedIn.jpg


"Just like a fine wine" the car's software will gets better over time.

So just think about all that's for a minute.

I do understand that's Mercedes is talking about software and update overtime to make improvements of it software to become intelligence. BUT, to run the software, hardware needs to be in the Mercedes; so we have the main CPU (Nvidia).

But remember this?

"BrainChip already is seeing some success. It’s Akida 1000 platform is being used in Mercedes-Benz’s Vision EQXX concept car for in-cabin AI, including driver and voice authentication, keyword spotting and contextual understanding."


Brainchip's Akida technology is to make "sensor smarter". So with Mercedes AI-generated routines. Being implement in the new Mercedes. I do think AKIDA is processing driver and voice authentication, keyword spotting and contextual understanding (learning driver behaviours) at the sensor (process relevant data). Sent it to the main CPU (Nvidia) and the CPU sent it to the cloud (Data Center). Data center process the data and make improvements on the software to make the software more intelligent. Then software updates in the car. So that how it's 'gets better over time.
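To make that concrete, here is a toy sketch of the loop I have in mind, with the "Cold days" template standing in as the routine. Every type and function name is invented for illustration; none of this is a real Mercedes or BrainChip API:

```cpp
#include <iostream>
#include <string>
#include <vector>

// Hypothetical event from a neuromorphic processor sitting at the sensor:
// only sparse, already-classified results leave the sensor, not raw audio/video.
struct CabinEvent {
    std::string kind;   // e.g. "cold_cabin", "driver_id:alice", "keyword:navigate"
    float confidence;   // classifier confidence, assigned at the sensor
};

// Edge tier: stands in for always-on inference next to the sensor.
std::vector<CabinEvent> akida_infer_at_sensor() {
    return {{"cold_cabin", 0.97f}, {"driver_id:alice", 0.93f}};
}

// Head-unit tier: the central CPU executes whatever routines the current
// software defines, e.g. the "Cold days" template from the press release.
void central_cpu_act(const std::vector<CabinEvent>& events) {
    for (const auto& e : events)
        if (e.kind == "cold_cabin" && e.confidence > 0.9f)
            std::cout << "routine: seat heating on, ambient light warm orange\n";
}

// Cloud tier: usage logs feed retraining; improvements come back as an update.
void upload_usage_log(const std::vector<CabinEvent>&) {}
bool ota_update_available() { return false; }

int main() {
    // One pass of the loop; a real system would run this continuously.
    auto events = akida_infer_at_sensor();  // 1. infer at the sensor
    central_cpu_act(events);                // 2. central CPU runs the routines
    upload_usage_log(events);               // 3. cloud learns from usage...
    if (ota_update_available()) {
        // 4. ...and the car "gets better over time" via a software update.
    }
    return 0;
}
```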

Just my intoxicated thought (DYOR)
-------

On another note, in the Next Platform article:

"BrainChip also is working with SiFive to integrate the Akida technology with SiFive’s RISC-V processors for edge AI computing workloads and MosChip, running its Akida IP with the vendor’s ASIC platform for smart edge devices."

Good old MosChip...

Learning 🏖 🍺😎
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 44 users
Learning said: "Good old MosChip…"
This cameo by MOSCHIP comes out of left field. The 1000 Eyes has homework for the weekend.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Haha
Reactions: 10 users

Deadpool

hyper-efficient Ai
If the below story is accurate, I'm not entirely sure that we should be getting into bed with this company anyway.
 
  • Like
  • Thinking
Reactions: 3 users
FF said: "This cameo by MOSCHIP comes out of left field…"

The article is quite dated. May 2022. I recall there was some excitement at the time re MosChip, but very little info since.
Good article by the way!

😀
 
  • Like
Reactions: 6 users

Slade

Top 20
Learning said: "…to run the software, hardware needs to be in the Mercedes; so we have the main CPU (Nvidia)."
Interesting

MAR 22, 2023

Robocar No Longer Drives Nvidia GTC​

At Nvidia’s GPU Technology Conference (GTC) this week, the shiniest new object was ChatGPT. Autonomous driving is already an old story.

What’s at stake:
For years, Nvidia has hung its hat on autonomous driving as the linchpin of its AI technologies. However, that revenue stream is waning, not because AV is a solved problem but because it is just too hard a problem to solve. What’s the next big AI application?

GTC, put together by Nvidia, is one of the world’s premier conferences dedicated to AI developers. Nvidia has used the venue to showcase its AI prowess built on GPU technology.

For several years, AI-enabled autonomous driving has highlighted every GTC. Nvidia presented its AI solutions — deployed in data centers for AI training and its multi-thousand teraflops SoC inside vehicles’ central brains doing AI inference.

However, it was evident in a pre-GTC briefing this week that Nvidia has begun singing a markedly different tune on fully automated driving.

Offering a more measured and somewhat skeptical view on autonomous driving – unprecedented for Nvidia – Ali Kani, vice president and general manager of the automotive team, called fully autonomous vehicles “the next decade’s problems.”

Given that “amazing companies with multibillion dollar investment, like Argo.ai, have gone under,” Kani said that it is “very reasonable” for OEMs to focus on making ADAS better and treating L4 as a long-term goal.

Further, he cautioned that autonomous driving “isn’t like AI development demo – as a kind of a fun little thing I can show.” People’s lives are at stake, and “there is no room for error,” he added.

“Robotaxi companies, trying to make it work in one city, are having challenges … and they are the best tech companies in the world,” he said. To make matters worse, “they’re trying to solve the problem worldwide in a passenger car … You can’t map that world every week. It’s impossible. What they must solve is an entirely different problem an order of magnitude harder.”

In other words, reality bites – even at GTC where AI always seems invincible.

ChatGPT, here it comes
Fast forward to the keynote speech Nvidia founder and CEO Jensen Huang delivered Tuesday.

Huang stressed, “the impressive capabilities of generative AI have created a sense of urgency” among companies eager to reimagine products and business models.

With computing advancing at “lightspeed,” Huang discussed a broad set of partnerships crucial to build an ecosystem for everything from training to deployment for cutting-edge AI services. These encompass Google, Microsoft, Oracle and a range of leading businesses that bring new AI, simulation and collaboration capabilities to every industry.

In other words, shelve autonomous driving, think ChatGPT.

Nvidia’s automotive business
Put simply, Nvidia is no longer counting on AVs to sell its automotive SoCs.

Instead, it has thrown an arsenal of technologies at automakers, urging them to use AI to support software development and plan new automotive plants.

The pitch at GTC to the automotive industry was Nvidia’s Omniverse platform and AI. Nvidia is now saying that automakers can leverage both Omniverse and AI to improve their “product lifecycle from concepts and styling, design and engineering, software and electronics, smart factories, autonomous driving and retail.”

According to announcements at GTC, Nvidia is working with the BMW Group – not on the vehicle front, but on vehicle production infrastructure. By building and operating industrial metaverse applications across BMW Group’s production network, Nvidia claims that it is helping BMW on an electric vehicle plant in Debrecen, Hungary, projected to open in 2025.

As for the central compute SoC, Nvidia’s automotive SoCs (such as Orin or Thor) won’t be powering BMW’s next-generation models. Instead, it will be Qualcomm’s automotive SoC platform.

Nonetheless, one big design win Nvidia announced at GTC this week is BYD. China’s leading EV company is building its mainstream vehicles on Nvidia’s Drive Orin SoC.

Other tier ones and OEMs are also adopting Nvidia’s platform for next-gen vehicle development, claimed Nvidia. They include DeepRoute, Pony.ai, Lenovo, Rimac Technology and smart. However, not every customer is embracing both hardware and software designed by Nvidia.

Nvidia said that carmakers are taking advantage of Nvidia’s flexible offerings. Nvidia’s platform is “open,” with parts of it available in “modules.” Some customers are building their own cockpit application software. Others are even developing their own autonomous driving stacks.

All said and done, Nvidia estimated that its automotive design win pipeline has increased to $14 billion over the next six years (2023 – 2028 inclusive). What matters to its automotive revenue isn’t just accumulating design wins inside vehicle models, but the support infrastructure Nvidia’s AI and Omniverse can extend to automakers.

Bottom line:
More automakers are backing off autonomous vehicles and opting for improved ADAS. In this environment, Nvidia’s automotive SoCs could prove to be overkill. Nvidia’s revised AI strategy for the auto industry appears to be cultivating an alternative path – supporting OEMs’ production and product lifecycles. It is hardly clear, however, if this will prove more lucrative than pursuing design wins for Nvidia’s automotive SoC design in every AV model.
 
  • Like
  • Fire
  • Wow
Reactions: 12 users
It has probably/certainly already been posted but I can't find it here via the search.

Brainchip Readies Generation 2 Akida AI | Gestalt IT Rundown: March 8, 2023​


https://gestaltit.com/rundown/tom/b...n-2-akida-ai-gestalt-it-rundown-march-8-2023/

Yeah, it's been posted before and straight off the bat, these knobs don't know enough about the technology to be able to comment..

AKIDA1000 was "one size fits all"?

Not sure what their credentials are, but they're knobs..
 
  • Haha
  • Like
  • Love
Reactions: 10 users

Wags

Regular
"with SP going down on many companies"

Thank you for your lovely thoughts there Mr Pope and even though I'll turn 66 years old later this year if I survive that long, there still lives within me a sniggering 12 year old. Is it just me or do some of us just never grow up? 🤣
Hoppy, Growing old is mandatory, Growing up is optional..........
 
  • Haha
  • Like
Reactions: 16 users

Diogenese

Top 20
Hoppy, Growing old is mandatory, Growing up is optional..........
Someone gave me a T-shirt with:
"When is this 'old enough to know better' thing supposed to kick in?'
 
Last edited:
  • Haha
  • Like
Reactions: 16 users

jtardif999

Regular

ARM adds custom instructions to M85 controller in RISC-V AI fightback​

Technology News | March 13, 2023
By Nick Flaherty

ARM has developed a new version of its high end Cortex-M85 microcontroller, allowing chip designers to add custom instructions for AI applications.​

This has been one of the key advantages of the competing RISC-V instruction set and signals a fight back by ARM ahead of the Embedded World (EW2023) exhibition in Nuremberg, Germany, this week.


Renesas Electronics will be showing the first live demonstrations of artificial intelligence (AI) and machine learning (ML) implementations on the previous generation M85 design.

This first demonstration showcases a people detection application developed in collaboration with UK ML startup Plumerai that identifies and tracks persons in the camera frame in varying lighting and environmental conditions.

The second demo showcases a motor control predictive maintenance use case with an AI-based unbalanced load detection application using Tensorflow Lite for Microcontrollers with CMSIS-NN.
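For anyone unfamiliar with the stack named in that demo, this is roughly what a TensorFlow Lite for Microcontrollers invocation looks like on a Cortex-M. A hedged sketch only: the model data, op list, arena size and output shape are placeholders, and the CMSIS-NN optimised kernels are chosen when TFLM is built with that backend, not in application code (constructor arguments also vary a little between TFLM versions):

```cpp
#include <stdint.h>

#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

// Placeholder: flatbuffer produced by the TFLite converter, linked into flash.
extern const unsigned char g_model_data[];

// Scratch memory for tensors/activations; the right size is model-dependent.
constexpr int kArenaSize = 32 * 1024;
static uint8_t tensor_arena[kArenaSize];

// Returns the winning class index, or -1 on failure.
int run_inference(const int8_t* input_data, int input_len) {
    const tflite::Model* model = tflite::GetModel(g_model_data);

    // Register only the ops the model actually uses; on Cortex-M these map to
    // CMSIS-NN optimised kernels when TFLM is compiled with that backend.
    tflite::MicroMutableOpResolver<3> resolver;
    resolver.AddConv2D();
    resolver.AddFullyConnected();
    resolver.AddSoftmax();

    tflite::MicroInterpreter interpreter(model, resolver, tensor_arena,
                                         kArenaSize);
    if (interpreter.AllocateTensors() != kTfLiteOk) return -1;

    // Copy quantised input (e.g. vibration samples for load detection).
    TfLiteTensor* input = interpreter.input(0);
    for (int i = 0; i < input_len; ++i) input->data.int8[i] = input_data[i];

    if (interpreter.Invoke() != kTfLiteOk) return -1;

    // Argmax over the classes, e.g. "balanced" vs "unbalanced" load.
    TfLiteTensor* output = interpreter.output(0);
    int best = 0;
    for (int i = 1; i < output->dims->data[1]; ++i)
        if (output->data.int8[i] > output->data.int8[best]) best = i;
    return best;
}
```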


The latest version of the M85 core, Cortex-M85 r1, includes ARM Custom Instructions (ACI) which allow designers to include custom defined data processing instructions directly on the controller. These were first proposed in 2019 in the ARMv8.1 architecture but are only now being implemented in designs, in part to address the increasing popularity of cores based on the customisable RISC-V instruction set.

“With ACI on Cortex-M85, the user does not have to turn to alternative architectures to implement a desired instruction encoding. Instead, this can now be done on CPUs that are based on the ARM architecture, with Cortex-M85 being the first high-performing microcontroller to provide this option,” said ARM. “Through ACI, the user is given the power to innovate within the proven Arm architecture, while maintaining the ecosystem advantages of the Cortex-M CPUs.”

ACI is suitable for applications using specialized bit field processing, trigonometric functions, and image pixel manipulations and can also be applied to accelerate frequently used data processing functions. It is also being implemented on the Cortex-M55 core.
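To illustrate what ACI means at the source level: the custom instructions are reached from C/C++ through the Custom Datapath Extension (CDE) intrinsics defined by ACLE. A minimal sketch, assuming a toolchain that ships arm_cde.h and a device whose coprocessor 0 implements something useful behind immediate 0; what the instruction actually computes is decided by the silicon designer, not by this code, and the "pixel blend" below is purely illustrative:

```cpp
// Build with something like: -mcpu=cortex-m85 -march=armv8.1-m.main+cdecp0
#include <stdint.h>
#include <arm_cde.h>

// CX3 issues a custom instruction with two source registers on coprocessor 0;
// the immediate selects which custom operation the datapath performs.
static inline uint32_t custom_pixel_blend(uint32_t fg, uint32_t bg) {
    return __arm_cx3(0, fg, bg, 0);
}

void blend_row(uint32_t* dst, const uint32_t* fg, const uint32_t* bg, int n) {
    // Without ACI this per-pixel operation would be a multi-instruction
    // sequence; with it, one custom instruction per pixel.
    for (int i = 0; i < n; ++i)
        dst[i] = custom_pixel_blend(fg[i], bg[i]);
}
```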

AI demonstrations​

In the meantime, Renesas is taking advantage of the Helium technology in the core which will be part of the RA (Renesas Advanced) family of MCUs.

Helium is ARM’s M-Profile Vector Extension, available as part of the Armv8.1M architecture and provides a boost for machine learning (ML) and digital signal processing (DSP) applications, accelerating applications such as endpoint AI.
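As a rough illustration of what Helium buys an inference engine, here is a generic MVE dot-product kernel written with the ACLE intrinsics, the kind of inner loop ML and DSP workloads spend most of their time in. Textbook MVE, not Renesas' or Plumerai's actual code:

```cpp
// Build with something like: -mcpu=cortex-m85 -mfloat-abi=hard
#include <stdint.h>
#include <arm_mve.h>

int32_t dot_s32(const int32_t* a, const int32_t* b, uint32_t n) {
    int32_t acc = 0;
    for (uint32_t i = 0; i < n; i += 4) {
        // Tail predication: lanes past the end of the array are masked off,
        // so no scalar clean-up loop is needed.
        mve_pred16_t p = vctp32q(n - i);
        int32x4_t va = vldrwq_z_s32(a + i, p);
        int32x4_t vb = vldrwq_z_s32(b + i, p);
        // Multiply active lanes, sum across the vector, accumulate into acc.
        acc = vmladavaq_p_s32(acc, va, vb, p);
    }
    return acc;
}
```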

“We’re proud to again lead the industry in implementing the powerful new Arm Cortex-M85 processor with Helium technology,” said Roger Wendelken, Senior Vice President in Renesas’ IoT and Infrastructure Business Unit.

“By showcasing the performance of AI on the new processor, we are highlighting technical advantages of the new platform and at the same time demonstrating Renesas’ strengths in providing solutions for emerging applications with our innovative ecosystem partners.”

“We’re excited to take part in this ground-breaking demonstration,” said Roeland Nusselder, CEO of Plumerai. “ARM’s Helium technology supported on the new RA MCUs with the Cortex-M85 core significantly accelerates the Plumerai inference engine.”

“This performance uplift will enable our customers to use larger and more accurate versions of Plumerai’s People Detection AI, add additional product features, and extend battery life,” he said. ”Our customers have an insatiable appetite for adding new and more accurate AI features that run on a microcontroller. Together with Renesas, we are the first to fulfil this demand.”

The Cortex-M85 core supports ARM TrustZone alongside a Renesas integrated cryptographic engine, immutable storage, key management and tamper protection against DPA/SPA side-channel attacks. The Armv8-M architecture also brings Pointer Authentication/Branch Target Identification (PAC/BTI) security extension, a new architectural feature that provides enhanced mitigation from software attack threats and helps achieve PSA Certified Level 2 certification.

The new RA MCUs based on the Cortex-M85 core will be supported by Renesas’ Flexible Software Package (FSP). This enables faster application development by providing all the infrastructure software needed, including multiple RTOS, BSP, peripheral drivers, middleware, connectivity, networking, and security stacks as well as reference software to build complex AI, motor control and graphics solutions.

It also allows developers to integrate their own legacy code and choice of RTOS with FSP, providing full flexibility in application development.
Maybe we can deduce a few things about whether or not Akida IP lurks in the Renesas MCU based on the Cortex-M85 with Helium:

1) The Helium tech performs vector extensions and, together with the M85, acts in a similar way to SiFive's X280 series chip.
2) Both the X280 and the M85 with Helium do better with a dedicated AI processor performing the inference.
3) The Renesas demo only mentions the M85, the Helium tech and the Plumerai model as the components for the demo. There is no mention of any AI hardware.
4) If Renesas were using their own AI hardware, such as the DRP for example, they would have been eager to tell all about it - I would think no detail would be spared. Besides, I'm not sure they could embed the DRP as IP with the M85 - they may only be able to utilise it as a self-contained chip (like the AKD1000, for example).
5) The week before Renesas showcased AI working on the M85, ARM announced the addition of the custom instructions (ACI) to optimise data processing on the M85.
6) As has already been said, there is the coincidence of BrainChip announcing compatibility of the Akida product family IP with the M85 in the same week as the Renesas announcement.

As others have said, nothing is proven, but gee, there's a bit of a feel to it, don't you think? If this turns out to be true, then we probably have the announcement we are looking for, for some kind of a rerate in the share price. This would be amazing. AIMO.
 
  • Like
  • Love
  • Fire
Reactions: 36 users

Beebo

Regular
It'll be interesting to know whether Mercedes will let its drivers easily take their experienced cockpit with them when they flip their cars.
 
  • Like
  • Thinking
  • Love
Reactions: 6 users

Beebo

Regular
This also opens up the possibility of a subscription-based model where, for a fee, a driver can "own" his/her trained Akida cockpit and plug it into any other Mercedes they buy in the future, rather than having to retrain from scratch with every new car!
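Purely hypothetical, but mechanically it could be as simple as serialising the learned on-chip state and restoring it in the next car. A sketch with entirely invented names; this is not a real BrainChip or Mercedes API:

```cpp
#include <cstdint>
#include <fstream>
#include <iterator>
#include <string>
#include <vector>

// Invented type: whatever blob captures the cockpit's learned behaviour.
struct CabinProfile {
    std::vector<uint8_t> learned_state;
};

// Export the trained profile when the owner sells the car...
bool export_profile(const CabinProfile& p, const std::string& path) {
    std::ofstream out(path, std::ios::binary);
    out.write(reinterpret_cast<const char*>(p.learned_state.data()),
              static_cast<std::streamsize>(p.learned_state.size()));
    return out.good();
}

// ...and import it into the new one, skipping retraining from scratch.
bool import_profile(CabinProfile& p, const std::string& path) {
    std::ifstream in(path, std::ios::binary);
    p.learned_state.assign(std::istreambuf_iterator<char>(in),
                           std::istreambuf_iterator<char>());
    return !p.learned_state.empty();
}
```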
 
  • Like
  • Love
  • Haha
Reactions: 10 users

Perhaps

Regular
The latest investor presentation from Nviso. Pure dynamite, enjoy:

 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 50 users

cosors

👀

Here is the Renesas booth presentation in full.


Here are more videos, almost 20 hours of exhibition tours and interviews in total. Unfortunately, BrainChip is not featured, but maybe someone is bored or will even discover something interesting. There are exciting things in them, even if I've only had time to skim through so far. Maybe something for the thousand eyes.




 
Last edited:
  • Like
  • Love
Reactions: 10 users