BRN Discussion Ongoing

jtardif999

Regular
Was having a look at that ISO to see how hard it is... they usually are :(

This is from the Synopsys website. The ISO is not an actual legal requirement... yet... but it is a best practice in reality.

Is ISO 26262 Required?
ISO 26262 is not required by law, but many car makers and suppliers follow it to show their commitment to safety and to improve their products' safety. Sometimes customers and regulators may require them to prove they follow the standard. But even where it's not required, it's still considered good practice, and following it can improve the safety of automotive electronic systems and show customers, regulators, and end users the company's commitment to safety.

Gives a pretty detailed rundown on what it is etc.


When searching for the ISO I also saw a Jan 2023 article in Embedded about T2M getting the ISO for certain IP blocks... not us unfortunately haha


Complete ASIL-A, B, C, D and ISO 26262 Certified IP Cores from T2M

By Chad Cox

ASSOCIATE EDITOR
EMBEDDED COMPUTING DESIGN
January 24, 2023

But it did make me think of T2M, as I recalled reading something once that BRN engaged them in the early days for Akida.

I presume they still have a relationship/engagement to market and sell Akida, as the Chip Estimate website still had T2M flagged (though maybe that's outdated?). It is for the IP, so it would have to be from when we started moving to IP, I presume.

Wonder what reach T2M has?




Akida-Scalable Neural Network AI Silicon IP

IP Preview

Name: Akida-Scalable Neural Network AI Silicon IP

Provider: T2M

Description: AKIDA - Neuromorphic Computing

Overview: The Akida Neuromorphic IP is the first neuromorphic IP available in the market. Inspired by the biological function of neurons but engineered on a digital logic process, this event-based spiking ...

Category: IP Catalog : Digital Core IP : Processors : Other

Pretty sure Renesas claimed that the chip they taped out in December - the one with Akida inside - is ISO 26262 compliant. I remember also AM being asked during the Q&A of one of his presentations about ISO compliance and he said (paraphrasing) BrainChip wouldn't have to worry about that... that it would be taken care of in the higher scheme of things - perhaps by a company in a position to have already been seeking compliance on a whole range of items, a company such as Renesas.
 
  • Like
  • Fire
  • Haha
Reactions: 30 users
Pretty sure Renesas claimed that the chip they taped out in December - the one with Akida inside - is ISO 26262 compliant. I remember also AM being asked during the Q&A of one of his presentations about ISO compliance and he said (paraphrasing) BrainChip wouldn't have to worry about that... that it would be taken care of in the higher scheme of things - perhaps by a company in a position to have already been seeking compliance on a whole range of items, a company such as Renesas.
It's in the video I posted a few posts back
"I remember also AM being asked during the Q&A of one of his presentations about ISO compliance and he said (paraphrasing) BrainChip wouldn’t have to worry about that"
 
  • Like
Reactions: 12 users

stuart888

Regular
Excellent 90-second upbeat delivery of a new Edge Impulse feature. Bring Your Own Model! 🍾🍾

 
  • Like
  • Love
  • Fire
Reactions: 22 users

stuart888

Regular
Out-of-the-Box Support for RZ/V2L Evaluation Board Kit

This was just posted as well, detailed, 55 minutes.



Join Renesas and Edge Impulse to hear and discuss more on the following topics.

• Cutting-Edge Innovation with Edge Impulse Studio and RZ/V2L: Edge Impulse Studio is a toolkit that simplifies AI/ML development for embedded devices. You'll be amazed at how easy it is to collect data, train models, test results, and deploy your models to embedded devices such as the powerful Renesas RZ/V2L.

• DRP-AI Translator + Edge Impulse, a Match Made in Heaven: The DRP-AI translator is seamlessly integrated into the Edge Impulse environment, eliminating the need to run this step externally. With Edge Impulse and DRP-AI working together, you'll be able to create amazing projects with ease.

• Out-of-the-Box Support for RZ/V2L Evaluation Board Kit: The RZ/V2L Evaluation Board Kit (EVK) is now supported out-of-the-box by Edge Impulse; take advantage of Renesas' powerful AI accelerator, dual-core CPU, 3D graphics, and video codec engine to bring your embedded projects to life quickly and easily, helping build solutions that lower costs and increase revenue.
 
  • Like
  • Wow
  • Love
Reactions: 13 users
Screenshot_20230405-030640.png
 
  • Fire
  • Like
Reactions: 8 users

Deleted member 118

Guest
Renesas

VIP Stage: Embedding Real-time AI on Tiny Endpoint Devices


 
  • Like
  • Fire
Reactions: 6 users

stuart888

Regular
Intel put up a 9-second quiz. The 3-minute answer is interesting.



A clue!
1680631758200.png


Answer:


Take a sneak peek “Behind This Door” to a critical level of #Intel Oregon’s D1X #semiconductor factory, the sub fab. It’s the underbelly full of tens of thousands of pumps, transformers, power cabinets, scrubbers, treatment systems and more — all to support the 1,200 chipmaking tools in the clean rooms directly above.
 
  • Like
  • Fire
Reactions: 8 users

Deleted member 118

Guest
46F261DB-2AD9-4647-A52F-5F299580B95D.png



 
  • Like
  • Fire
  • Love
Reactions: 22 users
SparseGPT: Massive Language Models Can Be Accurately Pruned in One-Shot
For the bored and fascinated
👇
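For a rough feel of what "pruning" means here, a toy Python sketch. Note this is plain magnitude pruning for intuition only; SparseGPT's actual method in the paper is a much smarter Hessian-based, layer-wise reconstruction that adjusts the surviving weights to compensate:

```python
import numpy as np

def magnitude_prune(W, sparsity=0.5):
    """Zero out the smallest-magnitude fraction of weights.
    Toy illustration only, NOT SparseGPT's algorithm."""
    k = int(W.size * sparsity)  # number of weights to remove
    if k == 0:
        return W.copy()
    # Threshold = k-th smallest absolute weight value
    thresh = np.partition(np.abs(W).ravel(), k - 1)[k - 1]
    pruned = W.copy()
    pruned[np.abs(W) <= thresh] = 0.0
    return pruned

W = np.array([[0.9, -0.1],
              [0.05, -0.8]])
print(magnitude_prune(W, 0.5))
# the two smallest-magnitude entries are now exactly zero
```

The whole point of the paper is that massive models can take this kind of sparsification in one shot, without retraining, if the surviving weights are chosen and updated cleverly.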
 

Attachments

  • 2301.00774.pdf
    723 KB · Views: 249
  • Like
Reactions: 3 users

stuart888

Regular
View attachment 33640


Good find Rocket. This is an excellent overview, starting historically and progressing to now and beyond.

Some might want to move forward in the video, straight to the Event Based focus.

Vision is one of the hottest themes that is going to rocket out of the Spiking Neural Network gates! 🚀🚀🚀

1680637768037.png


FRAME-BASED: A conventional camera takes a fixed number of pictures per second, usually around 30 fps, in which all pixels record in synchrony regardless of what is going on in the scene.

EVENT-BASED: In Prophesee's patented Metavision sensor (discover our Evaluation Kits), there is a new kind of pixel, each powered by its own independent intelligent processing. This allows them to record only when they sense a change or movement. The information created does not arrive frame by frame; rather, movement is captured as a continuous stream of information.

Prophesee sees between the frames, where all traditional frame-based systems are blind.
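For anyone curious how to picture that in code, here's a toy Python sketch (my own illustration, not Prophesee's actual pipeline, where each pixel fires asynchronously in hardware). It turns ordinary frames into an event stream by flagging per-pixel brightness changes:

```python
import numpy as np

def frames_to_events(frames, threshold=0.1):
    """Toy event generator: emit (t, x, y, polarity) whenever a pixel's
    intensity changes by more than `threshold` between consecutive frames.
    A real event sensor does this per pixel, asynchronously, in hardware."""
    events = []
    prev = frames[0].astype(float)
    for t in range(1, len(frames)):
        diff = frames[t].astype(float) - prev
        ys, xs = np.where(np.abs(diff) > threshold)
        for y, x in zip(ys, xs):
            # polarity: +1 for brightening, -1 for dimming
            events.append((t, int(x), int(y), 1 if diff[y, x] > 0 else -1))
        prev = frames[t].astype(float)
    return events

# A static scene produces no events; only the pixel that changes fires.
frames = np.zeros((3, 4, 4))
frames[1, 2, 2] = 1.0   # pixel (2, 2) brightens in frame 1...
frames[2, 2, 2] = 0.0   # ...and dims again in frame 2
print(frames_to_events(frames))  # → [(1, 2, 2, 1), (2, 2, 2, -1)]
```

Two events for the whole clip instead of three full frames - that sparsity is where the power savings come from.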
 
  • Like
  • Love
Reactions: 26 users

IloveLamp

Top 20

When Yeh turned on the car, a voice welcomed him by name — straight-up "Knight Rider" vibes.

“As you can see, the vehicle just recognized me, said my name through a facial-recognition app, put on my favorite color scheme — in this case, blue — it’s got my seat position, and it’s playing my music,” Yeh said, pointing to each of these features inside the car.
.
Cars are becoming much more safe because they're more aware of their surroundings

Nakul Duggal, senior VP and GM of automotive, Qualcomm Technologies

Duggal added that the industry as a whole is headed more toward driver-assistance technology and not so much toward self-driving vehicles.



“A feature that automakers are starting to introduce is monitoring of the driver: Is the driver distracted? Is the driver drowsy? Is he impaired in any way? Can the driver take over and bring the vehicle to a halt carefully?”



Duggal said that the concept car showcases how an automaker could have a more personal relationship with the driver and allow certain transportation and delivery companies to have better management of their vehicles. He added that Qualcomm is proud to represent San Diego as the company enters the auto industry via an existing international network of partners and customers. Not as automakers, to be clear, but as drivers of technology merging more and more into future vehicles.

“What we are doing here is affecting the global automotive ecosystem," Duggal said. "We are now participating, really, across the world."



Previously...........
Capture5.PNG
 
Last edited:
  • Like
  • Thinking
Reactions: 19 users

Deleted member 118

Guest
Good find Rocket. This is an excellent overview, starting historically and progressing to now and beyond.

Some might want to move forward in the video, straight to the Event Based focus.

Vision is one of the hottest themes that is going to rocket out of the Spiking Neural Network gates! 🚀🚀🚀

View attachment 33645

FRAME-BASED: A conventional camera takes a fixed number of pictures per second, usually around 30 fps, in which all pixels record in synchrony regardless of what is going on in the scene.

EVENT-BASED: In Prophesee's patented Metavision sensor (discover our Evaluation Kits), there is a new kind of pixel, each powered by its own independent intelligent processing. This allows them to record only when they sense a change or movement. The information created does not arrive frame by frame; rather, movement is captured as a continuous stream of information.

Prophesee sees between the frames, where all traditional frame-based systems are blind.
Never watched any of it, so glad it’s of interest.
 
  • Haha
  • Fire
Reactions: 4 users

stuart888

Regular
  • Like
Reactions: 10 users

stuart888

Regular

Attachments

  • 1680640246747.png
    87.2 KB · Views: 52
  • Like
Reactions: 4 users


alwaysgreen

Top 20
  • Like
  • Thinking
  • Haha
Reactions: 16 users
Good morning,

I do self invest with QSuper and have just been notified that Self Invest is closing to new investors as of 1 July 2023.

Just thinking that this might have something to do with the superannuation changes to tax concessions, and the tax to be paid on unrealised capital gains that Treasurer Jim Chalmers wants to bring in. Maybe the funds are thinking along the lines that the changes will be too hard to manage with regards to implementing tax paid on unrealised capital gains.

I would be interested to know if anyone else has received any such notification from their super fund about ceasing to accept new members into direct invest within their funds.

Regards
Meetupsoon.
 
  • Like
  • Sad
Reactions: 6 users

GDJR69

Regular
  • Like
  • Love
  • Haha
Reactions: 14 users

alwaysgreen

Top 20
  • Like
  • Haha
  • Love
Reactions: 10 users

stuart888

Regular
Steering the Models!

SparseGPT was posted by @Rise from the ashes this morning. Now a new thing: "Steering the Models"!!!

Makes so much sense. The part I'm referring to starts at 20:29 into the podcast. Fantastic podcast.

Steering the Models must be new!
1680652295782.png





Greylock general partner Reid Hoffman talks with OpenAI CEO Sam Altman about their mutual learnings from the AI tools released to date, primarily GPT-4, OpenAI's language model that co-wrote Hoffman's recent book “Impromptu.” Keeping up with all the AI advancements – and questioning both the pros and cons of their immediate impact – is why Hoffman wrote Impromptu. As he describes, the book is intended to act as a travelog of his experience with GPT-4. Above all, he sees AI as a human amplifier, but urges that it must be developed safely. Safety is exactly what OpenAI is striving for, Altman says, and Hoffman's detailed chronicle of his experience with GPT-4 is exactly the kind of data the organization needs to continue developing the technology with safety in mind.
 
  • Like
  • Love
Reactions: 4 users