BRN Discussion Ongoing

TheDrooben

Pretty Pretty Pretty Pretty Good
Me neither😂
 
  • Haha
Reactions: 7 users



Yak52

Regular
Trading activity, or the LACK of it, today, especially after yesterday afternoon's trading, would seem to suggest that LDA have completed their PUT.

My GUESS is these CXXT trades are possibly shorts closing out slowly.
I also believe that most of the "SHORTS" held are for hedging purposes by the big instos against their long holdings. Some smaller shorters exist too, which gives us that varying daily "shorts" figure. IMO.

From 11:30am until 1:00pm the ASX shows 99% of trades being "Cross Trades" (CXXT), mostly small 1-25 share parcels.
Total ASX volume is 1.2 mil for 3 hours of trading.

If this is the case, then we can expect a BRN announcement of the close of the LDA PUT with the amounts/$ values etc. Maybe this afternoon??

Yak52 :cool:

ASX DATA ONLY (CHI-X NOT INCLUDED)

Time | Price | Qty | Value ($) | Cond
1:05:02 PM | 0.462 | 2 | 0.925 | CXXT
1:05:02 PM | 0.462 | 100 | 46.250 | CXXT
1:05:02 PM | 0.460 | 5,000 | 2,300.000 | XT
1:03:47 PM | 0.462 | 2 | 0.925 | CXXT
1:03:00 PM | 0.462 | 68 | 31.450 | CXXT
1:03:00 PM | 0.460 | 3,434 | 1,579.640 |
1:02:48 PM | 0.462 | 2 | 0.925 | CXXT
1:02:48 PM | 0.462 | 89 | 41.163 | CXXT
1:02:48 PM | 0.462 | 3 | 1.388 | CXXT
1:02:47 PM | 0.462 | 122 | 56.425 | CXXT
1:01:05 PM | 0.460 | 3,075 | 1,414.500 |
1:01:05 PM | 0.460 | 3,068 | 1,411.280 |
12:54:33 PM | 0.462 | 1 | 0.463 | CXXT
12:54:33 PM | 0.462 | 19 | 8.788 | CXXT
12:54:33 PM | 0.460 | 954 | 438.840 |
12:42:17 PM | 0.462 | 2 | 0.925 | CXXT
12:42:17 PM | 0.462 | 10 | 4.625 | CXXT
12:42:17 PM | 0.460 | 593 | 272.780 |
12:41:32 PM | 0.462 | 14 | 6.475 | CXXT
12:41:32 PM | 0.460 | 662 | 304.520 |
12:40:33 PM | 0.462 | 1 | 0.463 | CXXT
12:40:33 PM | 0.462 | 47 | 21.738 | CXXT
12:39:29 PM | 0.462 | 24 | 11.100 | CXXT
12:39:29 PM | 0.462 | 1,197 | 553.613 | CXXT
12:38:32 PM | 0.462 | 1 | 0.463 | CXXT
12:38:32 PM | 0.462 | 50 | 23.125 | CXXT
12:36:00 PM | 0.462 | 1 | 0.463 | CXXT
12:36:00 PM | 0.462 | 50 | 23.125 | CXXT
12:33:28 PM | 0.462 | 1 | 0.463 | CXXT
12:33:28 PM | 0.462 | 56 | 25.900 | CXXT
12:31:27 PM | 0.462 | 2 | 0.925 | CXXT
12:31:27 PM | 0.462 | 76 | 35.150 | CXXT
12:30:57 PM | 0.462 | 2 | 0.925 | CXXT
12:30:56 PM | 0.462 | 89 | 41.163 | CXXT
12:30:31 PM | 0.462 | 1 | 0.463 | CXXT
12:30:31 PM | 0.462 | 70 | 32.375 | CXXT
12:30:31 PM | 0.462 | 3,499 | 1,618.288 | NXXT
12:24:58 PM | 0.462 | 17 | 7.863 | CXXT
12:24:58 PM | 0.460 | 849 | 390.540 |
12:24:42 PM | 0.462 | 11 | 5.088 | CXXT
12:24:42 PM | 0.462 | 517 | 239.113 | CXXT
12:23:21 PM | 0.462 | 1 | 0.463 | CXXT
12:23:21 PM | 0.462 | 54 | 24.975 | CXXT
12:20:49 PM | 0.462 | 1 | 0.463 | CXXT
12:20:49 PM | 0.462 | 53 | 24.513 | CXXT
12:19:18 PM | 0.462 | 1 | 0.463 | CXXT
12:19:18 PM | 0.462 | 53 | 24.513 | CXXT
12:17:47 PM | 0.462 | 1 | 0.463 | CXXT
12:17:47 PM | 0.462 | 46 | 21.275 | CXXT
12:16:47 PM | 0.462 | 1 | 0.463 | CXXT
12:16:47 PM | 0.462 | 51 | 23.588 | CXXT
12:16:25 PM | 0.462 | 18 | 8.325 | CXXT
12:16:25 PM | 0.460 | 861 | 396.060 | XT
12:15:54 PM | 0.462 | 1 | 0.463 | CXXT
12:15:54 PM | 0.462 | 50 | 23.125 | CXXT
12:15:54 PM | 0.462 | 12 | 5.550 | CXXT
12:15:26 PM | 0.460 | 587 | 270.020 |
12:15:26 PM | 0.462 | 7 | 3.238 | CXXT
12:15:14 PM | 0.462 | 360 | 166.500 | CXXT
12:15:14 PM | 0.462 | 18,000 | 8,325.000 | CXXT
12:13:54 PM | 0.462 | 3 | 1.388 | CXXT
12:13:54 PM | 0.462 | 149 | 68.913 | CXXT
12:13:54 PM | 0.462 | 25 | 11.563 | CXXT
12:13:45 PM | 0.462 | 7,412 | 3,428.050 | CXXT
12:13:45 PM | 0.462 | 1,260 | 582.750 | CXXT
12:13:45 PM | 0.462 | 1 | 0.463 | CXXT
12:13:45 PM | 0.462 | 52 | 24.050 | CXXT
12:12:14 PM | 0.462 | 1 | 0.463 | CXXT
12:12:14 PM | 0.462 | 55 | 25.438 | CXXT
12:10:43 PM | 0.462 | 1 | 0.463 | CXXT
12:10:43 PM | 0.462 | 58 | 26.825 | CXXT
12:09:59 PM | 0.462 | 2 | 0.925 | CXXT
12:09:59 PM | 0.462 | 66 | 30.525 | CXXT
12:09:59 PM | 0.460 | 3,334 | 1,533.640 |
12:09:12 PM | 0.462 | 2 | 0.925 | CXXT
12:09:12 PM | 0.462 | 62 | 28.675 | CXXT
12:07:41 PM | 0.462 | 1 | 0.463 | CXXT
12:07:41 PM | 0.462 | 63 | 29.138 | CXXT
12:06:10 PM | 0.462 | 1 | 0.463 | CXXT
12:06:10 PM | 0.462 | 65 | 30.063 | CXXT
12:04:38 PM | 0.462 | 2 | 0.925 | CXXT
12:04:38 PM | 0.462 | 65 | 30.063 | CXXT
12:02:07 PM | 0.462 | 45 | 20.813 | CXXT
11:59:05 AM | 0.462 | 1 | 0.463 | CXXT
11:59:05 AM | 0.462 | 51 | 23.588 | CXXT
11:58:11 AM | 0.462 | 1 | 0.463 | CXXT
11:57:16 AM | 0.462 | 8 | 3.700 | CXXT
11:57:16 AM | 0.462 | 435 | 201.188 | CXXT
11:57:16 AM | 0.460 | 21,739 | 9,999.940 |
11:57:03 AM | 0.462 | 1 | 0.463 | CXXT
11:57:03 AM | 0.462 | 55 | 25.438 | CXXT
11:56:52 AM | 0.462 | 5 | 2.313 | CXXT
11:56:52 AM | 0.462 | 217 | 100.363 | CXXT
11:55:02 AM | 0.462 | 1 | 0.463 | CXXT
11:55:02 AM | 0.462 | 57 | 26.363 | CXXT
11:52:30 AM | 0.462 | 1 | 0.463 | CXXT
11:52:30 AM | 0.462 | 45 | 20.813 | CXXT
11:49:59 AM | 0.462 | 34 | 15.725 | CXXT
11:49:59 AM | 0.462 | 48 | 22.200 | CXXT
11:48:34 AM | 0.462 | 94 | 43.475 | CXXT
11:48:34 AM | 0.462 | 204 | 94.350 | CXXT
11:47:27 AM | 0.462 | 129 | 59.663 | CXXT
11:47:27 AM | 0.462 | 60 | 27.750 | CXXT
11:46:57 AM | 0.462 | 1 | 0.463 | CXXT
11:46:57 AM | 0.462 | 89 | 41.163 | CXXT
11:46:52 AM | 0.462 | 61 | 28.213 | CXXT
11:46:52 AM | 0.462 | 20 | 9.250 | CXXT
11:44:27 AM | 0.462 | 73 | 33.763 | CXXT
11:42:35 AM | 0.462 | 2 | 0.925 | CXXT
11:42:24 AM | 0.462 | 169 | 78.163 | CXXT
11:39:13 AM | 0.462 | 86 | 39.775 | CXXT
11:39:13 AM | 0.462 | 3 | 1.388 | CXXT
11:35:27 | …
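
For anyone who wants to check the cross-trade claim themselves, here's a minimal Python sketch. It assumes rows in the pipe-separated layout of the table above; the `rows` sample is illustrative, not the full print-out.

```python
# Tally how many of the printed trades (and how much of the volume)
# carry a crossing condition code (XT / CXXT / NXXT).

rows = """\
1:05:02 PM | 0.462 | 2 | 0.925 | CXXT
1:05:02 PM | 0.462 | 100 | 46.250 | CXXT
1:05:02 PM | 0.460 | 5,000 | 2,300.000 | XT
1:03:00 PM | 0.460 | 3,434 | 1,579.640 |
""".splitlines()

total_qty = cross_qty = cross_count = 0
for row in rows:
    parts = [p.strip() for p in row.split("|")]
    qty = int(parts[2].replace(",", ""))       # share quantity
    cond = parts[4] if len(parts) > 4 else ""  # condition code, may be blank
    total_qty += qty
    if cond.endswith("XT"):                    # XT, CXXT and NXXT are all crossings
        cross_qty += qty
        cross_count += 1

print(f"{cross_count} of {len(rows)} printed trades are crossings")
print(f"{cross_qty:,} of {total_qty:,} shares ({cross_qty / total_qty:.0%}) crossed")
```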
 
  • Like
  • Love
  • Fire
Reactions: 10 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
This is a serious matter rocket! I'm not sure where the naughty corner is but off you go.
That's OK @ndefries, I can show him where it is.
 
  • Haha
  • Like
  • Wow
Reactions: 12 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Looks like we're in the right place at the right time.

April 6, 2023

The Automotive Market Pivots Hard to Generative AI and the Metaverse​

Rob Enderle

(Chesky/Shutterstock)
NVIDIA held its GTC conference in March, and much of the content had to do with how the automotive market is rethinking the 300+ new factories it will be building to create the next generation of electric, self-driving cars, trucks and flying vehicles. Generative AI and the metaverse will potentially provide a customization capability that hasn’t been seen since the beginning of the automotive industry when cars were more custom-built than line-built. I expect this advancement to improve customer retention, customer satisfaction, reliability, and performance and to substantially reduce market failures. Other industries will embrace these technologies, as well, for similar reasons.
Let’s talk about how generative AI and the metaverse are already resulting in a massive change in future car and truck factories, and how the related companies will engage more deeply with, and become much closer to, their users.

The Metaverse from Cradle to Grave​

Typically, cars are conceived as ideas. These ideas are then winnowed down into a couple of concepts. The concepts are made into clay models and circulated for comment. Prototypes are built and taken to car shows and tested on private then public roads. Focus groups are brought in to see if there is a market for the car. This process can take over five years, generally doesn’t anticipate what the competitive market will be when the car is released, and often lacks enough customer voice throughout the process. The result is that cars sell poorly once they come out. In addition, during line setup, problems are often discovered late in the process which delays manufacturing and incurs otherwise avoidable costs to redesign and reconfigure the lines.
The metaverse (mostly NVIDIA Omniverse which is dominant in automotive) is increasingly being used by a wide variety of car makers to not only design and receive feedback on the new car, but to virtually design factories and their manufacturing floors to assure that major parts of the new or existing factory won’t need to be redone and potentially reduce time-to-market significantly.
In addition, the metaverse is being used to design and get feedback on the car, which eliminates the problems associated with configuring the lines because those problems can now be identified in the metaverse. This allows for less costly corrections should they be necessary.
A digital twin of the car is created that allows the buyer to not only build the car they want but allows them to follow the car through the build process and address any questions about the choices they made. This digital twin will remain tied to the car. It will help the user fix some things and enable the user to not only better discover potential problems but help them fix them if they are remote from a dealer. The car company can follow the life of the vehicle in order to understand and address points of failure that might otherwise arise later and damage its relationship with the user.

Generative AI at the Heart of the Customer Relationship​

These next generation cars are slated to have generative AI interfaces that enable the driver to converse with their vehicle in natural language as opposed to packaged and irritating fixed commands. This conversational interface has already spread through Microsoft’s developer tools and most recently through Windows, Office, and the Edge browser. Even though Google was caught sleeping, this interface should quickly spread across its platforms, as well. This means we’ll be surrounded increasingly by things that we can use natural language to interface with.


(Andrey Suslov/Shutterstock)
The implication is that, over time, car owners will interface with their car and car company through generative AI and develop more of a collegial relationship with their vehicles and car companies. These AIs will not only help users learn about their new vehicles and how to operate them, but help them select the right vehicle and configuration before they buy the car. In addition, I expect this will evolve to a point where we will first interface with our generative automotive AI during the purchase process and be able to apply the most successful upselling capabilities while balancing the need to maintain a trusted relationship between the buyer and the related car company.
Instead of interacting only when the buyer has a very serious problem or during the buying process, future buyers will stay engaged through the generative AI with their car company. I also expect this generative AI experience won’t just be in the car but will extend into the home and business as users demand a more consistent AI experience across an ever-widening field of products much like we saw with tools like Apple’s Siri and especially Amazon Alexa. But car companies are already pulling the plug on these third-party tools in favor of their own to embrace their customers and tie them more tightly to the company by addressing the customer retention problem.
Wrapping up:
Automotive companies appear to be racing ahead of everyone else to apply metaverse and generative AI to their products and factories. But the benefits of this move, which include speed to market, fewer mistakes, better reliability, better performance and higher customer satisfaction and customer loyalty, will spread to other industries. Generative AI efforts will consolidate to approach the goal that users will likely prefer of having only one generative AI interface into all their smart products. This suggests future market expansions for automotive companies that will want to partner with or buy into related markets that will benefit from the automotive companies’ leadership as they leverage the user’s need for a consistent AI experience.
In short, while the metaverse and generative AI will hit the automotive market hard at first, once the benefits are validated, they will rapidly spread to other markets and change every aspect of how we design, build, monetize and service products and how these companies create and maintain a deeper relationship with customers.
About the author: As President and Principal Analyst of the Enderle Group, Rob Enderle provides regional and global companies with guidance in how to create credible dialogue with the market, target customer needs, create new business opportunities, anticipate technology changes, select vendors and products, and practice zero dollar marketing. For over 20 years Rob has worked for and with companies like Microsoft, HP, IBM, Dell, Toshiba, Gateway, Sony, USAA, Texas Instruments, AMD, Intel, Credit Suisse First Boston, ROLM, and Siemens.

 
  • Like
  • Fire
Reactions: 13 users

TECH

Regular
Good morning all,

Is it just my imagination, or is the volume starting to dry up?

And for the individuals who keep insisting that I said our Founder wasn't attending this year's AGM: imagining or
assuming something about what I have posted just makes an ass out of u.

No posts have been edited or deleted. I am certainly not arrogant enough to suggest or say something on this forum unless it
was truly factual (opinions aside). I value the relationships that I have here, so please stop it.

Keep up with the great news feeds, I certainly appreciate it when I have the time to read it all 😟 Tech x 👍
Bravo said: "Looks like we're in the right place at the right time." (Enderle article quoted in full above.)

"Nice post, what most can't see in that photo is the forest for the trees!" Meaning what exactly, Tech?

Look carefully, there's a hidden message in the photo, I can clearly see a baby "AKIDA" in his hands. :ROFLMAO::ROFLMAO::ROFLMAO::ROFLMAO::ROFLMAO::ROFLMAO::ROFLMAO::rolleyes:
 
  • Like
Reactions: 17 users


Boab

I wish I could paint like Vincent
  • Haha
Reactions: 12 users
Was looking at the list that @Stable Genius put up not long ago.

Started asking myself questions on the more recent flurry, if you will, of partnership releases.

Like AI Labs, Teksun, emotion3D, VVDN and Intellisense (the last already known of via NASA public info, but the relationship has now been publicly solidified by BRN).

Why now?

Do we consider that some / all of these "partners" were part of the EAP or just left field new?

By defining them as partners and not clients or having formal agreements you could possibly skirt the supposed issue of disclosing projected income as per the ASX BS.

If so, we know how long it took Renesas to get through their programs before now taping out for a 3rd party.

Is it conceivable these more recent public acknowledgements are a precursor that these partners are nearing or have completed their DD and are nearing product confirmations, releases in due course?

Are these the sort of clients (partners) that the BRN tech hires of the past several months have been working with to get to product stage?

If they were going to market, are they able to just get Megachips to design their Akida-integrated ASIC (if that's the design) and pay Megachips accordingly, which then flows to us?

They could presumably get Renesas to do similar with MCUs, with $ flowing to us accordingly.

All without individual licences.

Just some random thoughts.

NASA
FORD
VALEO
MERCEDES
SIFIVE
MAGIKEYE
MEGACHIPS
RENESAS
BIOTOME
PROPHESEE
EDGE IMPULSE
ARM
NVISO
GLOBAL FOUNDRIES
VVDN
SOCIONEXT
MOSCHIP
INTEL
NANOSE
AI LABS
SAHOMA CONTROLWARE
EMOTION3D
TEKSUN
INTELLISENSE
 
  • Like
  • Fire
  • Love
Reactions: 47 users

Kachoo

Regular

Shorts dropping. Not sure how much weight I put in the daily reports being accurate. But with this low volume there aren't too many chips available on the market to cover this shortfall.
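
The arithmetic behind that point is just days-to-cover: reported short interest divided by daily volume. A rough sketch below; the figures are hypothetical placeholders, NOT actual BRN numbers.

```python
# Rough "days to cover" arithmetic. Substitute the actual reported
# short interest and recent ASX volume; these values are placeholders.

shares_short = 120_000_000    # hypothetical reported short interest
avg_daily_volume = 1_200_000  # hypothetical daily volume (the ~1.2M figure quoted earlier)

days_to_cover = shares_short / avg_daily_volume
print(f"Days to cover at current volume: {days_to_cover:.0f}")
# At volumes this thin, covering a large short position takes a long
# time without pushing the price up -- which is the point above.
```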
 
  • Like
  • Fire
Reactions: 13 users
Quoted from above: "Was looking at the list that @Stable Genius put up not long ago..." (full partner-list post repeated verbatim; see above)
I see Teksun as competition for Megachips; looking at their website they appear to do the same thing. VVDN as well, though being in India they will target that zone, and they also have a large manufacturing capability.

Megachips as we know just invested $20 or 30 mil to set up in USA. Fingers crossed they can get traction. Having Nintendo in their back pocket is very promising.

Those 3 are who I think smaller companies will go to for niche devices.

The larger OEMs will be using ARM, SiFive and Intel.

Unless you’re Valeo/Mercedes/Renesas who have their own engineers to keep it in house.

Edge impulse is to train or do the software side of it.

Emotion3D are competition for Nviso.

Intellisense and Blue Ridge are defence, along with NASA and probably Bascom Hunter, who I'm guessing are going to use SiFive.

Nanose, Know Labs and Biotome are medical, again with NASA and likely TATA.

TATA not on the list yet but they will be covering everything!

That’s just off the top of my head!

Future looks rock solid to me!

Edit: just saw Nviso changed to Nvidia so I’m fixing that up :)
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 58 users

Tothemoon24

Top 20

Has anyone dived into the company Oculi ?

Oculi Forms strategic partnership with GlobalFoundries to advance edge sensing technology​

BALTIMORE, MD, UNITED STATES -- February 15, 2023 -- Oculi today announced a strategic partnership with GlobalFoundries Inc. (GF) (Nasdaq: GFS), a global leader in feature-rich semiconductor manufacturing, to manufacture OCULI SPU S12, the world's first single-chip intelligent Software-Defined Vision Sensor. This new chip will be used in smart devices and homes, industrial, IoT, automotive markets and wearables including AR/VR.
The OCULI SPU S12 is the first of 3 product lines that will disrupt and set a new standard in vision technology. It is based on GF’s 55LPx which is a feature-rich platform that supports Radio Frequency (RF), ultra-low power (ULP), embedded non-volatile memory (eNVM) and high voltage BCDLite® options. This makes the platform an ideal solution for System on Chip (SoC) integration to enable more functionality, less energy consumption and smaller form factor electronic applications.
Dr. Charbel Rizk, Oculi founder and CEO commented "Oculi has solved decades-long fundamental challenges by developing a new form of vision sensor that applies selectivity to process only the most relevant information 30X faster, 1/10th the power drain, and protects privacy/data security at the sensor. Oculi technology, which features pre-processing inside the pixel that emulates the human eye, will ultimately power many smart devices. Oculi’s new vision is ideal for edge applications such as always-on gesture/face/people tracking and low-latency eye tracking, while alternative solutions are too slow, big, and power inefficient. GF is an excellent partner to enable us to quickly get our product to our customers."
In collaborating with Oculi, Mike Hogan, chief business officer at GF, recognized the synergy for both companies in this engagement. As Mike stated, "Visual AI at the edge is the growth vector to enhance the connection between the digital and physical world. Oculi's architecture and GF’s production-proven 55LPx feature-rich platform enables device makers to get to market faster while also by optimizing data usage and reducing energy consumption.”
 
  • Like
  • Fire
  • Thinking
Reactions: 9 users

AARONASX

Holding onto what I've got
It was previously mentioned that Tony stated, along the lines, that if a customer was using Akida they would be keeping it quiet. IMO there must be a lot of NDAs ready and waiting for release... IMO they're all waiting for each other to announce first; in the meantime they're grinding away, honing their products... Once the first one comes out I'm highly bullish there'll be a run on the others, all rushing to be 2nd, 3rd, 4th... no one wants to jump out too early 😀
 
  • Like
  • Thinking
  • Fire
Reactions: 14 users

Tothemoon24 said: "Has anyone dived into the company Oculi?" (GlobalFoundries press release quoted in full above.)
@Tothemoon24

Some prev posts from when I went digging a little on them. From memory someone picked up that maybe Rob mentioned them in a video or something; it was a very brief mention.

Posts not in date order but bits on Oculi fwiw.








 
  • Like
  • Fire
  • Haha
Reactions: 13 users

Tothemoon24

Top 20
Quoted from above: Fullmoonfever's reply ("Some prev posts from when I went digging a little on them...").
Thank you @Fullmoonfever much appreciated
 
  • Like
Reactions: 5 users

davidfitz

Regular
We must be getting close to an announcement?

Next-gen Mercedes-Benz E-Class global debut on April 25​

The next-gen E-Class will put more emphasis on tech; it could get on-board AI.

Mercedes-Benz E-class: AI-equipped MBUX interface

The latest version of MBUX also includes artificial intelligence, which learns the driver’s routine and pre-empts their needs – for example, automatically warming the seat when the temperature drops, or winding down the window at the entrance to a frequented car park. The new E-Class also receives a digital instrument display and an optional head-up display, said to have a wider field of view than previous iterations.
 
  • Like
  • Fire
Reactions: 35 users