TheDrooben
Pretty Pretty Pretty Pretty Good
Me neither
Time | Price | Volume | Value | Condition |
1:05:02 PM | 0.462 | 2 | 0.925 | CXXT |
1:05:02 PM | 0.462 | 100 | 46.250 | CXXT |
1:05:02 PM | 0.460 | 5,000 | 2,300.000 | XT |
1:03:47 PM | 0.462 | 2 | 0.925 | CXXT |
1:03:00 PM | 0.462 | 68 | 31.450 | CXXT |
1:03:00 PM | 0.460 | 3,434 | 1,579.640 | |
1:02:48 PM | 0.462 | 2 | 0.925 | CXXT |
1:02:48 PM | 0.462 | 89 | 41.163 | CXXT |
1:02:48 PM | 0.462 | 3 | 1.388 | CXXT |
1:02:47 PM | 0.462 | 122 | 56.425 | CXXT |
1:01:05 PM | 0.460 | 3,075 | 1,414.500 | |
1:01:05 PM | 0.460 | 3,068 | 1,411.280 | |
12:54:33 PM | 0.462 | 1 | 0.463 | CXXT |
12:54:33 PM | 0.462 | 19 | 8.788 | CXXT |
12:54:33 PM | 0.460 | 954 | 438.840 | |
12:42:17 PM | 0.462 | 2 | 0.925 | CXXT |
12:42:17 PM | 0.462 | 10 | 4.625 | CXXT |
12:42:17 PM | 0.460 | 593 | 272.780 | |
12:41:32 PM | 0.462 | 14 | 6.475 | CXXT |
12:41:32 PM | 0.460 | 662 | 304.520 | |
12:40:33 PM | 0.462 | 1 | 0.463 | CXXT |
12:40:33 PM | 0.462 | 47 | 21.738 | CXXT |
12:39:29 PM | 0.462 | 24 | 11.100 | CXXT |
12:39:29 PM | 0.462 | 1,197 | 553.613 | CXXT |
12:38:32 PM | 0.462 | 1 | 0.463 | CXXT |
12:38:32 PM | 0.462 | 50 | 23.125 | CXXT |
12:36:00 PM | 0.462 | 1 | 0.463 | CXXT |
12:36:00 PM | 0.462 | 50 | 23.125 | CXXT |
12:33:28 PM | 0.462 | 1 | 0.463 | CXXT |
12:33:28 PM | 0.462 | 56 | 25.900 | CXXT |
12:31:27 PM | 0.462 | 2 | 0.925 | CXXT |
12:31:27 PM | 0.462 | 76 | 35.150 | CXXT |
12:30:57 PM | 0.462 | 2 | 0.925 | CXXT |
12:30:56 PM | 0.462 | 89 | 41.163 | CXXT |
12:30:31 PM | 0.462 | 1 | 0.463 | CXXT |
12:30:31 PM | 0.462 | 70 | 32.375 | CXXT |
12:30:31 PM | 0.462 | 3,499 | 1,618.288 | NXXT |
12:24:58 PM | 0.462 | 17 | 7.863 | CXXT |
12:24:58 PM | 0.460 | 849 | 390.540 | |
12:24:42 PM | 0.462 | 11 | 5.088 | CXXT |
12:24:42 PM | 0.462 | 517 | 239.113 | CXXT |
12:23:21 PM | 0.462 | 1 | 0.463 | CXXT |
12:23:21 PM | 0.462 | 54 | 24.975 | CXXT |
12:20:49 PM | 0.462 | 1 | 0.463 | CXXT |
12:20:49 PM | 0.462 | 53 | 24.513 | CXXT |
12:19:18 PM | 0.462 | 1 | 0.463 | CXXT |
12:19:18 PM | 0.462 | 53 | 24.513 | CXXT |
12:17:47 PM | 0.462 | 1 | 0.463 | CXXT |
12:17:47 PM | 0.462 | 46 | 21.275 | CXXT |
12:16:47 PM | 0.462 | 1 | 0.463 | CXXT |
12:16:47 PM | 0.462 | 51 | 23.588 | CXXT |
12:16:25 PM | 0.462 | 18 | 8.325 | CXXT |
12:16:25 PM | 0.460 | 861 | 396.060 | XT |
12:15:54 PM | 0.462 | 1 | 0.463 | CXXT |
12:15:54 PM | 0.462 | 50 | 23.125 | CXXT |
12:15:54 PM | 0.462 | 12 | 5.550 | CXXT |
12:15:26 PM | 0.460 | 587 | 270.020 | |
12:15:26 PM | 0.462 | 7 | 3.238 | CXXT |
12:15:14 PM | 0.462 | 360 | 166.500 | CXXT |
12:15:14 PM | 0.462 | 18,000 | 8,325.000 | CXXT |
12:13:54 PM | 0.462 | 3 | 1.388 | CXXT |
12:13:54 PM | 0.462 | 149 | 68.913 | CXXT |
12:13:54 PM | 0.462 | 25 | 11.563 | CXXT |
12:13:45 PM | 0.462 | 7,412 | 3,428.050 | CXXT |
12:13:45 PM | 0.462 | 1,260 | 582.750 | CXXT |
12:13:45 PM | 0.462 | 1 | 0.463 | CXXT |
12:13:45 PM | 0.462 | 52 | 24.050 | CXXT |
12:12:14 PM | 0.462 | 1 | 0.463 | CXXT |
12:12:14 PM | 0.462 | 55 | 25.438 | CXXT |
12:10:43 PM | 0.462 | 1 | 0.463 | CXXT |
12:10:43 PM | 0.462 | 58 | 26.825 | CXXT |
12:09:59 PM | 0.462 | 2 | 0.925 | CXXT |
12:09:59 PM | 0.462 | 66 | 30.525 | CXXT |
12:09:59 PM | 0.460 | 3,334 | 1,533.640 | |
12:09:12 PM | 0.462 | 2 | 0.925 | CXXT |
12:09:12 PM | 0.462 | 62 | 28.675 | CXXT |
12:07:41 PM | 0.462 | 1 | 0.463 | CXXT |
12:07:41 PM | 0.462 | 63 | 29.138 | CXXT |
12:06:10 PM | 0.462 | 1 | 0.463 | CXXT |
12:06:10 PM | 0.462 | 65 | 30.063 | CXXT |
12:04:38 PM | 0.462 | 2 | 0.925 | CXXT |
12:04:38 PM | 0.462 | 65 | 30.063 | CXXT |
12:02:07 PM | 0.462 | 45 | 20.813 | CXXT |
11:59:05 AM | 0.462 | 1 | 0.463 | CXXT |
11:59:05 AM | 0.462 | 51 | 23.588 | CXXT |
11:58:11 AM | 0.462 | 1 | 0.463 | CXXT |
11:57:16 AM | 0.462 | 8 | 3.700 | CXXT |
11:57:16 AM | 0.462 | 435 | 201.188 | CXXT |
11:57:16 AM | 0.460 | 21,739 | 9,999.940 | |
11:57:03 AM | 0.462 | 1 | 0.463 | CXXT |
11:57:03 AM | 0.462 | 55 | 25.438 | CXXT |
11:56:52 AM | 0.462 | 5 | 2.313 | CXXT |
11:56:52 AM | 0.462 | 217 | 100.363 | CXXT |
11:55:02 AM | 0.462 | 1 | 0.463 | CXXT |
11:55:02 AM | 0.462 | 57 | 26.363 | CXXT |
11:52:30 AM | 0.462 | 1 | 0.463 | CXXT |
11:52:30 AM | 0.462 | 45 | 20.813 | CXXT |
11:49:59 AM | 0.462 | 34 | 15.725 | CXXT |
11:49:59 AM | 0.462 | 48 | 22.200 | CXXT |
11:48:34 AM | 0.462 | 94 | 43.475 | CXXT |
11:48:34 AM | 0.462 | 204 | 94.350 | CXXT |
11:47:27 AM | 0.462 | 129 | 59.663 | CXXT |
11:47:27 AM | 0.462 | 60 | 27.750 | CXXT |
11:46:57 AM | 0.462 | 1 | 0.463 | CXXT |
11:46:57 AM | 0.462 | 89 | 41.163 | CXXT |
11:46:52 AM | 0.462 | 61 | 28.213 | CXXT |
11:46:52 AM | 0.462 | 20 | 9.250 | CXXT |
11:44:27 AM | 0.462 | 73 | 33.763 | CXXT |
11:42:35 AM | 0.462 | 2 | 0.925 | CXXT |
11:42:24 AM | 0.462 | 169 | 78.163 | CXXT |
11:39:13 AM | 0.462 | 86 | 39.775 | CXXT |
11:39:13 AM | 0.462 | 3 | 1.388 | CXXT |
11:35:27 |
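(Course-of-sales note: the Value column appears to be Price × Volume. The small mismatches, e.g. 0.462 × 100 displaying as 46.250, suggest the actual trade price on those lines is 0.4625 with the Price column rounded to three decimals: 0.4625 × 2 = 0.925 and 0.4625 × 100 = 46.250, while the 0.460 trades match exactly, e.g. 0.460 × 3,434 = 1,579.640.)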
That's OK @ndefries, I can show him where it is.
Looks like we're in the right place at the right time.
April 6, 2023
The Automotive Market Pivots Hard to Generative AI and the Metaverse
Rob Enderle
NVIDIA held its GTC conference in March, and much of the content had to do with how the automotive market is rethinking the 300+ new factories it will be building to create the next generation of electric, self-driving cars, trucks and flying vehicles. Generative AI and the metaverse will potentially provide a customization capability that hasn’t been seen since the beginning of the automotive industry when cars were more custom-built than line-built. I expect this advancement to improve customer retention, customer satisfaction, reliability, and performance and to substantially reduce market failures. Other industries will embrace these technologies, as well, for similar reasons.
Let’s talk about how generative AI and the metaverse are already resulting in a massive change in future car and truck factories, and how the related companies will engage more deeply with, and become much closer to, their users.
The Metaverse from Cradle to Grave
Typically, cars are conceived as ideas. These ideas are then winnowed down into a couple of concepts. The concepts are made into clay models and circulated for comment. Prototypes are built and taken to car shows and tested on private then public roads. Focus groups are brought in to see if there is a market for the car. This process can take over five years, generally doesn’t anticipate what the competitive market will be when the car is released, and often lacks enough customer voice along the way. The result is that cars sell poorly once they come out. In addition, during line setup, problems are often discovered late in the process, which delays manufacturing and incurs otherwise avoidable costs to redesign and reconfigure the lines.
The metaverse (mostly NVIDIA Omniverse, which is dominant in automotive) is increasingly being used by a wide variety of car makers not only to design and receive feedback on the new car, but also to virtually design factories and their manufacturing floors, assuring that major parts of the new or existing factory won’t need to be redone and potentially reducing time-to-market significantly.
In addition, the metaverse is being used to design and get feedback on the car, which eliminates the problems associated with configuring the lines because those problems can now be identified in the metaverse. This allows for less costly corrections should they be necessary.
A digital twin of the car is created that allows buyers not only to build the car they want but also to follow the car through the build process and address any questions about the choices they made. This digital twin will remain tied to the car. It will enable the user not only to better discover potential problems but also to fix some of them themselves if they are remote from a dealer. The car company can follow the life of the vehicle in order to understand and address points of failure that might otherwise arise later and damage its relationship with the user.
Generative AI at the Heart of the Customer Relationship
These next-generation cars are slated to have generative AI interfaces that enable the driver to converse with their vehicle in natural language, as opposed to packaged and irritating fixed commands. This conversational interface has already spread through Microsoft’s developer tools and most recently through Windows, Office, and the Edge browser. Even though Google was caught sleeping, this interface should quickly spread across its platforms as well. This means we will increasingly be surrounded by things we can interface with using natural language.
The implication is that, over time, car owners will interface with their car and car company through generative AI and develop more of a collegial relationship with their vehicles and car companies. These AIs will not only help users learn about their new vehicles and how to operate them, but also help them select the right vehicle and configuration before they buy the car. In addition, I expect this will evolve to the point where our first interaction with the generative automotive AI happens during the purchase process, where the car company can apply its most effective upselling techniques while balancing the need to maintain a trusted relationship between the buyer and the company.
Instead of interacting only during the buying process or when the buyer has a very serious problem, future buyers will stay engaged with their car company through the generative AI. I also expect this generative AI experience won’t just be in the car but will extend into the home and business as users demand a more consistent AI experience across an ever-widening field of products, much like we saw with tools like Apple’s Siri and especially Amazon Alexa. But car companies are already pulling the plug on these third-party tools in favor of their own, both to engage their customers more closely and to address the customer retention problem by tying buyers more tightly to the company.
Wrapping up:
Automotive companies appear to be racing ahead of everyone else to apply the metaverse and generative AI to their products and factories. But the benefits of this move, which include speed to market, fewer mistakes, better reliability, better performance, and higher customer satisfaction and loyalty, will spread to other industries. Generative AI efforts will consolidate toward the goal users will likely prefer: a single generative AI interface into all of their smart products. This suggests future market expansions for automotive companies, which will want to partner with or buy into related markets that can benefit from the automotive companies’ leadership as they leverage the user’s need for a consistent AI experience.
In short, while the metaverse and generative AI will hit the automotive market hard at first, once the benefits are validated, they will rapidly spread to other markets and change every aspect of how we design, build, monetize and service products and how these companies create and maintain a deeper relationship with customers.
About the author: As President and Principal Analyst of the Enderle Group, Rob Enderle provides regional and global companies with guidance in how to create credible dialogue with the market, target customer needs, create new business opportunities, anticipate technology changes, select vendors and products, and practice zero dollar marketing. For over 20 years Rob has worked for and with companies like Microsoft, HP, IBM, Dell, Toshiba, Gateway, Sony, USAA, Texas Instruments, AMD, Intel, Credit Suisse First Boston, ROLM, and Siemens.
The Automotive Market Pivots Hard to Generative AI and the Metaverse
NVIDIA held its GTC conference in March, and much of the content had to do with how the automotive market is rethinking the 300+ new factories it will be...
www.datanami.com
no hablo inglés gringo
In English please
no hablo inglés gringo
I see Teksun as competition for Megachips. Looking at their website, they appear to do the same thing. VVDN as well, only being in India they will target that zone, and they also have a large manufacturing capability.
Was looking at the list that @Stable Genius put up not long ago.
Started asking myself questions on the more recent flurry, if you will, of partnership releases.
Like AI Labs, Teksun, emotion3D, VVDN, Intellisense - the last already known of via NASA public info, though the relationship has now been publicly solidified by BRN.
Why now?
Do we consider that some or all of these "partners" were part of the EAP, or are they new from left field?
By defining them as partners rather than clients, and without formal agreements, you could possibly skirt the supposed issue of disclosing projected income as per the ASX BS.
If so, we know how long it took Renesas to get through their programs before now taping out for a 3rd party.
Is it conceivable that these more recent public acknowledgements are a precursor, that these partners have completed or are nearing completion of their DD, and that product confirmations and releases will follow in due course?
Are these the sort of clients (partners) that the BRN tech hires of the past several months have been working with to get to product stage?
If they were going to market, could they just get Megachips to design their Akida-integrated ASIC (if that's the design) and pay Megachips accordingly, which then flows to us?
It appears they could get Renesas to do similar, maybe with MCUs, with $ flowing to us accordingly.
All without individual licences.
Just some random thoughts.
NASA
FORD
VALEO
MERCEDES
SIFIVE
MAGIKEYE
MEGACHIPS
RENESAS
BIOTOME
PROPHESEE
EDGE IMPULSE
ARM
NVISO
GLOBAL FOUNDRIES
VVDN
SOCIONEXT
MOSCHIP
INTEL
NANOSE
AI LABS
SAHOMA CONTROLWARE
EMOTION3D
TEKSUN
INTELLISENSE
@Tothemoon24
Has anyone dived into the company Oculi?
Oculi Forms strategic partnership with GlobalFoundries to advance edge sensing technology
BALTIMORE, MD, UNITED STATES -- February 15, 2023 -- Oculi today announced a strategic partnership with GlobalFoundries Inc. (GF) (Nasdaq: GFS), a global leader in feature-rich semiconductor manufacturing, to manufacture OCULI SPU S12, the world's first single-chip intelligent Software-Defined Vision Sensor. This new chip will be used in smart devices and homes, industrial, IoT, automotive markets and wearables including AR/VR.
The OCULI SPU S12 is the first of 3 product lines that will disrupt and set a new standard in vision technology. It is based on GF’s 55LPx which is a feature-rich platform that supports Radio Frequency (RF), ultra-low power (ULP), embedded non-volatile memory (eNVM) and high voltage BCDLite® options. This makes the platform an ideal solution for System on Chip (SoC) integration to enable more functionality, less energy consumption and smaller form factor electronic applications.
Dr. Charbel Rizk, Oculi founder and CEO commented "Oculi has solved decades-long fundamental challenges by developing a new form of vision sensor that applies selectivity to process only the most relevant information 30X faster, 1/10th the power drain, and protects privacy/data security at the sensor. Oculi technology, which features pre-processing inside the pixel that emulates the human eye, will ultimately power many smart devices. Oculi’s new vision is ideal for edge applications such as always-on gesture/face/people tracking and low-latency eye tracking, while alternative solutions are too slow, big, and power inefficient. GF is an excellent partner to enable us to quickly get our product to our customers."
In collaborating with Oculi, Mike Hogan, chief business officer at GF, recognized the synergy for both companies in this engagement. As Mike stated, "Visual AI at the edge is the growth vector to enhance the connection between the digital and physical world. Oculi's architecture and GF’s production-proven 55LPx feature-rich platform enable device makers to get to market faster while also optimizing data usage and reducing energy consumption.”
Thank you @Fullmoonfever, much appreciated.
@Tothemoon24
Some previous posts from when I went digging a little on them, as from memory someone picked up that Rob maybe mentioned them in a video or something. It was a very brief mention.
Posts not in date order but bits on Oculi fwiw.
BRN Discussion Ongoing
My theory is that the broker handling the stop loss can see the stop loss, and should be obliged to keep that information confidential. Ideally, the information should be kept secret from the ASX, with the broker's trading computer programmed to place the sell order when the price falls to the...
thestockexchange.com.au
BRN Discussion Ongoing
@Tothemoon24 your post is exciting. Oculi's technology is the same technology developed at Johns Hopkins University as described in your post. Brainchip is currently engaged with Oculi. @chapman89's post today shows that Oculi has entered into a strategic agreement with GlobalFoundries (as we...
thestockexchange.com.au
BRN Discussion Ongoing
I often go back to the Renesas tape out Ann to reread its context. Key question for me is the following couple of paragraphs plus the common theme we are now seeing with 22nm, both through Renesas in Dec and by BRN with GF... The way I read the below is that Renesas have had someone come to...
thestockexchange.com.au
BRN Discussion Ongoing
Does anyone know when the quarterly is due? End of this month. But Fed up with our Jurassic banking system - I transferred $2500 to my trading account on Saturday and it still isn’t available. I’m sure it’ll be clear tomorrow once the price has pumped up again 🫣 This is one thing that...
thestockexchange.com.au
BRN Discussion Ongoing
Could be totally wrong but gist I get for their Oculi tech from the patent that @Diogenese also found is driven around the pixel identification and changes and they speak of the vision component primarily that needs to dovetail into an AI processing aspect maybe? Driven around threshold values...
thestockexchange.com.au