BRN Discussion Ongoing

Diogenese

Top 20
NVIDIA has announced that the DRIVE Thor SoC will be based on the Blackwell GPU architecture.



View attachment 59364


And here's something I think is VERY INTERESTING! Thor will be paired with a "YET TO BE NAMED GPU"!!! 👀

View attachment 59366



EXTRACT

View attachment 59365


How many gazillions is 2,000 trillion?
 
  • Haha
  • Fire
  • Like
Reactions: 5 users
Part 2

Now, if you’ve learned anything about Jensen so far—the man who has no problem betting the farm for a village—you know another big company gamble is coming along. 🎲
Here’s a quote by Jensen. It’s somewhat random to insert here, but I think it paints a great picture of who Nvidia’s captain is.
My will to survive exceeds almost everybody else’s will to kill me
— Jensen
What a legend. 🫡
Now let’s see, pun absolutely intended, how the chips landed with his 2012 bet.


How They Grow: Powering The Next Stage Of The Internet

Over a decade ago, in a more suave setting than a bullet-ridden Denny’s Diner, the Nvidia founders reviewed the landscape yet again and spotted the next big wave: The data center.
Today, it’s Nvidia’s biggest driver of growth, and coupled with their Omniverse and AI platforms, it plays a crucial role in the building of the 3D Internet (what Nvidia calls the metaverse) and AI development.

The Nvidia Data Center: One platform, unlimited acceleration.​

So far, we’ve covered how there were two classes of programmable (AKA, instruction-following) processors:
  • The CPU: Intel’s main competency, with transistors focused on linear computing. For years, it was the sole programmable element of a computer.
  • The GPU: Nvidia’s expertise, with transistors focused on parallel computing. Initially used for rendering 3D graphics, GPUs’ processing efficiencies make them ideal for things like crypto and AI. (A minimal sketch of the sequential-vs-parallel difference follows this list.)
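To make the linear-vs-parallel distinction concrete, here is a minimal NumPy sketch (my own illustration, not from the newsletter): the explicit loop mimics one-element-at-a-time, CPU-style processing, while the vectorized call expresses the same work as a single data-parallel operation, the style of workload GPUs are built to chew through.

```python
import time
import numpy as np

x = np.random.rand(1_000_000).astype(np.float32)

# "CPU-style" sequential processing: one element at a time.
start = time.perf_counter()
out_loop = np.empty_like(x)
for i in range(x.size):
    out_loop[i] = x[i] * 2.0 + 1.0
loop_s = time.perf_counter() - start

# Data-parallel processing: the whole array in one vectorized operation,
# the kind of workload GPUs (and SIMD hardware generally) excel at.
start = time.perf_counter()
out_vec = x * 2.0 + 1.0
vec_s = time.perf_counter() - start

assert np.allclose(out_loop, out_vec)
print(f"element-by-element loop: {loop_s:.3f}s, vectorized: {vec_s:.5f}s")
```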
The problem—as Jensen saw coming—with both of them is that they’ve reached their physical upper limit of computational capabilities. He also knew that operating at the limit wasn’t going to be good enough.
For instance, back in 2012, Nvidia sparked the era of modern AI by powering the breakthrough AlexNet neural network. They knew AI was coming long before it was hot, and they knew the type of horsepower the engine for it would demand.
So, seeing that (1) the limit was nearing, and (2) there would be demand for computational power far beyond what single PCs or even servers could provide, he leaned into Nvidia’s expertise to invest heavily in a new, third, class of programmable processors:
  • The DPU (Data Processing Unit). To avoid the technical mumbo-jumbo here, just know that these chips are ultra-efficient in data functions and are the building blocks of hyper-scale data centers. The DPU is at the center court of the computational arena today, with insane demand since it powers AI.
As Elon Musk said to the WSJ…they are considerably harder to get right now than drugs.
And I trust him. 😉
https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https://substack-post-media.s3.amazonaws.com/public/images/ea781045-7798-4192-8d29-abf115184e15_828x552.png
Elon sending it.
Essentially, Nvidia realized the future of accelerated computing was going to be a full-stack challenge, demanding a deep understanding of the problem domain, optimizing across every layer of computing, and all three chips—GPU, CPU, and DPU.
But as we’ve seen multiple times in this newsletter—selling products alone doesn’t get you to the 4-comma club. Although, selling a chip with unparalleled demand at $30K a pop won’t hurt you.
No, only building an integrated platform will get you there.
Which is exactly what they’ve done. In partnership with AWS, they’re building out the world's most scalable, on-demand cloud AI infrastructure platform optimized for training increasingly complex large language models (LLMs) and developing generative AI applications.
But is Nvidia’s massive data center strategy their only platform play besides CUDA? 🤔

An homage to HBO’s Silicon Valley and the iconic data center
Nope. So let’s talk about the other two:
  • Nvidia’s AI Platform-as-a-Service
  • Nvidia’s Omniverse Platform

Chips & Systems: AIaaS, a classic Pick-and-Shovel play

The Pick-and-Shovel strategy is a nod to the California gold rush in the 1800s, where there were two types of people: Those who looked for gold, and those who sold the tools (picks and shovels) to mine it.
The latter weren’t in the business of caring whether anyone actually found gold, since selling the tools was a sure-fire way of making money without taking on the huge uncertainty of prospecting itself.
In the world of AI, Nvidia is in the same supply chain business. And not just by providing the hardware and data centers (Chips) to support all the AI players, but by selling builders access to their computation services and software to actually build and train their AI models (Systems).
It’s called the Nvidia DGX Cloud, and it’s an AI supercomputer accessible from your browser. 👀
The DGX Cloud service already includes access to Nvidia AI software, AI frameworks, and pre-trained models. Through it, Nvidia is also providing tools for building, refining, and operating custom large language and generative AI models.
At a high level, their full-stack AI ecosystem looks like this (a rough usage sketch follows the list):
  • AI Supercomputer: An all-in-one AI training service that gives enterprises immediate access to their own cloud-based supercomputers
  • AI Platform Software: The software layer of the Nvidia AI platform, Nvidia AI Enterprise powers the end-to-end workflow of AI. Simply put, it streamlines the development and deployment of production AI.
  • AI Models and Services: Nvidia AI Foundations are cloud services within the supercomputer for customizing and operating text, visual media, and biology-based generative AI models. Think of them like Shopify plugins that make the Shopify platform more flexible and useful.
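None of that needs Nvidia’s own SDKs to picture. As a rough sketch of the workflow a DGX Cloud-style service targets, pull a pre-trained model and run it on whatever accelerator you have, here is the open-source Hugging Face transformers library standing in for Nvidia’s hosted stack (the library and the gpt2 model are my substitutions for illustration, not Nvidia’s actual API):

```python
# pip install transformers torch
# Generic stand-in for "pre-trained model as a service": load a small open
# model and run inference on a GPU if one is available, otherwise the CPU.
import torch
from transformers import pipeline

device = 0 if torch.cuda.is_available() else -1  # GPU index, or -1 for CPU
generator = pipeline("text-generation", model="gpt2", device=device)

prompt = "Accelerated computing is"
print(generator(prompt, max_new_tokens=30)[0]["generated_text"])
```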
Nvidia’s goal, simply, is to solve as many problems as deeply as possible in the journey of creating AI products. They’re not worried about who the next chatbot winner is, because whoever it is will be built off Nvidia’s engine.
Very clearly, they are at the outset of building an AI-as-a-service (AIaaS) business component, and it is having a fundamentally transformative impact on the business, and more significantly, the entire AI industry.
As The Motley Fool wrote, this early leadership could turn into a lasting competitive advantage, creating a defensible flywheel for:
Nvidia's leadership position in advanced processing hardware already looks quite strong and hard to disrupt, but the company may be in the early stages of building another powerful competitive advantage. While algorithms and processing hardware play very important roles in constructing and running AI systems, data can be thought of as the other key component. As AI models are trained with a greater array of relevant valuable data, they tend to become more effective and capable.
By establishing itself as an early leader in AIaaS offerings, Nvidia is positioned to generate huge swaths of valuable data that help to inform and improve its own artificial intelligence capabilities. Through these improvements, the company should be able to deliver better services for its customers.
In turn, this will once again generate more valuable data for the company, setting up a network effect and virtuous cycle that could turn early leadership in AI services into a long-term competitive edge that competitors find very difficult to disrupt.
Looking at Nvidia’s success and rigor in being strategically nimble, I’m betting this is exactly what will happen.
Now, combining their AIaaS with Nvidia Omniverse (below), we see that Nvidia has created a two-piece platform-as-a-service play.


What you can do with this ⚒️🧠
  • Anticipate industry shifts: Nvidia foresaw the limitations of CPUs and GPUs and invested in the DPU before the demand skyrocketed. Always be forward-thinking and anticipate where your industry is headed.
  • Diversify your offerings: Nvidia didn't just rely on selling chips; they built an integrated platform. Diversifying your product or service offerings can open up new revenue streams and make your business more resilient.
  • Leverage partnerships: Nvidia's partnership with AWS allowed them to build a scalable AI infrastructure platform. Find the right industry leaders or complementary businesses to amplify your reach and capabilities.
  • Platforms, not just products: Platform plays, as we’ve seen time and again, can lead to exponential growth. Platforms create ecosystems where third-party developers or businesses add value, leading to huge network effects.
  • Solve multiple Jobs-to-be-done: Nvidia's full-stack AI ecosystem is designed to simplify the AI development process for developers. Always think about how you can remove friction and make it easier for your customers to achieve their goals.
  • Build for the long-term: Nvidia's investments in AIaaS and their platform strategy are long-term plays that could give them a lasting competitive advantage. A great example of how to apply long-term thinking is here: Ants & Aliens: Long-term product vision & strategy
  • Stay nimble: Despite being massive, Nvidia has shown agility in their strategic decisions and ability to pivot into new areas. Regardless of size, be mindful of the traps that lead companies into the innovator’s dilemma.




Nvidia Omniverse: The platform for the useful metaverse

If Nvidia AI is their answer to every artificial intelligence question, then Omniverse is Nvidia’s answer to every metaverse question—the 3D evolution of the internet.
It’s a platform (also a chips-and-systems play) for virtual world-building and simulations, focused on enterprise customers. It’s based on Universal Scene Description (USD) technology, which, I know, means nothing to either of us. All you need to understand is that it’s the solution to everything 3D-related. Originally invented by Pixar, USD is well positioned to become the open standard that enables 3D development.
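For the curious, USD is scriptable from Python today. A minimal sketch using Pixar’s open-source usd-core package (this is just the standard USD “hello world”, nothing Omniverse-specific) builds a tiny scene and saves it as a plain-text .usda file:

```python
# pip install usd-core
# Minimal USD scene: an Xform containing a sphere, written out as .usda text.
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateNew("hello_world.usda")
UsdGeom.Xform.Define(stage, "/hello")
sphere = UsdGeom.Sphere.Define(stage, "/hello/world")
sphere.GetRadiusAttr().Set(2.0)               # set an attribute on the prim
stage.GetRootLayer().Save()

print(stage.GetRootLayer().ExportToString())  # dump the scene description
```

The point is simply that 3D content becomes ordinary, composable data that tools (and AIs) can read and write.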
Reading up on their Omniverse platform, it’s clear they have 3 main objectives here:
  1. Make Omniverse the go-to platform for Augmented Reality and Virtual Reality (AR/VR) development
  2. Sign up major partners to use Omniverse
  3. Create a strong and defensible ecosystem around Omniverse
To get there and turn those goals into an actionable strategy, they’re implementing three tactics:
  1. In order to become the prevalent system behind every 3D project, they’re focusing on offering the best simulation tech to the biggest companies and supporting every business that builds applications on top of Omniverse.
  2. In order to make 3D content creation and sharing as easy as possible, they’re focusing on individual and team collaboration tools for real-time interaction with partners and clients, world-building tools, a library of prebuilt objects and environments, compatibility with other sources of graphics and formats, and the leading game engines, Unreal and Unity.
  3. They’re securing partnerships with major cloud service providers (Azure, AWS, Google Cloud) to adopt Nvidia’s tech and architecture, essentially forcing the competition to use their technology and, in turn, all but eliminating it.
The customer here, to be clear, is not game developers. Nvidia is not thinking about the metaverse in the gaming or social context. Although, their hardware is certainly involved in powering those. 👏
Rather, they’re betting on the more practical and realistic metaverse by looking at players like:
  • Artists of 3D content
  • Developers of AIs trained in virtual worlds
  • Enterprises that require highly detailed simulations
AKA, think robotics, the auto industry, climate simulations, space industry design and testing, virtual training, etc. In short, their Omniverse X AI platforms are turbocharging science.
One epic example of this is how Nvidia is creating a digital twin of the planet—named Earth-2—to help predict climate change decades in advance. It’s well worth the 1 minute 29s watch of Jensen explaining how their supercomputer is helping us save the world. 🌎

Super cool.
Like I said, Nvidia is at the forefront of some of the most consequential initiatives and is probably the best example we’ve seen of how engineering innovations really do move the world forward.

What you can do with this ⚒️🧠
When your category evolves, double down on your capabilities. Just because your previous strategic advantage can’t be maintained, it doesn’t mean it’s worthless.
The benefit of being a category queen is that:
  • You identify your category’s evolution early
  • You get to convert your advantage to serve you in the emerging new category
You have two enemies when your category evolves:
  • Complacency: falsely believing that you can sustain your current competitive advantage (see The Innovator’s Dilemma)
  • Overreaching: abandoning your advantage and trying to build new capabilities from zero

Okay, so just to regroup real quick, we’ve seen 3 key platforms (all relying on their hardware) that Nvidia uses to drive their growth.
  • Their CUDA platform gives them exposure to various general domains (gaming, automotive)
  • Their Data Centers + AI Platform gives them exposure to the rapidly growing AI sector
  • Their Omniverse Platform gives them exposure to the professional metaverse
As they called out in a recent investor presentation, that’s a big fucking TAM.

But as we’ve seen already with winning platforms like Stripe (read deep dive), Shopify (read deep dive), and Epic Games (read deep dive)—the best of them don’t just capture markets super effectively, but, they actively work to make their markets bigger.
Shocker, but with a market cap of 7X those three companies combined, Nvidia is no different. 👇

Seeding their own multi-market ecosystem: A lesson on expanding your TAM

Nvidia grows their own market in three powerful ways:
  • By incubating AI startups
  • By investing in companies integrated with the Nvidia ecosystem
  • By offering formalized training and expertise development for their chips and systems.
Spearheaded by our main character, Jensen, these initiatives are prime examples of his commitment to fostering innovation, providing scale to their business, and positioning Nvidia as the leader in the AI, data science, and high-performance computing (HPC) arenas.
Let’s run through them real quick.

1. Nvidia’s Inception AI Startup Program​

Inception is a free accelerator program designed to help startups build, grow, and scale faster through cutting-edge technology, opportunities to connect with venture capitalists and other builders, and access to the latest technical resources and expertise from Nvidia.
All in, over 15K startups have been incubated through this program. And what’s interesting, given you’ve probably never heard of it, is that those startups have raised funding roughly on par with the market cap of TechStars’s portfolio companies (~$100B).
This is a powerful investment strategy that remains largely under-appreciated, as it drives adoption of their own hardware and software (locking in the next wave of startups to Nvidia), as well as giving Nvidia exposure to startups now more likely to flourish and push the boundaries of AI, data science, and HPC.
And then, as part of Nvidia’s “seed the ecosystem” platform, they recently launched an arm focused on driving bigger investments into these startups.
As you can tell, Jensen loves a platform.

2. Nvidia’s Venture Capital Alliance Program​

Here is what Nvidia said about the program when they launched it in 2021:
To better connect venture capitalists with NVIDIA and promising AI startups, we’ve introduced the NVIDIA Inception VC Alliance. This initiative, which VCs can apply to now, aims to fast-track the growth for thousands of AI startups around the globe by serving as a critical nexus between the two communities.
AI adoption is growing across industries and startup funding has, of course, been booming. Investment in AI companies increased 52 percent last year to $52.1 billion, according to PitchBook.
A thriving AI ecosystem depends on both VCs and startups. The alliance aims to help investment firms identify and support leading AI startups early as part of their effort to realize meaningful returns down the line."
The NVIDIA Inception VC Alliance is part of the NVIDIA Inception program, an acceleration platform for over 7,500 startups (now 15,000 in 2023) working in AI, data science and HPC, representing every major industry and located in more than 90 countries.
Among its benefits, the alliance offers VCs exclusive access to high-profile events, visibility into top startups actively raising funds, and access to growth resources for portfolio companies. VC alliance members can further nurture their portfolios by having their startups join NVIDIA Inception, which offers go-to-market support, infrastructure discounts and credits, AI training through NVIDIA’s Deep Learning Institute, and technology assistance.
Again, this furthers the same advantage Nvidia gets from their accelerator—it helps drive the scale of their customers. As they grow, so does Nvidia’s TAM.

3. Nvidia’s Deep Learning Institute​

I love it when companies layer in education and training plays. And for a company like Nvidia, which needs more than a thoughtful onboarding experience to get a new customer set up, having meaningful learning material and access to expert resources is crucial.
The creation of their Deep Learning Institute (DLI) has proven to be a super valuable initiative with lots of benefits for Nvidia’s business and the wider community. Most significantly, it’s become a key driver in advancing knowledge and expertise in AI, accelerated computing, data science, graphics, and simulation.
Yet again, ensuring Nvidia’s flag is right there at the forefront of breakthrough thinking and innovation across most high-growth, deep-tech sectors.
Through these three hidden gems, Nvidia demonstrates that they have been brilliant at creating scale for their business through partnering, investing, co-creating, and teaching their customers and partners their technologies to solve real-world problems.

What you can do with this ⚒️🧠
It’s rare to be in a position where you can grow your TAM through investing in and incubating startups. Your business needs to be substantially large to do that. But here are a few other takeaways Nvidia brings us that are more realistic for most of us reading:
  • Think beyond direct sales: Nvidia's approach demonstrates that market leadership isn't just about selling products. It's about creating an ecosystem where your products become indispensable. The sweet spot: you’re able to create an environment where your product is deeply integrated into the fabric of an industry.
  • Innovate across multiple fronts: Nvidia isn't just innovating in terms of products. They're innovating in terms of how they engage with startups, how they connect with VCs, and how they educate the market. Always be thinking about how you can innovate across multiple fronts to drive growth and market leadership.
  • Layer in education: Offering educational resources or training can help customers get more value from your products and can position your company as a thought leader. Consider how you can educate your customers, whether it's through online courses, webinars, or other resources.
  • Create alliances with VCs: If you're in a high-growth industry, consider how you can facilitate connections between investors and startups that use or complement your products.
  • A rising tide lifts all boats: Nvidia actively works to make their market bigger by supporting their customers' growth. Always be thinking about how you can expand your TAM, whether it's by entering new markets, creating new use cases for your products, bringing non-customers into the market, or driving innovation in your industry that attracts investment.


 
  • Like
  • Fire
  • Love
Reactions: 23 users
Part 3

What could be next for Nvidia? Neuromorphic Computing? 🧠



Okay, so to understand neuromorphic computing (NC), all you need to know is this formula:


Juust kidding.
In short, the answer is much simpler: The future of chips is making physical computers think more like human brains, where chips use the same physics of computation as our own nervous system.
The term "neuromorphic" comes from the Greek words "neuron" (meaning nerve cell) and "morphe" (meaning form). In the context of computing, it refers to the use of electronic circuits and devices inspired by biological neurons' structure and function.
While still nascent, this is another massive technological feat. You might be thinking—but we’ve done something similar already with AI neural networks, what’s the difference?
Simply, we’ve made great progress on the software side of things in terms of mimicking the logic of how a human brain thinks. But solving those challenges on a physical chip is a different beast.
That’s what NC is solving though, and you can imagine how much more advanced our AI and computing will be when we have the chips and software both working in unison like a brain. 🧠
And just to illustrate the monumental difference on the chips’ side:
  • Your computer operates in binary. That’s 0s and 1s; yes and no. It’s rigid, so the code we use and the questions we ask these machines must be structured in a rigid way.
  • With NC though, we go from rigid to flexible, as these chips give computers the ability to work with a gradient of understanding. I’m no engineer, but from what I’ve read, that’s huge. (A toy spiking-neuron sketch follows this list.)
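To ground that a little, the basic computational unit most neuromorphic chips implement is a spiking neuron. Below is a toy leaky integrate-and-fire sketch in plain NumPy (my own illustration of the principle, not any vendor’s implementation): the neuron integrates input, its membrane potential leaks over time, and it only emits a discrete spike, an “event”, when a threshold is crossed. Computation happens only when something changes, which is where the efficiency claims come from.

```python
import numpy as np

def lif_neuron(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Toy leaky integrate-and-fire neuron: returns the time steps at which it spiked."""
    v = 0.0
    spikes = []
    for t, i_in in enumerate(input_current):
        v += dt * (-v + i_in) / tau      # leak toward rest, integrate input
        if v >= v_thresh:                # threshold crossed -> emit an event
            spikes.append(t)
            v = v_reset                  # reset after spiking
    return spikes

rng = np.random.default_rng(0)
current = rng.uniform(0.0, 0.3, size=200)   # weak, noisy background drive
current[80:120] += 1.5                      # a burst of strong input

print("spike times:", lif_neuron(current))  # spikes cluster inside the burst
```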
Here are a few of the breakthrough benefits NC has pundits excited about:
In more succinct words: Neuromorphic Computing is the key to huge leaps in AI advancements.
Just imagine how fast things would be changing in the AI landscape if these were powering Nvidia’s data centers and AI platforms.

It sounds somewhat far-fetched, but that future is already here and in the works.
As was published in this research paper in Nature:
With the end of Moore’s law approaching and Dennard scaling ending, the computing community is increasingly looking at new technologies to enable continued performance improvements. Neuromorphic computers are one such new computing technology. The term neuromorphic was coined by Carver Mead in the late 1980s, and at that time primarily referred to mixed analogue–digital implementations of brain-inspired computing; however, as the field has continued to evolve and with the advent of large-scale funding opportunities for brain-inspired computing systems such as the DARPA Synapse project and the European Union’s Human Brain Project, the term neuromorphic has come to encompass a wider variety of hardware implementations.
And builders are going after it. Intel is already working on these chips, as are various other startups.
In the long term, NC poses a technological obsolescence risk to traditional GPUs and DPUs. If these types of chips become successful, it could threaten Nvidia's business.
However, because NC has the potential to be a game-changer in many different areas of society, and its consequences could be far-reaching and complex, I have zero doubt in my mind that Jensen and his crew are sitting in a Denny’s somewhere, dice in hand, and mapping out Nvidia’s strategic future over some burnt coffee. ☕
 
  • Like
  • Fire
  • Love
Reactions: 31 users

Diogenese

Top 20
I am just trying to get up to date on the Nvidia GTC news.
Nvidia's Blackwell looks really impressive, but power consumption for these ML systems is becoming mind-boggling if you add up all the instances/data centers etc.

A German tech journal had a nice quote (translated via deepl.com):

We should buy one for @Bravo's $5 party ...
 
  • Haha
  • Like
  • Fire
Reactions: 17 users
I like this analogy, so what share prices would you consider reasonable for each of the following, just for shits and giggles?
2024
2025
2026
That's hard to say and depends on a lot of factors (even just for shits and giggles).

Keeping it simple and supposing strong uptake, with solid deals and revenue:

2024 A$1.50 - A$2.50
2025 A$2.50 - A$4.50
2026 A$4.50 - A$7.00

At A$7.00 (assuming 1.8B SOI) that's a market cap of A$12.6 billion, and we've had a market cap of over 2 billion on a Mercedes tweet.

So these figures could well be conservative, but I would of course be happy to see them 😛

Companies have reached much higher market caps on hundreds of millions of revenue (although while still making a loss).
If BrainChip is making 100s of millions of dollars in revenue, it will be at a margin of over 80% with our IP model, so in that case a market cap of over 30 billion AUD is easily achievable (considering AfterPay reached a market cap of 33 billion with around 500 million in revenue, while still running at a 50 million loss, from memory).

Time frames to these kinds of revenues are what we just don't know.
 
  • Like
  • Love
  • Fire
Reactions: 30 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
We should buy one for @Bravo's $5 party ...

How could I possibly decline such a generous offer dear Dodgy Knees?

It sounds powerful enough that I could probably rig it up to the hot tub's operating system when we hit $5! Could be useful for clothing detection purposes!
 
  • Haha
  • Fire
Reactions: 9 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
This would be an extremely good fit for us IMO. Could someone please get on the blower to Jensen Huang and let him know, if he doesn't already.

Lockheed, NVIDIA Make AI-Powered Digital Twin to Predict the Weather​


February 1, 2024 · 2 min read








Lockheed Martin is collaborating with Nvidia to build a prototype that uses AI to combine data and detect anomalies in current environmental conditions. The tool could be increasingly important as the National Oceanic and Atmospheric Administration (NOAA) last year reported record ocean surface temperatures.
The companies have been working together since 2022 to build a prototype of an AI-driven Earth and Space Observing Digital Twin that can process live streams of weather data, apply AI to analyze the data, and display current global environmental conditions from satellite and ground-based observations and output from weather forecasting models.

The project recently reached a significant milestone, demonstrating sea surface temperatures, one of NOAA’s critical data pipelines, to highlight multi-sensor fusion from satellite and model data along with short term temperature anomalies.

The Earth and Space Observing Digital Twin hopes to provide NOAA with a centralized approach to fuse and visualize data from various space and earth sensors. The demo showed NOAA and other government customers the potential of using AI to display high-resolution, accurate, and timely depictions of satellites and sensor data.
NOAA is sent terabytes of weather data from multiple space- and Earth-based sensors daily; accurately fusing that information could support real-time representations of global conditions.
According to Lynn Montgomery, AI research engineer at Lockheed Martin, the platform could be a “one-stop-shop for global weather monitoring” and enable more accurate initial conditions for predictive forecasting.
The project uses Lockheed Martin’s OpenRosetta3D software and the NVIDIA Omniverse development platform to build applications that aggregate data in real time. The digital twin uses these technologies to advance 4D visualizations and display this information in a digestible format.
Next year, Lockheed will feed additional data streams, such as space weather and sea ice concentrations, into the Earth and Space Observing Digital Twin.
This project was funded by NOAA SAE Joint Venture Partnerships as an exploration of possibilities for a future state, not as a mission itself.
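For anyone wondering what a "temperature anomaly" means here: it's just the deviation of an observation from a long-term baseline for the same place and time of year. A toy sketch of the idea (my own illustration, not Lockheed's or Nvidia's actual pipeline):

```python
import numpy as np

# Toy "sea surface temperature anomaly": deviation of each new observation
# from a climatological baseline (a per-month mean over previous years).
rng = np.random.default_rng(1)
years, months = 10, 12
seasonal = 15 + 5 * np.sin(np.arange(months) / 12 * 2 * np.pi)
history = seasonal + rng.normal(0, 0.5, (years, months))   # past observations

baseline = history.mean(axis=0)                   # long-term mean per month
this_year = seasonal + rng.normal(0, 0.5, months)
this_year[6] += 2.0                               # inject a warm July anomaly

anomaly = this_year - baseline
flagged = np.where(np.abs(anomaly) > 2 * history.std(axis=0))[0]
print("anomaly (degC) per month:", np.round(anomaly, 2))
print("months flagged as anomalous:", flagged)
```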

 
  • Like
  • Love
Reactions: 9 users
That's hard to say and depends on a lot of factors (even just for shits and giggles).

Keeping it simple and supposing strong uptake, with solid deals and revenue:

2024 A$1.50 - A$2.50
2025 A$2.50 - A$4.50
2026 A$4.50 - A$7.00

At A$7.00 (assuming 1.8B SOI) that's a market cap of A$12.6 billion, and we've had a market cap of over 2 billion on a Mercedes tweet.

So these figures could well be conservative, but I would of course be happy to see them 😛

Companies have reached much higher market caps on hundreds of millions of revenue (although while still making a loss).
If BrainChip is making 100s of millions of dollars in revenue, it will be at a margin of over 80% with our IP model, so in that case a market cap of over 30 billion AUD is easily achievable (considering AfterPay reached a market cap of 33 billion with around 500 million in revenue, while still running at a 50 million loss, from memory).

Time frames to these kinds of revenues are what we just don't know.
Brilliant, thank you.
 
  • Like
Reactions: 2 users

Diogenese

Top 20
Peter Van Der Made once said "4 bits are enough".



View attachment 59370



Find the missing EV manufacturer who uses Nvidia ...

"Next-gen AI Vehicle Partners
Car partners who’ve pledged their support to roll out next-gen AI vehicle fleets with Drive Thor are BYD, Hyper, XPENG, Li Auto and ZEEKR. Even commercial deployment vendors managing trucks, delivery vehicles, robotaxis and more are working on level 4 autonomous driving solutions with the NVIDIA Thor. These are Nuro, Plus, Waabi, and WeRide with Lenovo Vehicle Computing."
 
  • Like
  • Fire
  • Thinking
Reactions: 4 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
I like this analogy, so what share prices would you consider reasonable for each of the following, just for shits and giggles?
2024
2025
2026


2024

Screenshot 2024-03-19 at 1.34.24 pm.png


2025

Screenshot 2024-03-19 at 1.34.47 pm.png


2026


Screenshot 2024-03-19 at 1.35.53 pm.png
 
  • Haha
  • Like
  • Love
Reactions: 57 users

skutza

Regular
Just to make sure I'm not misunderstanding: you typically expect a huge, hundreds-of-thousands-of-dollars event from a company to showcase progress in technology they're still refining? But because we're a startup, you'll let it slide? (Besides, it's Edge Impulse with our Akida, but okay...)

And by the way, in the next set of quarterlies, you're probably going to be the first one to ask why we have -1.5 million in expenses. 😂🫵
No, I don't expect any of that. But my point was that this looks like a small science project at a US science fair at Springfield Elementary School. So I would rather the company didn't post it on their socials. Let the dude have it on his LinkedIn. But again, just my feel; you're obviously happy with it, so good for you. My guess is I likely have about $300k more invested in BRN and would like to be sure I have invested it well. Currently I feel great about my investment; however, ask me again after the quarterly.
 

Diogenese

Top 20
This would be an extremely good fit for us IMO. Could someone please get on the blower to Jensen Huang and let him know, if he doesn't already.

Lockheed, NVIDIA Make AI-Powered Digital Twin to Predict the Weather​


February 1, 2024 · 2 min read








Lockheed Martin is collaborating with Nvidia to build a prototype that uses AI to combine data and detect anomalies in current environmental conditions. The tool could be increasingly important as the National Oceanic and Atmospheric Administration (NOAA) last year reported record ocean surface temperatures.
The companies have been working together since 2022 to build a prototype of an AI-driven Earth and Space Observing Digital Twin that can process live streams of weather data, apply AI to analyze the data, and display current global environmental conditions from satellite and ground-based observations and output from weather forecasting models.

The project recently reached a significant milestone, demonstrating sea surface temperatures, one of NOAA’s critical data pipelines, to highlight multi-sensor fusion from satellite and model data along with short term temperature anomalies.

The Earth and Space Observing Digital Twin hopes to provide NOAA with a centralized approach to fuse and visualize data from various space and earth sensors. The demo showed NOAA and other government customers the potential of using AI to display high-resolution, accurate, and timely depictions of satellites and sensor data.
NOAA is sent terabytes of weather data from multiple space- and Earth-based sensors daily; accurately fusing that information could support real-time representations of global conditions.
According to Lynn Montgomery, AI research engineer at Lockheed Martin, the platform could be a “one-stop-shop for global weather monitoring” and enable more accurate initial conditions for predictive forecasting.
The project uses Lockheed Martin’s OpenRosetta3D software and the NVIDIA Omniverse development platform to build applications that aggregate data in real time. The digital twin uses these technologies to advance 4D visualizations and display this information in a digestible format.
Next year, Lockheed will feed additional data streams, such as space weather and sea ice concentrations, into the Earth and Space Observing Digital Twin.
This project was funded by NOAA SAE Joint Venture Partnerships as an exploration of possibilities for a future state, not as a mission itself.


Weather prediction will be simplified when they get Blackwell running - when it's on it's hot, when it's not, it's not.
 
  • Haha
  • Like
Reactions: 10 users

IloveLamp

Top 20
  • Like
  • Fire
Reactions: 10 users

jtardif999

Regular
Hi all, is it just me or does this type of post/marketing actually put us backwards? I mean, as nice as the information is, it seems like a very basic and ... I don't know, unprofessional finish/polish for the company? Cheap? I can totally say it's just my impression, but maybe others feel similar? Ho hum.
Cheap is the only way to eventually proliferate. 54% of companies surveyed recently said they would have to incorporate AI to help their business grow by 2030 or be left behind (paraphrasing a story I read on Apple News a few days ago). There is a greater proportion of companies that won't be able to afford the expensive infrastructure changes to incorporate the AI we currently have, and this includes edge technology. Selling cheaper, lower-power tech to that proportion could be the most lucrative thing we could do. BRN has the top-down approach, selling IP to bigger companies that can then produce large quantities at scale, but through Edge Impulse and similar partners, and the university programs, they are imo also tackling proliferation from the bottom up. I think the Edge Box and the Cup Cake server are examples of plug and play for a larger audience, and the Edge Impulse projects then provide real-world examples of use cases that in time will take hold. We've seen in the last couple of days a medical tech example detecting pneumonia and now the defect monitoring example. The more these kinds of examples are shown, the more Akida will be thought of as a go-to, imo. It feels like we are getting much closer to that tipping point right now!
 
  • Like
  • Fire
Reactions: 20 users

miaeffect

Oat latte lover
Weather prediction will be simplified when they get Blackwell running - when it's on it's hot, when it's not, it's not.
Blackwell heat issue? No problemo!
images - 2024-03-19T141957.942.jpeg
 
  • Haha
  • Like
Reactions: 24 users

IloveLamp

Top 20
  • Like
  • Love
  • Fire
Reactions: 13 users