BRN Discussion Ongoing

Potato

Regular
So just to recap guys, for this week:
- 4C release
- Semiconductor bill possibly to be passed in the US Congress this week.
link: https://thehill.com/homenews/senate...ngress-awaits-movement-on-semiconductor-bill/

Screen Shot 2022-07-26 at 11.07.58 am.png
 
  • Like
  • Love
Reactions: 15 users

Yak52

Regular
Based on that, I'd say it is likely that Andreas lurks among us here on TSE :cool::ninja:

I was just thinking about that possibility myself, alwaysgreen!
And most likely a BRN shareholder as well.

Y.
 
  • Like
  • Fire
  • Thinking
Reactions: 10 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Not sure what happened with this case, but it really highlights the significance of Privacy concerns in terms of what information is being collected and stored in electric vehicles, which is something that @cosors has also mentioned. In this case the Plaintiff alleges that Tesla’s driver-monitoring practices violate Illinois citizens’ statutorily protected privacy rights. It would be interesting to find out what happens with this, as I imagine it could set a legal precedent for future facial recognition technology.

It says: "The class action complaint seeks to collect statutory damages of $5,000 for every time Tesla willfully or recklessly violated Illinois’ Biometric Information Privacy Act. It also seeks to collect statutory damages of $1,000 for each negligent violation of the state’s BIPA. Tesla’s legal team, for its part, is yet to issue a response to the complaint."

If only @Fact Finder were still here. I would love to hear his thoughts on this and, well, pretty much everything else just generally speaking. 🤗



PS: Obviously we don't have to worry about Privacy concerns thanks to AKIDA.🥳

Screen Shot 2022-07-26 at 2.47.49 pm.png
 
Last edited:
  • Like
  • Fire
  • Thinking
Reactions: 23 users

equanimous

Norse clairvoyant shapeshifter goddess
MSN.com big data investing ideas

1658805520524.png
 
Last edited:
  • Like
  • Love
  • Wow
Reactions: 18 users

Filobeddo

Guest
Looking at the same thing ;)

Calcs on number of shares in individual trades up until 10.30ish for those interested

View attachment 12445

Interestingly (or maybe not so ;)), for that same period 25% of individual trades, or 1 in 4 🙃, were for exactly the same volume

May run that again at end of day and compare
 
  • Like
  • Fire
Reactions: 8 users

Potato

Regular
Thoughts on the current price at $1.17??
 

DJM263

LTH - 2015
  • Haha
  • Like
  • Love
Reactions: 18 users

Dougie54

Regular
Thank you for all your words, both in support of my idea (and it is only an idea!) and also those against it.
MD, thanks for your post this morning, and for now I am praying for a $40+ Mil 4C as I missed out on the Ken Shirts! Thanks, but I believe you are reasonably safe for now.

I was waiting to see if anyone would pick up on an interesting fact from PVDM's chat with our forum member Crestman!

Quote - [ "I got the chance to speak to PVDM and I asked him about the ARM partnership. He told me that the agreement was that they will receive $1M upfront for every ARM customer who wants their device/chip to include Akida IP. And then will get ongoing royalties when that device/chip is manufactured on an ongoing basis"]
---------------------------------------------------------------------
Now that interesting fact I am talking about is this.
If Company XYZ wants an ARM Cortex with AKIDA IP included, it must pay a $1 Mil upfront fee plus ongoing Royalty fees later. Now forget the Royalty fees for now; that's not in the box yet.
THINK about that $1 Mil upfront fee. To warrant paying that much upfront just for access to AKIDA on an ARM Cortex..................
YOU would need to make a substantial UNIT ORDER to make it viable.
HOW MANY units do you think would be the minimum needed?
10 Million units? Plus Royalties later? Sound feasible?
Or MORE? 100 million units?

10 Million units @ how much Royalties?
100 Million units @ how much Royalties?

By having that $1 Mil upfront fee it has automatically ensured ORDERS will be substantial in numbers!
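
To put rough numbers on that logic, here's a minimal sketch; the $1 Mil upfront fee comes from Crestman's recap of PVDM above, but the per-unit royalty rates and unit counts are purely hypothetical, not anything BrainChip or ARM has disclosed.

```python
# Minimal sketch of the break-even logic above. The upfront fee comes from
# Crestman's recap of PVDM; the royalty rates and unit counts are hypothetical
# placeholders, NOT disclosed BrainChip/ARM terms.

UPFRONT_FEE = 1_000_000  # $1M per ARM customer taking AKIDA IP (per the quote above)

def licensee_revenue(units: int, royalty_per_unit: float) -> float:
    """Total revenue from one licensee: upfront fee plus per-unit royalties."""
    return UPFRONT_FEE + units * royalty_per_unit

for units in (10_000_000, 100_000_000):
    for royalty in (0.05, 0.25):  # hypothetical royalty per chip, in dollars
        total = licensee_revenue(units, royalty)
        print(f"{units:>11,} units @ ${royalty:.2f}/unit -> ${total:,.0f}")
```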

So yeah, I would be very happy to see just (1) one single Upfront fee via ARM included in this next 4C.

Not to say there will not be "other" upfront fees from a different direction, as they are on the cards as well; we just have not become aware they could exist yet. Plus Engineering fees & AKIDA Boards are still for sale out of Singapore! (note: the only place to get a single AKIDA chip now)
--------------------------

SP is being supported by a BoT, with buying against that selling BoT.
The SP is just being played with, really, as they wait for the 4C to be released. BUT so far the "Shorters" have failed to create a DUMP.

IF we get confirmation of any upfront fees now, years end will get very interesting.

Yak52. :cool: GLTAH
 
  • Like
  • Love
Reactions: 3 users

Yak52

Regular
Thoughts on the current price at $1.17??

A BARGAIN BASEMENT ......... STEAL at this pricing!


Why? Thinking about ole' Andreas, who I posted about, and the info I gained shows:

the MARKET SATURATION BRAINCHIP IS ACHIEVING GLOBALLY
with its INFORMATION and MARKETING Strategy!! :D:D


BUGS  BUNNY counting out money.gif


Yak52 :cool:
 
  • Like
  • Love
Reactions: 25 users
Thoughts on the current price at $1.17??
It's too low?..

With the current volume, the "days to cover" for shorters would be up from @Slymeat's 8.8 days to around 30 days.

He said over 10 days and they would be feeling uncomfortable?...
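
For anyone who hasn't seen the metric before, "days to cover" is just total shares sold short divided by average daily volume. A quick sketch with made-up numbers (chosen only to reproduce the same 8.8-day vs ~30-day ratio, not actual BRN short-interest data):

```python
# Illustrative days-to-cover arithmetic. The share counts and volumes below
# are made-up placeholders, not actual BRN short-interest figures.

def days_to_cover(shares_short: float, avg_daily_volume: float) -> float:
    """Days to cover = total shares sold short / average daily trading volume."""
    return shares_short / avg_daily_volume

shares_short = 110_000_000   # hypothetical short interest
busy_volume = 12_500_000     # hypothetical busy-day volume  -> ~8.8 days
quiet_volume = 3_700_000     # hypothetical quiet-day volume -> ~30 days

print(f"Busy market:  {days_to_cover(shares_short, busy_volume):.1f} days")
print(f"Quiet market: {days_to_cover(shares_short, quiet_volume):.1f} days")
```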
 
  • Like
  • Fire
Reactions: 17 users

BaconLover

Founding Member
Andreas is a shareholder.

Some TSE members have chatted with him on various platforms including Twitter/LinkedIn etc.

I wouldn't read too much into his comments; it's not a testimony from an insider perspective.
Like most of us, he also comments on BRN and Akida, which is great for increasing awareness amongst the masses on a variety of social media platforms.

DYOR.
 
  • Like
  • Love
  • Fire
Reactions: 21 users

Potato

Regular
I'd like to let everyone know that I am not Andreas
 
  • Haha
  • Like
Reactions: 10 users

Tony Coles

Regular
  • Haha
  • Like
Reactions: 20 users

Filobeddo

Guest
  • Haha
  • Like
Reactions: 13 users

Makeme 2020

Regular

How drone autonomy unlocks a new era of AI opportunities

Reese Mozer, American Robotics (@reesemozer)
July 23, 2022 1:10 PM
Beautiful misty dawn in the spring on the river. Robotics and drone concept

Image Credit: Anton Petrus/Getty

[Editor’s note: American Robotics is a commercial developer of automated drone systems.]
Drones have been talked about extensively for two decades now. In many respects, that attention has been warranted. Military drones have changed the way we fight wars. Consumer drones have changed the way we film the world. For the commercial market, however, drones have largely been a false start. In 2013, the Association for Unmanned Vehicle Systems International (AUVSI) predicted an $82 billion market by 2025. In 2016, PwC predicted $127 billion within the “near future.” But we aren’t anywhere close to those projections yet. Why is that?
Let’s start with the primary purpose of drones in a commercial setting: data collection and analysis. The drone itself is a means to an end – a flying camera from which to get a unique aerial perspective of assets for inspection and analysis, be it a pipeline, gravel storage yard, or vineyard. As a result, drones in this context fall under the umbrella of “remote sensing.”
In the world of remote sensing, drones are not the only player. There are high-orbit satellites, low-orbit satellites, airplanes, helicopters and hot air balloons. What do drones have that the other remote sensing methods do not? The first thing is: image resolution.


What does “high resolution” really mean?

One product’s high resolution is another product’s low resolution.

Image resolution, or more aptly Ground Sample Distance (GSD) in this case, is a product of two primary factors: (1) how powerful your imaging sensor is, and (2) how close you are to the object you are imaging. Because drones are typically flying very low to the ground (50-400 feet AGL), the opportunity to collect higher image resolutions than aircraft or satellites operating at higher altitudes is significant. Eventually you run into issues with physics, optics and economics, and the only way to get a better picture is to get closer to the object. To quantify this:
  • “High resolution” for a drone operating at 50ft AGL with a 60MP camera is around 1 mm/pixel.
  • “High resolution” for a manned aircraft service, like the now-defunct Terravion, was 10 cm/pixel.
  • “High resolution” for a low-orbit satellite service, like Planet Labs, is 50 cm/pixel.
Put another way, drones can provide upwards of 500 times the image resolution of the best satellite solutions.
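
[A minimal sketch of the standard GSD arithmetic implied above: GSD ≈ altitude × pixel pitch ÷ focal length. The camera parameters are assumptions chosen to roughly match the 60 MP, 50 ft example, not figures from American Robotics.]

```python
# Standard ground sample distance (GSD) arithmetic: GSD = altitude * pixel
# pitch / focal length. The sensor and lens values below are assumptions
# roughly matching a 60MP full-frame camera with a 50 mm lens.

def gsd_m_per_px(altitude_m: float, pixel_pitch_m: float, focal_length_m: float) -> float:
    """Ground sample distance in metres per pixel."""
    return altitude_m * pixel_pitch_m / focal_length_m

FT_TO_M = 0.3048
PIXEL_PITCH = 3.76e-6   # ~3.8 um pixel pitch (assumed, typical of a 60MP full-frame sensor)
FOCAL_LENGTH = 0.050    # 50 mm lens (assumed)

for label, altitude_ft in (("Drone @ 50 ft AGL", 50), ("Drone @ 400 ft AGL", 400)):
    gsd = gsd_m_per_px(altitude_ft * FT_TO_M, PIXEL_PITCH, FOCAL_LENGTH)
    print(f"{label}: {gsd * 1000:.1f} mm/pixel")   # ~1.1 mm/pixel at 50 ft
```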

The power of high resolution

Why does this matter? It turns out there is a very direct and powerful correlation between image resolution and potential value. As the computing phrase goes: “garbage in, garbage out.” The quality and breadth of machine vision-based analytics opportunities are exponentially higher at the resolutions a drone can provide vs. other methods.

A satellite might be able to tell you how many well pads are in Texas, but a drone can tell you exactly where and how the equipment on those pads is leaking. A manned aircraft might be able to tell you what part of your cornfield is stressed, but a drone can tell you what pest or disease is causing it. In other words, if you want to resolve a crack, bug, weed, leak or similarly small anomaly, you need the proper image resolution to do so.

Bringing artificial intelligence into the equation

Once that proper image resolution is obtained, now we can begin training neural networks (NNs) and other machine learning (ML) algorithms to learn about these anomalies, detect them, alert for them and potentially even predict them.
Now our software can learn how to differentiate between an oil spill and a shadow, precisely calculate the volume of a stockpile, or measure a slight skew in a rail track that could cause a derailment.
American Robotics estimates that over 10 million industrial asset sites worldwide have use for automated drone-in-a-box (DIB) systems, collecting and analyzing 20GB+ per day per drone. In the United States alone, there are over 900,000 oil and gas well pads, 500,000 miles of pipeline, 60,000 electrical substations, and 140,000 miles of rail track, all of which require constant monitoring to ensure safety and productivity.
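
[Back-of-the-envelope check of the scale implied here, assuming one drone per site at the quoted 20 GB/day; the one-drone-per-site assumption is added for illustration, not stated in the article.]

```python
# Back-of-the-envelope data volume at full deployment. Assumes one drone per
# site at 20 GB/day each; the one-drone-per-site assumption is not from the
# article.

SITES = 10_000_000           # industrial asset sites cited by American Robotics
GB_PER_DRONE_PER_DAY = 20    # lower bound quoted in the article

total_pb_per_day = SITES * GB_PER_DRONE_PER_DAY / 1_000_000  # 1 PB = 1,000,000 GB
print(f"~{total_pb_per_day:.0f} PB of imagery per day")       # ~200 PB/day
```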

As a result, the scale of this opportunity is actually hard to quantify. What does it mean to fully digitize the world’s physical assets every day, across all critical industries? What does it mean if we can start applying modern AI to petabytes of ultra-high-resolution data that has never existed before? What efficiencies are unlocked if you can detect every leak, crack and area of damage in near-real time? Whatever the answer, I’d wager the $82B and $127B numbers estimated by AUVSI and PwC are actually low.
So: if the opportunity is so large and clear, why haven’t these market predictions come true yet? Enter the second important capability unlocked by autonomy: imaging frequency.

What does “high frequency” really mean?

The useful imaging frequency is 10x or more than people originally thought.
The biggest performance difference between autonomous drone systems and piloted ones is the frequency of data capture, processing and analysis. For 90% of commercial drone use cases, a drone must fly repetitively and continuously over the same plot of land, day after day, year after year, to have value. This is the case for agricultural fields, oil pipelines, solar panel farms, nuclear power plants, perimeter security, mines, railyards and stockpile yards. When examining the full operation loop from setup to processed, analyzed data, it is clear that operating a drone manually is much more than a full-time job. And at an average of $150/hour per drone operator, it is clear a full-time operational burden across all assets is simply not feasible for most customers, use cases and markets.

This is the central reason why all the predictions about the commercial drone industry have, thus far, been delayed. Imaging an asset with a drone once or twice a year has little to no value in most use cases. For one reason or another, this frequency requirement was overlooked, and until recently, autonomous operations that would enable high-frequency drone inspections were prohibited by most federal governments around the world.
With a fully-automated drone-in-a-box system, on-the-ground humans (both pilots and observers) have been removed from the equation, and the economics have completely changed as a result. DIB technology allows for constant operation, multiple times per day, at less than a tenth of the cost of a manually operated drone service.
With this increased frequency comes not only cost savings but, more importantly, the ability to track problems when and where they occur and properly train AI models to do so autonomously. Since you don’t know when and where a methane leak or rail tie crack will occur, the only option is to scan every asset as frequently as possible. And if you are gathering that much data, you better build some software to help filter out the key information to end users.
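
[A rough sketch of the economics described above, taking the $150/hour pilot rate and the "less than a tenth of the cost" claim from the article; the hours-per-day and days-per-year figures are illustrative assumptions.]

```python
# Rough cost comparison: manually piloted operations vs. drone-in-a-box (DIB).
# The $150/hour rate and the "<1/10th the cost" ratio come from the article;
# the hours-per-day and days-per-year figures are illustrative assumptions.

PILOT_RATE = 150       # $/hour per drone operator (from the article)
HOURS_PER_DAY = 8      # assumed full-time operational burden per asset
DAYS_PER_YEAR = 250    # assumed operating days per year

manual_annual = PILOT_RATE * HOURS_PER_DAY * DAYS_PER_YEAR
dib_annual = manual_annual / 10    # article claims less than a tenth of the cost

print(f"Manual operation: ~${manual_annual:,.0f} per asset per year")
print(f"DIB operation:    <${dib_annual:,.0f} per asset per year")
```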

Tying this to real-world applications today

Autonomous drone technology represents a revolutionary ability to digitize and analyze the physical world, improving the efficiency and sustainability of our world’s critical infrastructure.

And thankfully, we have finally moved out of the theoretical and into the operational. After 20 long years of riding drones up and down the Gartner Hype Cycle, the “plateau of productivity” is cresting.
In January 2021, American Robotics became the first company approved by the FAA to operate a drone system beyond visual line-of-sight (BVLOS) with no humans on the ground, a seminal milestone unlocking the first truly autonomous operations. In May 2022, this approval was expanded to include 10 total sites across eight U.S. states, signaling a clear path to national scale.
More importantly, AI software now has a practical mechanism to flourish and grow. Companies like Stockpile Reports are using automated drone technology for daily stockpile volumetrics and inventory monitoring. The Ardenna Rail-Inspector Software now has a path to scale across our nation’s rail infrastructure.
AI software companies like Dynam.AI have a new market for their technology and services. And customers like Chevron and ConocoPhillips are looking toward a near-future where methane emissions and oil leaks are significantly curtailed using daily inspections from autonomous drone systems.
My recommendation: Look not to the smartphone, but to the oil fields, rail yards, stockpile yards, and farms for the next data and AI revolution. It may not have the same pomp and circumstance as the “metaverse,” but the industrial metaverse might just be more impactful.
Reese Mozer is cofounder and CEO of American Robotics.
 
  • Like
  • Love
  • Fire
Reactions: 21 users

Potato

Regular
Screen Shot 2022-07-26 at 12.02.23 pm.png
 
  • Like
  • Fire
  • Haha
Reactions: 37 users

Potato

Regular
You heard it first, folks: the president is ready to sign the bill.
 
  • Like
  • Fire
Reactions: 10 users

Newk R

Regular
  • Like
Reactions: 1 users

BaconLover

Founding Member
  • Haha
  • Like
  • Fire
Reactions: 69 users