Boab
I wish I could paint like Vincent
You are a cruel man, Rocket
Chart suggesting we could hit the 40s.
Not advice, but with all the potential Akida links provided here, I'm still holding on.
Can't help but feel very frustrated, though, with the lack of material announcements. Fair enough, you can't put specific figures on collaborations and partnerships with other companies. So I guess we just have to believe what the CEO has said about increasing sales and focusing on financial figures in future reports.
I think a lot of us are in that position, expecting 2021 and 2022 to be big years with many companies signing up to our IP and some payments being made upfront, etc. I think management's main focus is on gaining market traction and becoming part of the ecosystem, which is key to success and to becoming the industry standard, IMO.
But I think they've set incorrect/unrealistic hopes for investors... I do feel sad because I increased my holdings significantly thinking IP agreements were imminent and would be followed by heaps of revenue in 2021 and 2022, which ate all my gains, and now I'm deep in the red...
Although a lot of people have told me multiple times that the revenue will come through MegaChips and Renesas, I still don't understand why we haven't landed any more IP agreements in years...
But given we are now part of multiple ecosystems and have formed great partnerships, hopefully revenue will follow... (but I am a bit afraid it might take years... hopefully I am wrong and it will happen soon). I haven't sold a single share... I am still hopefulllll
"I suspect the Company, was expecting a major deal or 2, to drop soonish, which would explain the extended pricing period.."Hey, I'm a simple man too
Your return post had lots of good, well-thought-out points, which I'm sure many who are looking at the current share price and feeling a bit down would have appreciated.
The ecosystem of partnerships we are building is hugely important and will make this Company.
That's why we have Rob Telson on the job.
His likeable personality is a perfect fit for building relationships with other companies.
He is old school in this respect.
The people in a company are just as important as the product, if not more so.
And we have Great people.
Those looking at where the share price is, at 52-week lows, keep in mind the LDA pricing period.
The floor price set by the Company looks to have been around 50 cents, probably 55, and they just want to make sure they pay as little as possible by keeping it under that.
I suspect the Company was expecting a major deal or two to drop soonish, which would explain the extended pricing period.
Whoever converts to a licensee will pay a licence fee, so we will see that somewhere in the financials (just as we saw the 4M in the recent report). MegaChips will be mega for this, IMO: they can educate, develop solutions and support customers of our IP, and we will only see the result of that in the financials. Then eventually BRN will reap the royalties from that. I think we have to be patient to the extent of seeing the six-month reports to be assured that things are ticking along. AIMO.
Renesas took two years to convert the licence agreement to a tapeout.
Should any EAP customer or partner convert to a licensee this year, we could see a chip by 2025, which I hope most of us have the patience for.
Here's the full interview for anyone interested. It sounds like Rain AI are years behind us, if you ask me, because they're still building their analogue brain.
I don't want to get everyone too excited (which means I actually do), but Forbes published an article a few weeks ago about ChatGPT, how it costs millions of dollars a day to run, and how neuromorphic computing could one day be the solution to driving down this staggering and unnecessary expense. The article says: "But our brains are a million times more efficient than the GPUs, CPUs, and memory that make up ChatGPT’s cloud hardware. And neuromorphic computing researchers are working hard to make the miracles that big server farms in the clouds can do today much simpler and cheaper, bringing them down to the small devices in our hands, our homes, our hospitals, and our workplaces".
Bearing in mind this is an interview with the CEO of Rain AI, Gordon Wilson, naturally he doesn't mention us because we're his competitors, but he may as well be talking about us because we're the experts, aren't we?
ChatGPT Burns Millions Every Day. Can Computer Scientists Make AI One Million Times More Efficient?
“Deploying current ChatGPT into every search done by Google would require 512,820 A100 HGX servers with a total of 4,102,568 A100 GPUs,” they write. “The total cost of these servers and networking exceeds $100 billion of Capex alone, of which Nvidia would receive a large portion.”
www.forbes.com
Extract Only
Perhaps that other way is analogous to something we already have a lot of familiarity with.
According to Rain AI’s Wilson, we have to learn from the most efficient computing platform we currently know of: the human brain. Our brain is “a million times” more efficient than the AI technology that ChatGPT and large language models use, Wilson says. And it happens to come in a very flexible, convenient, and portable package.
“I always like to talk about scale and efficiency, right? The brain has achieved both,” Wilson says. “Typically, when we’re looking at compute platforms, we have to choose.”
That means you can get the creativity that is obvious in ChatGPT or Stable Diffusion, which relies on data center compute to build AI-generated answers or art (trained, yes, on copyrighted images), or you can get something small and efficient enough to deploy and run on a mobile phone, but which doesn’t have much intelligence.
That, Wilson says, is a trade-off that we don’t want to keep having to make.
Which is why, he says, an artificial brain built with memristors that can “ultimately enable 100 billion-parameter models in a chip the size of a thumbnail,” is critical.
For reference, ChatGPT’s large language model is built on 175 billion parameters, and it’s one of the largest and most powerful yet built. ChatGPT 4, which rumors say is as big a leap from ChatGPT 3 as the third version was from its predecessors, will likely be much larger. But even the current version used 10,000 Nvidia GPUs just for training, with likely more to support actual queries, and costs about a penny an answer.
Running something of roughly similar scale on your finger is going to be multiple orders of magnitude cheaper.
And if we can do that, it unlocks much smarter machines that generate that intelligence in much more local ways.
“How can we make training so cheap and so efficient that you can push that all the way to the edge?” Wilson asks. “Because if you can do that, then I think that’s what really encapsulates an artificial brain. It’s a device. It’s a piece of hardware and software that can exist, untethered, perhaps in a cell phone, or AirPods, or a robot, or a drone. And it importantly has the ability to learn on the fly. To adapt to a changing environment or a changing self.”
That’s a critical evolution in the development of artificial intelligence. Doing so enables smarts in machines we own and not just rent, which means intelligence that is not dependent on full-time access to the cloud. Also: intelligence that doesn’t upload everything known about us to systems owned by corporations we end up having no choice but to trust.
It also, potentially, enables machines that differentiate. Learn. Adapt. Maybe even grow.
My car should know me and my area better than a distant colleague’s car. Your personal robot should know you and your routines, your likes and dislikes, better than mine. And those likes and dislikes, with your personal data, should stay local on that local machine.
There’s a lot more development, however, to be done on analog systems and neuromorphic computing: at least several years. Rain has been working on the problem for six years, and Wilson thinks shipping product in quantity — 10,000 units for Open AI, 100,000 units for Google — is at least “a few years away.” Other companies like chip giant Intel are also working on neuromorphic computing with the Loihi chip, but we haven’t seen that come to the market in scale yet.
If and when we do, however, the brain-emulation approach shows great promise. And the potential for great disruption.
“A brain is a platform that supports intelligence,” says Wilson. “And a brain, a biological brain, is hardware and software and algorithms all blended together in a very deeply intertwined way. An artificial brain, like what we’re building at Rain, is also hardware plus algorithms plus software, co-designed, intertwined, in a way that is really ... inseparable.”
Isn't it interesting that Mercedes claims Level 4 by 2030 and BrainChip claims AGI by 2030? Crikey!
Mercedes Says Level 4 Self-Driving Will Happen By 2030
Mar. 02, 2023, 6:32 PM ET, by Jay Traugott
The German automaker has already beaten rivals for Level 3 approval.
Mercedes-Benz has gone on record stating that Level 4 self-driving is "doable" by decade's end. Speaking to Automotive News, the German automaker's Chief Technology Officer, Markus Schaefer, said, "private-owned Level 4 cars, absolutely. This is something that I see in the future."
Mercedes' Level 3 technology, called Drive Pilot, is the first of its kind in the US and has already been approved for use in Nevada, and other states such as California are expected to follow suit in the coming months. For now, only the 2024 S-Class and EQS Sedan offer Level 3, both of which will go on sale in the second half of this year. Unlike Level 2+ systems like GM's Super Cruise, Ford's Blue Cruise, and Tesla's Autopilot and Full Self-Driving, Drive Pilot can "hand over the dynamic driving task to the vehicle under certain conditions."
The system utilizes a combination of LiDAR, radar, and various sensors to allow for safe highway driving at speeds of up to 40 mph.
Yes, Level 3 makes it possible to text and drive (although we should point out that this is still illegal), but the driver must still be ready to assume control of the vehicle at a moment's notice if the system detects any sort of obstacle. Level 4 takes things up a notch. "Just imagine you are in a big city, and you come from work, and you are sitting for two hours in traffic, and you press the button and go to sleep," Schaefer added. "There will be a demand for that."
One key difference between Level 3 and Level 4 is that human drivers don't have to keep an eye on the road in most conditions. Level 4 is ideally suited for heavy traffic within cities, but things like extreme weather are a different matter.
Level 5, which requires absolutely no human involvement, is still years away from becoming possible. Companies like Waymo and Cruise are currently testing driverless taxis with that tech, but it's still far from perfect and remains expensive.
Mercedes is taking responsibility for Drive Pilot's accuracy and safety by assuming liability if the vehicle were to be the cause of a highway crash, for example.
The carmaker's Level 3 headstart places it in a prime position to introduce Level 4. Its upcoming new Modular Architecture platform, due to launch in the middle of the decade, will come hardwired with Level 4 capability once the technology is ready and approved for use by government safety regulators. The race to Level 4 brings not only prestige and bragging rights but also significant revenue.
Automakers - especially luxury brands - know that consumers will be more than happy to pay more for the technology because it brings a lot of additional conveniences. But the most difficult task they will face - and Mercedes is no exception - is proving to the public that Level 4 is safe. The introduction of Level 3 Drive Pilot is a significant step in that direction.
carbuzz.com
I'll go with popular opinion, so I'm going with yes too.
@TechGirl posted an article last year which states that Valeo supplies the lidar sensors for Mercedes' Drive Pilot, while Luminar's lidar will be included in some future Mercedes models.
Although it doesn't guarantee they are using AKIDA, we know that AKIDA is trusted by Valeo and that AKIDA can make Valeo's sensors smart.
"Just recently, Mercedes-Benz became the first automotive manufacturer in the world to receive an internationally valid system approval for highly automated driving (SAE Level 3) – a milestone in automotive development. The “Drive Pilot” highly automated driving system is to go into series production this year in the S-Class and the EQS. However, the lidar sensors for these models do not come from Luminar. Instead, Valeo supplies these components. Luminar, on the other hand, supplies its technology not only to Mercedes Benz, but also to Daimler Truck AG and Volvo. In addition, Luminar cooperates with Nvidia and Mobileye – so one can assume that their platforms for autonomous driving are equipped with Luminar lidars."
Mercedes teams with Luminar on lidar sensor technology
Mercedes-Benz and lidar expert Luminar Technologies, Inc. have entered into a partnership to develop future technologies for highly automated driving.
www.eenewseurope.com
I'll just go anywhere with Sofia.
"I'll go with popular opinion, so"
Ok, so status quo then, and the instos will continue to lend their shares to shorters... excellent.