@Diogenese please review my homework.
https://image-ppubs.uspto.gov/dirsearch-public/print/downloadPdf/11803741
Does anyone here have any idea(s) on who actually cast those 174 million votes?

Totally agree with your post. What some posters here are failing to realise is that at the last AGM there were ~174 million votes cast against the remuneration report being adopted, resulting in the "Strike 1" outcome. As all here should well and truly appreciate, outvoting or negating that previous ~174 million vote is, imo, going to be a mammoth task both for us shareholders and for the company.
Hi McH,
There was a time under the ebullient Lou "Hockey Sticks" Dinardo when reference was made to "a major North American vehicle maker" which drew a "please explain" from ASX and caused a possible breach of an NDA when we were forced to disclose Ford as an EAP.
Not only that, but it was a blot on our escutcheon, which has the potential to turn around and bite us on the hindquarters when we seek NYSE registration - our first yellow card.
Because the ASX, being gun-shy of tech companies since getting their fingers burnt in the dotcom bubble, is red hot on tech startups. (Don't you just love a metaphor salad?)
The company is well aware of the disgruntlement of the roiling mass of commentators on TSEx in relation to announcements, but they have chosen to be like Caesar's wife in this regard.
Being a bit of a share market ingenue back in 2018 when I first came across BRN, my first purchase was in FOMO mode because I read that PvdM was about to make a presentation to a gaggle of geeks in the US in the following week, and I expected that would launch us to the moon and beyond.* Not a sausage ...
So I think that, as far as management's concern for shareholders goes, they have their eye on the main prize, NYSE registration, and doing anything to pump the ASX price would earn a second yellow card.
*(I still expect that every day now.)
WHAT!!!!??
Nasdaq$$$$$ Listing YES !! Vlad
Hi Pope,
I received an email from LinkedIn today on a Mercedes update. Refer below.
Anyone get good or bad vibes that BRN is not in this current vehicle update, as its AI refers to chip-to-cloud?
NEWSLETTER ON LINKEDIN
Applications and Opportunities for AI @ Mercedes-Benz
You can hardly read an article about technology these days without a mention of artificial intelligence (AI). For example, Mercedes-Benz AG was an early adopter of guiding principles for AI, which include responsible use, privacy, security and reliability. We have been integrating various AI solutions throughout the company for several years now, focusing on continuous improvement.
Our goal is to build the most desirable cars and we believe intelligent software enables the industry-leading features that our customers expect. While we are the architects of our own software, there are times when we choose to partner with industry leaders to gain a competitive advantage.
Here are several ways we use AI to help us create the best cars in the world.
How MB.OS uses AI to enhance our user experience
The chip-to-cloud Mercedes-Benz Operating System (MB.OS) allows us to decouple our software releases from vehicle hardware, which makes our software updates an order of magnitude faster than before. Thanks to integration with the AI-enhanced Mercedes-Benz Intelligent Cloud (MIC), our vehicles will always remain up to date. The ability to roll out software optimisations globally in a matter of hours also significantly increases our vehicles' overall safety and quality.
The MIC uses AI to assist in vehicle development, production and fleet operations, and manages our over-the-air (OTA) software update process. With the consent of our customers, the MIC enables us to monitor and analyse the current software status of any vehicle in the network using predictive failure analysis. This means our system detects potential errors using an intelligent algorithm to ensure software quality from end to end.
For instance, our MBUX Zero Layer update offers a more personalised and intuitive user experience, with the most relevant functions always in view. Media, telephony and navigation are the most important functions for most customers, so they always remain visible. We use AI to automatically offer dozens of additional functions relevant to each customer. These could include comfort programmes, status from connected smart home devices, or a suggestion to call your mother on her birthday.
A preview of the upcoming MB.OS recently launched in the new E-Class demonstrated its ability to create user-defined "routines" that link together a series of the customer's favourite vehicle functions. We are currently developing AI "routines", meaning the system will be able to learn how vehicle occupants use different features so it can automate them for a more personalised experience. The new E‑Class also has the onboard technology to park itself with our INTELLIGENT PARK PILOT feature.
Strong partners for AI-driven applications
While we define and control our own hardware and software architecture, the integration of industry-leading technology provides our customers access to best-in-class services, content and features. Our key partners include NVIDIA which contributes with AI and automated driving systems expertise; Google for our next-generation navigation experience and cloud services, plus Microsoft and OpenAI, as featured in our recent ChatGPT beta programme, which enhanced our "Hey Mercedes" MBUX voice assistant.
NVIDIA
We are currently building an enhanced automated driving system with NVIDIA, which will be fully updatable and makes the most of NVIDIA's SoC. It will be capable of conducting 254 trillion operations per second to process driving-automation data in real time. We are also harnessing NVIDIA's competence in AI for our digital twins, which simulate production runs in software, digitalise manufacturing and detect potential errors earlier. AI enables cost-saving opportunities over conventional programmable logic controllers (PLC) by enormously reducing the required processing time. The successful application of AI-controlled process engineering is an important step in the digitalisation of our vehicle production.
Google
We’re using Google’s open infrastructure, AI, machine learning, cloud, maps and data capabilities to securely innovate and scale production, accelerate our sustainability efforts, analyse fleet data, advance autonomous driving and create an enhanced customer experience.
Mercedes-Benz builds its own navigation experience enhanced by the Google Maps Platform, offering detailed location information, predictive and real-time traffic data and enhanced routing directly embedded into MB.OS. This year, we enabled access to Place Details by Google over-the-air for all MBUX customers, providing detailed information about more than 200 million locations worldwide. Our latest release was in Japan. Our integrated Google Maps data can even be used for assisted-driving features, such as automatic speed adjustments at intersections. Our partnership goes beyond navigation as we recently introduced YouTube in our current vehicle fleet.
Microsoft
To make vehicle production more efficient, resilient and sustainable, we have connected our MO360 Data Platform to the Microsoft Cloud. This digital-production ecosystem helps identify potential supply chain bottlenecks and enables dynamic prioritisation of production resources. Microsoft Azure provides the flexibility and cloud computing power to run AI and analytics at global scale, while addressing cybersecurity and compliance standards at a regional level.
Our software engineers are committed to open collaboration and work with a global community of internal and external developers using Free and Open Source Software (FOSS), including GitHub to improve both the quality of the software and the speed of delivery. They use Azure Data Lake, Databricks and Purview to process and govern huge amounts of data and run AI and analytics and use Azure DevOps for software deployment.
Earlier this year, we integrated ChatGPT through Azure OpenAI Service within our MO360 digital production ecosystem to optimise production processes and evaluate our processes and data in real-time. We leveraged the enterprise-grade capabilities of Microsoft’s cloud using intelligent tools in the MO360 digital-production ecosystem to accelerate error identification, analysis and quality management.
We also launched a ChatGPT beta programme in the U.S. for more than 900,000 vehicles. That complements the existing Hey Mercedes voice control by leveraging a large-language model to massively improve natural language understanding, enhancing our validated data with the more natural dialogue format of ChatGPT. Through Azure OpenAI Service, we’re tapping into large-scale generative AI models, enterprise-grade security, privacy and reliability, while retaining complete control over the IT processes. The collected voice command data is stored in the Mercedes-Benz Intelligent Cloud, where it is anonymised and analysed.
What’s next?
The future of AI presents opportunities in everything from software development, vehicle production and environmental responsibility to enhanced user experiences, including automated driving, navigation, comfort features and voice communications. We are at the forefront of this technology shift and bringing it to your next Mercedes-Benz.
What do you think the next big AI breakthrough will be? Let me know in the comments.
Don't think anyone of note is up for re-election, off the top of my head, at the 2024 AGM.

Does anyone here have any idea(s) on who actually cast those 174 million votes?
Was it retail holders? If so, was it a group of retail holders who were networked on a platform or forum?
Was it an institutional holder (or holders)? What might their motive have been?
Was it a mixture of retail and institutional? The bottom line for me is, and has been, that unless we know who the holders were that voted down the remuneration package, all that can be done is speculate, with no clear understanding of what or who motivated that vote.
Now, on another matter, and I am working on a distant memory here of a previous AGM where there was a large vote against PvdM. I can't remember what it was in relation to - perhaps his (re)nomination to the Board - but I do have a recollection of a large vote against PvdM at a previous AGM. There was talk about it over at HC at the time; my recollection is that after some discussion it remained a mystery.
Zounds! Never was I so bethumped with words since I first called my brother's father Dad.
Any reason why this post didn't get much excitement?

Samsung Unveils 2024 Generative AI Roadmap for Home Appliances
Samsung Electronics is set to introduce its line of home appliances with the integration of generative artificial intelligence (AI) features.
www.koreatechtoday.com
Hold your horses, you don't know what you're talking about. The lowest form of Nasdaq listing requires an SP of $1. To meet this, and to have some reserve in case of a falling SP, most new entries start on Nasdaq with an SP between $3 and $5.
Is this the part that gets you excited about this news release?

Arm Collaborates with Industry Leaders to Build AI Foundations of the Future
Arm is engaged in several new strategic partnerships that will fuel AI innovation and bring AI-based experiences to life.
newsroom.arm.com
Arm Collaborates with Industry Leaders to Build AI Foundations of the Future
By Richard Grisenthwaite, EVP, Chief Architect & Fellow, Arm
The proliferation of artificial intelligence (AI) relies on continuous alignment between hardware and software innovation. It’s this combination which will improve AI capabilities at every technology touchpoint, from the smallest sensor running workloads at the edge to the biggest server handling complex workloads to train large language models (LLMs). As the ecosystem works to realize the true potential of AI, it’s clear that security, sustainability, and data ceilings will all be a challenge. Therefore, it’s vital that we continue to explore industry-wide collaboration, so that we can achieve AI at scale, including more inference at the edge. Arm is engaged in several new strategic partnerships that will fuel AI innovation and bring AI-based experiences to life.
In addition to our own technology platforms where AI development is already happening, Arm is working with leading technology companies, including AMD, Intel, Meta, Microsoft, NVIDIA, and Qualcomm Technologies, Inc. on a range of initiatives focused on enabling advanced AI capabilities for more responsive and more secure user experiences. These partnerships will create the foundational frameworks, technologies, and specifications required for the 15 million and counting Arm developers to deliver next-generation AI experiences across every corner of computing.
Powering AI at the edge
While generative AI and LLMs may be capturing headlines today, Arm has been at the forefront of delivering AI at the edge for years, with 70% of third-party AI applications running on Arm CPUs in the smartphone space. However, as we explore how to deliver AI in a sustainable way, and move data around efficiently, the industry needs to evolve to run AI and machine learning (ML) models at the edge, which is challenging as developers are working with increasingly limited computing resources.
Arm is working with NVIDIA to adapt NVIDIA TAO, a low-code, open-source AI toolkit, for Ethos-U NPUs, which helps create performance-optimised vision AI models for deployment on those processors. NVIDIA TAO provides an easy-to-use interface for building on top of TensorFlow and PyTorch, the leading free, open-source AI and ML frameworks. For developers, this means easy and seamless development and deployment of their models, while also bringing more complex AI workloads to edge devices for enhanced AI-based experiences.
Advancing neural networks across all devices and markets
A vital aspect of the continued growth of AI is advancing the deployment of neural networks at the edge. Arm and Meta are working to bring PyTorch to Arm-based mobile and embedded platforms at the edge with ExecuTorch. ExecuTorch makes it far easier for developers to deploy state-of-the-art neural networks that are needed for advanced AI and ML workloads across mobile and edge devices. Moving forward, the collaboration between Arm and Meta will ensure AI and ML models can be easily developed and deployed with PyTorch and ExecuTorch.
The work with Meta builds on significant investments that we have already made in the Tensor Operator Set Architecture (TOSA), which provides a common framework for AI and ML accelerators and supports a broad range of workloads employed by deep neural networks. TOSA will be the cornerstone of AI and ML for a diverse range of processors and billions of devices that are built on the Arm architecture.
Industry-wide scalable AI
Supporting the wide deployment of data formats is crucial for scaling AI at a relatively low cost. Arm has been working hard to support a variety of emerging small data types focused on AI workloads.
Last year, in a joint collaboration, Arm, Intel, and NVIDIA, published a new 8-bit floating point specification, the ‘FP8’. Since then, the format has gained momentum and the group of companies expanded to AMD, Arm, Google, Intel, Meta, and NVIDIA, who together created the official OCP 8-bit Floating Point Specification (OFP8). In our latest A-profile architecture update, we’ve added OFP8 consistent with this standard to support its rapid adoption in neural networks across the industry. OFP8 is an interchange 8-bit data format that allows the software ecosystem to share neural network models easily, facilitating the continuous advancement of AI computing capabilities across billions of devices.
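For anyone wondering what an 8-bit float actually looks like, here is a minimal Python sketch of decoding an E4M3-style FP8 byte (1 sign bit, 4 exponent bits with bias 7, 3 mantissa bits). This is my own illustration of the general E4M3 layout; the function name and the simplified special-value handling are assumptions, not text from the OFP8 specification.

```python
def decode_e4m3(byte: int) -> float:
    """Decode one E4M3-style FP8 byte: 1 sign, 4 exponent (bias 7), 3 mantissa bits."""
    s = (byte >> 7) & 0x1       # sign bit
    e = (byte >> 3) & 0xF       # 4-bit exponent field
    m = byte & 0x7              # 3-bit mantissa field
    if e == 0xF and m == 0x7:   # all-ones pattern reserved for NaN (no infinities)
        return float("nan")
    if e == 0:                  # subnormal: no implicit leading 1
        val = (m / 8) * 2.0 ** -6
    else:                       # normal: implicit leading 1, exponent bias 7
        val = (1 + m / 8) * 2.0 ** (e - 7)
    return -val if s else val
```

Note how narrow the range is: the largest representable magnitude in this layout is 448, which is why FP8 training relies on per-tensor or per-block scaling.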
Open standards are critical to driving innovation, consistency, and interoperability in the AI ecosystem. In continuing our work to support industry collaboration efforts on these standards we recently joined the Microscaling Formats (MX) Alliance, which includes AMD, Arm, Intel, Meta, Microsoft, NVIDIA, and Qualcomm Technologies, Inc. The MX Alliance recently collaborated on the specification for a new technology known as microscaling, which builds on a foundation of years of design space exploration and research, and is a fine-grained scaling method for narrow-bit (8-bit and sub 8-bit) training and inference of AI applications. This specification standardizes these narrow-bit data formats to remove fragmentation across the industry and enable scalable AI.
In the spirit of collaboration, the MX Alliance released this MX specification in an open, license-free format through the Open Compute Project, which consists of hyperscale data center operators and other industry players in computing infrastructure, to encourage broad industry adoption. This is in recognition of the need to provide equitable access to scalable AI solutions across the ecosystem.
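The core microscaling idea, a block of narrow-bit elements sharing one power-of-two scale, can be sketched roughly as follows. This is a simplified illustration, not the MX specification: the block size, element width, rounding mode, and function names here are all my own assumptions.

```python
import math

def mx_quantize(block, elem_bits=8):
    """Quantize a block of floats to signed ints sharing one power-of-two scale."""
    qmax = 2 ** (elem_bits - 1) - 1          # e.g. 127 for 8-bit signed elements
    amax = max(abs(x) for x in block)
    if amax == 0.0:
        return 1.0, [0] * len(block)
    # smallest power-of-two scale keeping every element within [-qmax, qmax]
    scale = 2.0 ** math.ceil(math.log2(amax / qmax))
    return scale, [round(x / scale) for x in block]

def mx_dequantize(scale, qblock):
    """Recover approximate floats from the shared scale and integer elements."""
    return [q * scale for q in qblock]
```

Storing one shared scale per block instead of per element is what makes the narrow formats cheap: the per-element payload stays at 8 bits or below while the block-level exponent preserves dynamic range.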
Unprecedented AI innovation
Arm is already foundational to AI deployments in the world today and these collaborations are just some of the ways we are providing the technologies needed for developers to create advanced, complex AI workloads. From sensors, smartphones, and software defined vehicles, to servers and supercomputers, the future of AI will be built on Arm.
FYI, the vote against Peter's re-election as a director was 109,789,182 votes, ~27.94% of votes cast.
LOL. Too much time on vibrate drains the battery, you know.

An AI arms race in mobile phones is on its way... have a read of this and tell me we are not perfectly placed for the perfect storm that is approaching. Generative AI in your hand with low power consumption. Just like the phone became our stereo, TV, radio, camera, laptop etc., they will soon be our personal assistant, teacher, fitness coach, personal doctor, therapist, best friend, lover? Lol
https://www.cnet.com/tech/mobile/ai-is-coming-for-your-phone-in-a-big-way/
Billions of devices built on Arm / BRN's Akida IP.
Thanks for your reply, but I was, as stated, voicing my opinion, not stating any fact. I think it salient to remember that the 1st/2nd strike rule was put in place for good reason.

From memory, the strike was a consequence of a minority vote which achieved the minimum number necessary to pass, and it certainly doesn't represent myself or the massive majority... your thoughts basically don't represent the general consensus. Let us not forget this.