BRN Discussion Ongoing

Proga

Regular
Well, I'll be danged if this isn't one of the most comprehensive articles I've read on how Mercedes plans to roll out the new operating system. The article was published in April and I'm telling you there are just SOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO many juicy kernels in here, I don't know where to begin!!!


Building a 10,000-strong team to take on the vehicle operating system: what are Mercedes-Benz's odds?



2022-04-12 23:53:24





Overturning the existing operating system at the level of its underlying logic, Mercedes-Benz's MB.OS will cover four major application domains: infotainment, automated driving, body and comfort, and driving and charging. In the era of the software-defined car, a self-developed operating system not only gives users a smarter and more convenient in-car experience, it also brings the automaker a sustainable revenue stream. Mercedes-Benz, which plans to officially launch MB.OS in 2024, is at the forefront of this shift.

By Huang Huadan, Smart Driving Network

In the tide of "software-defined cars", Mercedes-Benz also wants to take the core software technology into its own hands.

On April 8, Mercedes-Benz opened a new 200-million-euro software center in Sindelfingen to advance its in-house software capabilities, with the goal of bringing its self-developed operating system MB.OS to market by 2024.

At present, most in-vehicle systems are developed on platforms such as Linux, Android and QNX; the MBUX in-vehicle system currently used by Mercedes-Benz is likewise based on Linux.

The goal of MB.OS development is to overturn the existing approach at the level of its underlying logic: it is neither a simple in-car entertainment system nor a standalone intelligent-driving system, but an operating system for the whole vehicle.

The four application domains officially designated by Mercedes-Benz are: infotainment, automated driving, body and comfort, and driving and charging.





The software covers the user interface, application software, middleware, and the underlying operating system, while the hardware is equipped with cables, electronic/electrical hardware, chips, sensors, and so on.

Even for a giant like Mercedes-Benz, it is not easy to completely build such a system.

Mercedes-Benz's investment is aimed at optimizing its software strategy: integrating technology brought in from many suppliers, while keeping the core technology in its own hands.

Magnus Oestberg, Chief Software Officer of Mercedes-Benz, said at a roundtable: "We don't do everything ourselves; we value our partners very much. But of course, the most important parts we do ourselves."

To this end, Mercedes-Benz plans to hire 3,000 new employees worldwide, of which 750 will be recruited at the new software center in Sindelfingen to develop functions in in-car entertainment and autonomous driving.

According to Reuters, Mercedes plans to build a team of 10,000 software engineers around the world, in Berlin, China, India, Israel, Japan and the United States; there are currently about 600 vacancies. Markus Schaefer, Chief Technology Officer of Mercedes-Benz, said: "Software engineers are very sought-after and talent is in short supply."

In a Capgemini survey of 572 automotive industry executives, 97 percent of executives surveyed said that within five years, four out of every 10 employees, from IT architects to cloud management experts to cybersecurity experts, would need software skills.

This also reflects the great impact of software on the future automotive industry.

——01——

From MBUX to MB.OS

MBUX, an abbreviation of "Mercedes-Benz User Experience", is a user-experience-focused infotainment system launched by Mercedes in 2018. The system is currently used in a number of Mercedes-Benz models, including the new S-Class, the EQS pure-electric sedan and the new C-Class. Its Hyperscreen and Zero Layer interface have become iconic technology features of Mercedes-Benz models.



However, under the general trend of software-driven, Mercedes-Benz intends to integrate the concept of digitalization into the whole vehicle, connect the various systems in the car with cloud computing and the Internet of Things, and combine it with the electric drive architecture and software architecture to create a complete ecosystem.

In his vision of the future car, Mercedes-Benz CEO Ola Källenius believes it is crucial to develop a software platform with its own intellectual property: "The brain and central nervous system of future vehicles are the only way for an OEM to maintain its digital sovereignty."

In 2020, Mercedes-Benz announced that it would develop its own vehicle operating system, MB.OS, scheduled to be ready in 2024 and first installed on electric models based on the MMA platform, which launches in 2025.

In its electrification and software strategy, alongside the MB.OS vehicle operating system, Mercedes-Benz has also launched two pure-electric platforms. Among them, the EVA platform serves large electric vehicles, including the EQS and EQE, and the EQS has already gone on sale.





MMA is a platform for compact and medium-sized electric vehicles; models built on it are planned to be available from 2025.

Mercedes-Benz's plan is to debut MB.OS in compact and medium-sized electric vehicles based on the MMA platform, and then to adopt the self-developed operating system across all subsequent models.

As for why MB.OS will launch on MMA-platform models first, Mercedes-Benz's reasoning is that buyers of entry-level models are younger, more receptive to new things, and better placed to suggest improvements to the system.

At the same time, in terms of timing, the MMA platform is positioned for large-scale manufacturing at lower price points. That demands substantial resource investment, along with lessons learned from the EVA platform, so it takes time to realize; Mercedes-Benz plans to launch it in 2025.

Development of the MB.OS vehicle operating system likewise takes time to mature; it is planned for 2024, so the two timelines coincide.

——02——

Collaboration with Nvidia on autonomous driving technology

On June 23, 2020, Mercedes-Benz announced a partnership with Nvidia to use NVIDIA's AI chip "Orin" in its next-generation vehicles.

The Orin chips used by Mercedes-Benz will be the highest-end version, and the company will build a software-defined computing architecture for autonomous driving on the NVIDIA DRIVE platform.

In other words, MB.OS's autonomous-driving capabilities will be built on NVIDIA's self-driving development environment.



At NVIDIA's GTC 2022 conference, held March 21-24, Mercedes-Benz gave a briefing on its MB.OS system.

In the computing world, an operating system manages the computer's resources: CPU, memory, hard-disk and SSD storage, and the various interfaces. With the operating system handling that management, applications running on top of it can use those resources without needing to understand differences in the hardware.

By managing the car's hardware in the same way, Mercedes-Benz's MB.OS provides middleware, applications, and user interfaces across the four domains.
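To make that abstraction idea concrete, here is a toy Python sketch (my own illustration, nothing to do with Mercedes' actual code): applications are written against an abstract device interface, so the same logic runs whichever vendor's hardware is fitted.

```python
from abc import ABC, abstractmethod

class Sensor(ABC):
    """Abstract device interface: applications code against this,
    never against a specific vendor's hardware."""
    @abstractmethod
    def read(self) -> float:
        ...

class VendorARadar(Sensor):
    def read(self) -> float:
        return 12.0  # stand-in for a real driver call

class VendorBRadar(Sensor):
    def read(self) -> float:
        return 12.0  # different vendor, same contract

def distance_warning(sensor: Sensor, threshold_m: float = 50.0) -> bool:
    # Application logic stays identical whichever radar is fitted.
    return sensor.read() < threshold_m

print(distance_warning(VendorARadar()))  # True with either vendor's part
```

Swapping `VendorARadar` for `VendorBRadar` changes nothing in `distance_warning`, which is exactly the decoupling the article describes between MB.OS applications and the underlying hardware.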

In addition, a car generates large amounts of data as it moves; MB.OS will also manage data security.

In autonomous driving, Mercedes-Benz's current S-Class and EQS can already use L3 autonomous driving on German highways when conditions allow, and the company also has L4-level automated parking technology. Working with NVIDIA, Mercedes-Benz will continue to deepen its L2 and L3 capabilities.

——03——

Interactive experience upgrade

From an interaction perspective, MB.OS will also overhaul MBUX.

Mercedes-Benz has partnered with navigation-engine supplier Navus and video and game engine developer Unity to upgrade the navigation system in the new MB.OS with support for 3D navigation.





The Unity engine is best known as a game engine; in Mercedes-Benz's MB.OS it will serve as the 3D engine behind the 3D navigation functions. This gamified experience is set to be a highlight of future Mercedes-Benz models.

On Mercedes's striking "Hyperscreen" curved display, 3D navigation can zoom from a satellite view down to 10 meters. The system also accounts for the time of day, so that users see an accurate, real-time rendering.



Magnus Oestberg, chief software officer of Mercedes-Benz, said: "It's the environment in front of you, very realistic for a navigation system."

In addition, according to Oestberg, the current "Hey Mercedes" voice assistant will be replaced by a 3D "Star Avatar", providing more of a "digital butler" service experience.

At present, Mercedes-Benz is using machine learning to verify the accuracy of the MBUX system and the problems users encounter with it. The next generation of the interactive system will be able to learn each user's habits and tastes, offer playlists to match, and even coach more economical driving. Conversations with "Hey Mercedes" will also be smoother: the system will understand approximate instructions without needing specific trigger words to activate the corresponding functions.

——04——

Software ushers in a new era of profitability

According to the plan, in 2022-2023 Mercedes-Benz will ship a lightweight version of MB.OS on the next-generation E-Class, with the full version of MB.OS launching in 2024.

Before 2024, Mercedes-Benz will keep upgrading the MBUX human-machine interaction system via software packages. After 2024, MB.OS will bring fundamental change: software and services developed by Mercedes-Benz itself can be upgraded directly through it. For the system hardware covering the car's four domains, especially the chipset, Mercedes-Benz will also adopt standardized specifications to make upgrades easier.

"At that time, apart from purchased software, the underlying and middleware layers will be independently developed by Mercedes-Benz," said Mercedes-Benz Chief Operating Officer Markus Schäfer.





In addition, in order to meet the needs of the Chinese domestic market, Mercedes-Benz has also increased its R&D layout in China.

On October 20, 2021, Daimler China Technology and Development Center was officially opened in Beijing.

On March 18 this year, Mercedes-Benz announced the establishment of an R&D center in Shanghai to further expand its R&D footprint in China.

With the rapid development of digitalization, artificial intelligence and related fields, the two R&D centers in Beijing and Shanghai will absorb more local talent, strengthen cooperation with local enterprises and universities, and feed China's innovation back to the rest of the world.

Regarding the MB.OS plan, Källenius said: "We will not hand the whole system over to others, but they will be able to use the digital technology." As planned, while the system remains open, Mercedes-Benz will retain control over the user interface and the data collected from future cars.

In the future, Mercedes-Benz hopes to make money through new recurring revenue streams such as driver assistance and infotainment services.

"I believe that in the future, rather than making money by selling a customer a new car every few years, we will make money through software updates," Källenius said.



This model has been validated by the continued revenue generated by Tesla Autopilot and FSD.

When it comes to self-developed operating systems, the German giants seem to have reached a consensus early. Besides Mercedes-Benz, Volkswagen also plans to launch its own operating system, VW.OS, by 2025.

After all, an operating system developed in-house gives the car company easier access to the vehicle's underlying data. In the era of the Internet of Everything, smart cars will inevitably interact with other smart devices, which makes owning the low-level operating system all the more important. Add the temptation of recurring profits, and for well-resourced car companies a self-developed operating system is indeed a good choice.

But skeptical voices are not in short supply. Today's on-board software is split across multiple operating systems: the cockpit alone contains several, and autonomous driving requires yet another. Can Mercedes-Benz really control the whole car with a single system?


Thanks @Bravo. I thought I had the jigsaw figured out and then you go and flip the bloody table.

According to the plan, in 2022-2023 Mercedes-Benz will ship a lightweight version of MB.OS on the next-generation E-Class, with the full version of MB.OS launching in 2024.

Before 2024, Mercedes-Benz will keep upgrading the MBUX human-machine interaction system via software packages. After 2024, MB.OS will bring fundamental change: software and services developed by Mercedes-Benz itself can be upgraded directly through it. For the system hardware covering the car's four domains, especially the chipset, Mercedes-Benz will also adopt standardized specifications to make upgrades easier.

Sounds to me the Manufacture Date will be 2024 and the Model Date will be 2025.

Mercedes-Benz's plan is to debut MB.OS in compact and medium-sized electric vehicles based on the MMA platform, and then adopt the self-developed operating system in all subsequent models, because the users of the entry-level models are younger, more receptive to new things, and can also propose improvements to the system. Won't be in ICE vehicles by the sound of it. They'll have to make do with the current system. Probably trying to entice buyers over to electric vehicles.
 
  • Like
  • Love
Reactions: 14 users

jtardif999

Regular
Re the whole Unity 3D gaming/visuals side of things, I found some interesting info. The first is an abstract that describes how spiking neural networks can be a powerful tool for applications in image-based rendering, computer graphics, robotics, photo interpretation, image retrieval, video analysis, etc.

The other is from a blog on Preimage website. I don't know who Preimage are but I thought the description of neural rendering processing was really interesting, especially where it talks about combining classical computer graphics with the recent advances in AI and getting the AI model to “learn” and predict how the image would have looked like if it was captured from a different angle. It also discusses other real-world applications and includes some videos as well.

Whilst this blog doesn't refer specifically to spiking neural networks, the previous abstract demonstrates how they would be of benefit to 3D image-based rendering IMO.
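For anyone unfamiliar with rank-order coding, here's a toy Python sketch of the idea (my own illustration, not taken from either the abstract or the blog):

```python
def rank_code(intensities):
    """Rank-order coding: represent an input by the ORDER in which
    neurons fire (strongest input spikes first), not by firing rates.
    One spike per neuron carries the information, which is what makes
    the scheme so cheap on event-based hardware."""
    return sorted(range(len(intensities)), key=lambda i: -intensities[i])

print(rank_code([0.2, 0.9, 0.5]))  # [1, 2, 0]: neuron 1 fires first
```

A downstream SNN layer can then weight early spikes more heavily than late ones, which is how rank order alone can carry enough information for tasks like image recognition and, per the abstract, image-based rendering.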







View attachment 14185








Creating Immersive & Large-scale Content using Neural Rendering​




March 9, 2022



Generating photo-realistic images and videos is a challenge at the heart of computer graphics. Over the last couple of decades, we have come a long way from the pixelated graphics of Doom (1993) to high-quality renders like those of Red Dead Redemption 2 below. If you have played some of these modern games, you must have wondered how a beautiful and photo-realistic scene like the one below is rendered on your screen in real-time.

The game environment is a 3D model which is made up of millions of triangles/quadrilaterals that determine the shape, color, and appearance of the objects in that environment. Below is an example of how the main character in the above game might have been modeled:

When you play the game, the scene is “rendered” on your computer screen similar to how a photograph is formed in a digital camera. Game assets (i.e., 3D models) are projected on a virtual camera through a complicated process that approximates the physics of the image formation. All this computation happens in real-time on the GPU of your system as you keep moving through the game environment.
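The projection step described above can be sketched in a few lines of Python (a toy pinhole-camera model of my own, not the blog's actual renderer):

```python
def project(point, f=800.0, cx=320.0, cy=240.0):
    """Pinhole projection of a camera-frame 3D point to pixel coords.
    f: focal length in pixels; (cx, cy): principal point.
    This is the per-vertex step a renderer performs when it maps
    game geometry onto the virtual camera's image plane."""
    x, y, z = point
    return (f * x / z + cx, f * y / z + cy)  # perspective divide

# A triangle sitting 5 units in front of the camera:
triangle = [(0.0, 0.0, 5.0), (1.0, 0.0, 5.0), (0.0, 1.0, 5.0)]
print([project(p) for p in triangle])
# -> [(320.0, 240.0), (480.0, 240.0), (320.0, 400.0)]
```

A real renderer does this for millions of vertices per frame on the GPU, plus rasterization, texturing and lighting, but the geometry of "photograph formation" is just this perspective divide.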
When designing the game environment, these game assets, ranging from small objects like bottles to large-scale scenes like an entire city, are created in 3D modeling software like Blender. In order to simulate realism, these assets have to be of high quality and need to possess intricate details like dents on bottles or rust on pipes. Not surprisingly, creating such environments requires sizable collaboration from human writers, artists, and developers working together using a variety of software tools.
But what if you can generate a synthetic 3D gaming environment using a neural network that has been trained on just a few images? This idea is extremely powerful in the creation of immersive 3D content that not only has high-quality details but also dimensional accuracy. We at Preimage think that its implications on inspections, gaming, films or development of AR/VR applications are immense. We will explore how Neural Rendering works and its use-cases below.

So What is Neural Rendering?

The main idea behind neural rendering is to combine classical computer graphics with recent advances in AI. The process involves training a neural network on images or video of a scene and getting the AI model to "learn" to predict how the image would look if it were captured from a different angle. That means that by capturing just a few photos of a scene, you can render an image from any position and orientation.
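One common ingredient in such models (NeRF-style methods; the blog doesn't name a specific architecture, so this is my addition) is a positional encoding that lifts raw coordinates into sin/cos features at doubling frequencies, so a small network can represent fine scene detail:

```python
import math

def positional_encoding(x, n_freqs=4):
    """NeRF-style positional encoding: map a scalar coordinate to
    sin/cos features at doubling frequencies. Without this lift,
    small MLPs tend to learn only blurry, low-frequency scenes."""
    feats = []
    for i in range(n_freqs):
        freq = (2.0 ** i) * math.pi
        feats += [math.sin(freq * x), math.cos(freq * x)]
    return feats

print(positional_encoding(0.0))  # [0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
```

In a full pipeline, each 3D point and viewing direction is encoded this way and fed to an MLP that outputs color and density, which are then composited along camera rays.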


Once the AI model has “learnt” the scene it also facilitates other applications like changing the lighting, object shapes, and even modeling how the scene changes through time (allowing for a sort of time travel through the scene).

Real-World Applications

Industrial Inspections

Asset-heavy companies especially in industries like infrastructure, oil & gas, telecom, or mining carry out periodic inspection of their operational sites for continuous assessment and maintenance. It helps them catch problems early and reduce the chances of an accident, malfunction, or breakdown.
Conventionally, such industrial inspections are carried out in two ways:
  1. By creating 3D models using photos. Photos of the assets (telecom tower, refineries etc.) are captured using drones or DSLR and are then converted into 3D models using a 3D reconstruction software. These 3D models are dimensionally accurate with respect to the real-world and hence allow precise measurements and annotations. However, they fall short in terms of capturing fine details and texture-less regions like plain white walls. Moreover, generating an extremely high-resolution 3D model from photos is expensive and time-consuming with current software tools.
  2. Using just images and videos. Images and videos capture high-resolution details of scenes including texture-less regions. However, they have two downsides: (a) making measurements is extremely tricky with images (e.g., measuring the angle at which the telecom tower is pointing with respect to ground), and (b) an object annotated in image 1 is not automatically annotated in image 2. These drawbacks make the process of inspection extremely manual and intensive.
Neural rendering offers the perfect hybrid of the above two approaches. It allows one to view the scene from various angles with high-resolution details like images, and since it’s a 3D representation, it facilitates accurate measurements along with consistent annotations of objects across views.

Photo Tourism, Virtual Flythroughs & Education
Since Neural Rendering allows you to view a scene from any position or orientation, it has the potential to allow users to virtually fly/drive/walk through it as in an interactive video game. This provides a much more immersive experience to the user than what can be possible with recorded videos or images.
Consequently, this opens opportunities in many areas, including photo tourism. Imagine being able to walk through the galleries of Angkor Wat with photo-realistic views of the inscriptions and carvings. The same concept can also be used in real estate, hospitality, and event management industries for virtual demos of properties, which can be such a convenience for end-users.

The use-cases for immersive educational walkthroughs in museums and in education curriculums are also endless.

Relocalization

Visual relocalization is the process of estimating where a photo was captured in a 3D scene. With Neural Rendering, as discussed above, one can estimate how an image would look if viewed from a particular angle. The process can also be flipped, i.e., given an image, the AI model can predict the position from which that image was taken.
This has several applications including visual answering of questions in Augmented Reality (for e.g., “Show me some good cafes around”), investigative journalism (determining where a photograph or video was taken), and even for geotagging unstructured image collections on the internet.
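The "flipped" direction is often done by analysis-by-synthesis: search for the camera pose whose rendered view best matches the photo. A deliberately tiny Python sketch (my own one-pixel, one-angle toy, standing in for a real renderer and a real 6-DoF pose):

```python
import math

def render(angle):
    """Toy 'scene': brightness of a one-pixel view as a function of
    camera angle. Stands in for a trained neural renderer."""
    return math.cos(angle)

def relocalize(observed, candidates):
    """Analysis-by-synthesis: pick the camera angle whose rendered
    view best matches the observed pixel (least squared error)."""
    return min(candidates, key=lambda a: (render(a) - observed) ** 2)

observation = render(0.7)             # photo taken at an unknown angle
grid = [i / 100 for i in range(315)]  # candidate angles in [0, pi)
print(relocalize(observation, grid))  # -> 0.7
</antml_bad_fence>```

Real systems replace the grid search with gradient-based optimization over full camera poses, but the principle — invert the renderer by matching its output to the photo — is the same.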

Challenges: Scale and Editing

Neural Rendering is a rapidly developing field, and even with promising signs of what it can accomplish, there still are major hurdles yet to be overcome.
Scalability is one of the problems that a lot of such AI models struggle with. An AI model that reconstructs small scenes (e.g., telecom tower) really well, might struggle to represent medium-to-large size areas (city scale). Many solutions have been proposed to counter this problem, ranging from hierarchical space partitioning to multiresolution hash input encoding.
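The "multiresolution hash input encoding" mentioned above (popularized by Instant-NGP) can be sketched minimally in Python; note this toy returns table indices, whereas a real model stores a small learned feature vector at each slot:

```python
def hash_encode(x, resolutions=(16, 64, 256), table_size=1024):
    """Multiresolution hash encoding sketch: a coordinate in [0, 1)
    indexes a small table at several grid resolutions at once, so one
    compact model captures both coarse layout and fine detail."""
    indices = []
    for res in resolutions:
        cell = int(x * res)  # which grid cell at this resolution level
        indices.append((cell * 2654435761) % table_size)  # hash -> slot
    return indices

print(hash_encode(0.5))  # one table index per resolution level
```

Because the table size stays fixed while resolutions grow, memory no longer scales with scene size; hash collisions are tolerated and resolved implicitly during training.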
Another challenge faced is interpreting the weights learned by the AI model into formats that are comprehensible for humans. Editing scenes produced by Neural Rendering is not fully understood yet, and thus makes editing one part of the scene without affecting the quality of the rest, a tough challenge.

Needs Active Discussion on Fake Content and Privacy

As the lines blur between what is real and what is synthetic, questions about fake content and privacy obviously emerge. It is already becoming hard for humans to differentiate between real and synthetically generated faces, which has led to the whole debate around deepfake technology. Obama "totally" gave an introduction at MIT's Intro to Deep Learning; we believe him, it's a video!

With neural scene editing tech getting better, similar techniques could be applied to generate realistic-looking edited environments. Methods to overcome these problems include standards to explicitly require synthetic content to be labelled as such.
Another major concern is privacy preservation, for example automatic blurring of human faces, number plates, or other personal identifiable information from neural renders. Such issues need to be talked about more and should be satisfactorily solved before this technology goes mainstream. This is important for building trustworthy AI systems.

The abstract seems to be discussing the merits of rank coding and STDP.. now which technology do we know of utilises this advantage? 😎
 
  • Like
  • Love
Reactions: 15 users
D

Deleted member 118

Guest
Or there is always maybe a major shareholder cashing in some shares, which seems visible
 
  • Like
Reactions: 2 users

robsmark

Regular
the share price manipulation of BRN is getting worse every week.
That's what happens when a company adds zero value from a shareholder's perspective in eight months.

I’ve spoken to Tony about this recently and I’m sure there’s plenty going on behind the scenes (just to clarify Tony told me nothing he shouldn’t have - an astute professional), but at face value - it’s average.

They need to get some positive commercial uptake news out into the market now, or this type of fuckery will continue.
 
  • Like
  • Fire
  • Love
Reactions: 38 users

alwaysgreen

Top 20
That's what happens when a company adds zero value from a shareholder's perspective in eight months.

I’ve spoken to Tony about this recently and I’m sure there’s plenty going on behind the scenes (just to clarify Tony told me nothing he shouldn’t have - an astute professional), but at face value - it’s average.

They need to get some positive commercial uptake news out into the market now, or this type of fuckery will continue.
What was his response?
 
  • Like
Reactions: 3 users
D

Deleted member 118

Guest
That's what happens when a company adds zero value from a shareholder's perspective in eight months.

I’ve spoken to Tony about this recently and I’m sure there’s plenty going on behind the scenes (just to clarify Tony told me nothing he shouldn’t have - an astute professional), but at face value - it’s average.

They need to get some positive commercial uptake news out into the market now, or this type of fuckery will continue.
I think a major holder is selling some shares
 
  • Like
  • Sad
Reactions: 2 users

Makeme 2020

Regular
That's what happens when a company adds zero value from a shareholder's perspective in eight months.

I’ve spoken to Tony about this recently and I’m sure there’s plenty going on behind the scenes (just to clarify Tony told me nothing he shouldn’t have - an astute professional), but at face value - it’s average.

They need to get some positive commercial uptake news out into the market now, or this type of fuckery will continue.
Question for all.
Tony works for BRN and I'm sure does a professional job, but as a shareholder would he have inside information on the company? How does that work..????
 
  • Like
Reactions: 1 users

robsmark

Regular
Question for all.
Tony works for the BRN and I'm sure does a professional job, But he as a Shareholder would he have Inside information on the company, How does that work..????
Haha - see what I just posted!
 
  • Like
Reactions: 3 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
The abstract seems to be discussing the merits of rank coding and STDP.. now which technology do we know of utilises this advantage? 😎


Gimme a S
Gimme a T
Gimme a D

Don't jolly well forget to gimme a P!




Here's something dear @FactFinder authored, which is most appropriately the very first post!

#1

Type "STDP" into the search function and read all about Synaptic Time Dependent Plasticity.
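For the uninitiated, the pair-based STDP rule fits in a few lines of Python (a textbook sketch of my own, not Akida's implementation):

```python
import math

def stdp_dw(dt_ms, a_plus=0.1, a_minus=0.12, tau_ms=20.0):
    """Pair-based STDP weight update. dt_ms = t_post - t_pre.
    Pre fires just before post (dt > 0) -> potentiation (strengthen).
    Post fires before pre (dt <= 0)     -> depression (weaken).
    The exponential makes near-coincident spikes matter most."""
    if dt_ms > 0:
        return a_plus * math.exp(-dt_ms / tau_ms)
    return -a_minus * math.exp(dt_ms / tau_ms)

print(stdp_dw(5.0) > 0, stdp_dw(-5.0) < 0)  # True True
```

That "fire together, wire together" asymmetry is what lets a spiking network learn on-device from its own spike timing, with no backprop pass.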




PS: Bring back the Fact!
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 24 users

robsmark

Regular
I think a major holder is selling some shares
I dont think so - they wouldn’t have accumulated just to dump. They would have a price target and looking at our SP recently I’m guessing that hasn’t been hit yet.
 
  • Like
Reactions: 9 users

Makeme 2020

Regular
Haha - see what I just posted!
No I didn't see your last post buddy, I wasn't having a go at you, my post came after yours. Chill, I'm on your side.................
 
  • Like
Reactions: 6 users
D

Deleted member 118

Guest
I dont think so - they wouldn’t have accumulated just to dump. They would have a price target and looking at our SP recently I’m guessing that hasn’t been hit yet.
Someone is selling big time, so it will be interesting to see the top holders now
 
  • Like
  • Sad
Reactions: 3 users

alwaysgreen

Top 20
I sent him an email and to my surprise he called me.

We spent a good half an hour discussing various things, from a Nasdaq listing (he shares my perspective here, that is, that there's lots more value to pull from the ASX before listing in the US) to existing customers and timeframes. He told me that he isn't privy to any information inside of an NDA - which I hadn't considered, but certainly makes sense. He mentioned how the sales team was actively working and very busy - I asked if it was active or reactive sales, and he said it was a mixture of both. He said that a couple of companies have dropped out of the EAP (as mentioned by the company previously) but this was due to them wanting to redesign their product because of the increased capability of Akida. He said that the company had expected to have more customers signed by now, but they were taking their time. He mentioned that many companies were still trying to understand the SNN processing method, but understood that it was the next step.

The conversation was based around how busy they are in Australia, and how busy the international offices are too.

It was positive, and he sounded engaged.

He also mentioned that he reads the posts here, so hi Tony - I hope it’s okay referencing our conversation!
Thanks mate.

Disappointing that it sounds like signing customers is proving to be a little difficult.

You are right in that our share price is in the hands of manipulators until we increase our sales revenue. It will take nerves of steel over the next 12-18 months, particularly if revenue expectations are not met. We have a 2-billion-dollar market cap and at some point we need to justify such a valuation. Hopefully the next 18 months show that we are worthy of a much higher MC!
 
  • Like
  • Love
Reactions: 15 users

robsmark

Regular
No i didn't see your last post buddy, i wasn't having a go at you my post came after yours chill I'm on your side.................
No mate, I wasn’t having a crack - I just found it funny that you posted that at the same time that I posted.
 
  • Like
Reactions: 12 users

alwaysgreen

Top 20
Someone is selling big time so it be interesting to see the top holders now

Selling big time?
Total traded on the ASX today was only 8 million-odd shares. We have 1.7 billion-ish shares on our register.

We've had 80-100 million share days in the past. I don't think any of the top holders are having a major sell-off event.

 
  • Like
  • Love
  • Fire
Reactions: 42 users
D

Deleted member 118

Guest
Selling big time?
Total traded on the ASX today was only 8 million odd shares. We have 1.7 billion-ish shares on our register.

We've had 80-100 million share days in the past.

View attachment 14205
But it's been continuous for quite a while now, and it soon adds up
 
  • Like
Reactions: 3 users

robsmark

Regular
  • Like
  • Love
Reactions: 12 users
D

Deleted member 118

Guest
Volume suggests otherwise Rocket…
View attachment 14206
But it's been going on for quite a while and I couldn't work out if it was someone accumulating or shorting; now I think it's someone selling shares instead.
 
  • Thinking
Reactions: 1 users

Pmel

Regular
That’s what happens a company adds zero value from a shareholders perspective in eight months.

I’ve spoken to Tony about this recently and I’m sure there’s plenty going on behind the scenes (just to clarify Tony told me nothing he shouldn’t have - an astute professional), but at face value - it’s average.

They need to get some positive commercial uptake news out into the market now, or this type of fuckery will continue.
100% agree with you. When there is no news for a while the SP will drop. I read your other post about having a chat with Tony. What is your opinion after the chat? Was it positive? I know he won't have mentioned anything he shouldn't, but how did you feel, especially when some of us are frustrated with no news coming out of the company? Thanks in advance.
 
  • Like
Reactions: 8 users

GazDix

Regular
But it’s been continuous for quite a while now and soon adds up
With the small orders we have seen, it is certainly accumulation happening (institutional bots) and not a sell-off. Usually the accumulator plays pong with itself to push the SP down and then, zip, buys more at a lower price as retail are scared out (or exhausted). This trading pattern will continue unless there is an announcement; I saw this a lot when we were trading around the 30-cent mark a few years ago.

Good on Robsmark for reaching out. All sounds positive behind the scenes.
 
  • Like
  • Love
Reactions: 25 users
Top Bottom