BRN Discussion Ongoing

Bravo

If ARM was an arm, BRN would be its biceps💪!
This is interesting. This extract is from an article published 9 hours ago and provides some insight into the progress of Qualcomm's new "Oryon" CPUs featuring Nuvia technology. Apparently they're being sampled by OEMs and exceeding expectations. The processors are set to come out in 2024, pending the outcome of a lawsuit that Arm has filed against Qualcomm.

EXTRACT
Screen Shot 2023-02-07 at 11.22.08 am.png


It has been stated previously that Qualcomm's Oryon would employ a custom CPU architecture based on the Arm instruction set, rather than an off-the-shelf Arm core design. Remember, yonks ago Rob Telson "liked" Leendert Van Doorn's post in which Leendert talked about Arm's CPU technology being "years ahead of the competition" (see below).

It'll be interesting to see how this plays out, seeing as Arm has filed a lawsuit against Qualcomm for breach of certain license agreements with Arm and for trademark infringement. Arm is seeking specific performance of the contractual obligation to destroy certain Nuvia designs, an injunction against trademark infringement, as well as fair compensation for the trademark infringement.


Screen Shot 2023-02-07 at 11.39.44 am.png
 
Reactions: Like, Fire, Thinking · 42 users
Unsure if it's been mentioned yet, but on hotcrapper someone has posted an email from Tony Dawe saying that Akida 2000 will also be released this year.
 
Screenshot_20230207-120105.png

ROB GAVE A LIKE.
 

Attachments

  • imageonline-co-emojiadded.png
Reactions: Like, Haha, Fire · 9 users

Dhm

Regular
This is a letter from TD published by @YngInvstr in response to his/her question on the crapper.

Screen Shot 2023-02-07 at 12.37.38 pm.png
 
Reactions: Like, Love, Fire · 122 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Maybe some future competition on the horizon? @Diogenese might be able to explain how analog neuromorphic computing compares? It says here that the "project, which started last month, aims to develop an analog neuromorphic computing approach based on oscillatory neural networks (ONNs) that seamlessly interfaces with sensors and processes analog data without any analog-to-digital conversion."

I remember that @TECH mentioned Simon Thorpe was working on a project with ONNs, but I don't know if it's this particular one.


Oscillating neural networks for low power analog edge AI

Technology News | February 6, 2023
By Nick Flaherty

A European project including IBM and BMW is developing a new type of analog neural network with phase change materials to reduce the power consumption of machine learning at the edge of the network.


The new technique couples oscillating neural networks (ONNs) with phase change materials and could reduce power consumption by a factor of 100 to 1,000, say the researchers in the PHASTRAC (Phase Change Materials for Energy Efficient Edge Computing) project.

The project, which started last month, aims to develop an analog neuromorphic computing approach based on oscillatory neural networks (ONNs) that seamlessly interfaces with sensors and processes analog data without any analog-to-digital conversion.

The oscillating neurons will be implemented with vanadium dioxide (VO2) phase change material, coupled with synapses implemented as bilayer resistive RAM (RRAM) memories using molybdenum and hafnium dioxide (Mo/HfO2).


The project aims to develop new devices for implementing the ONN architecture and processing the analog sensor data. It is led by researchers from the Eindhoven University of Technology, who held a session on ONN technology at the European Conference on High-performance Embedded Architecture and Compilation (HiPEAC) in Toulouse last month. They are working with researchers from BMW, IBM Research in Zurich and the Pázmány Péter Catholic University in Budapest.
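
For anyone wondering what "computing with oscillators" actually means in practice, here is a rough sketch of the Kuramoto-style phase model that the ONN literature generally uses: data is encoded in the relative phases of coupled oscillators, and the coupling weights play the role of synapses. To be clear, this is just my illustration of the general idea, not the PHASTRAC VO2/RRAM implementation.

```python
import numpy as np

# Sketch of an oscillatory neural network (ONN) acting as an associative
# memory, using the common Kuramoto-style phase model. Illustration of the
# general idea only, not the PHASTRAC VO2/RRAM hardware.

def run_onn(weights, phases, dt=0.05, steps=2000):
    """Integrate d(phi_i)/dt = sum_j W_ij * sin(phi_j - phi_i)."""
    for _ in range(steps):
        diff = phases[None, :] - phases[:, None]            # phi_j - phi_i
        phases = phases + dt * (weights * np.sin(diff)).sum(axis=1)
    return phases

# Hebbian weights storing one binary pattern: in-phase = +1, anti-phase = -1.
pattern = np.array([1, 1, -1, -1, 1])
W = np.outer(pattern, pattern) / len(pattern)
np.fill_diagonal(W, 0.0)

rng = np.random.default_rng(0)
noisy = np.pi * (pattern < 0) + 0.4 * rng.standard_normal(5)  # corrupted input
settled = run_onn(W, noisy)
readout = np.where(np.cos(settled - settled[0]) > 0, 1, -1)   # phase relative to oscillator 0
print(readout)  # settles back to the stored pattern
```

The attraction for edge AI is that the physics does the computing: in hardware the oscillators simply settle into a phase pattern, with no analog-to-digital conversion and no multiply-accumulate arrays burning power.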

 
Reactions: Like, Fire · 9 users

GazDix

Regular
I agree. Finally having neuromorphic technology in a product may very well prove to be the inflection point that changes people’s attitudes from “this is a future technology” to “this is the latest technology.”

Just look at what's happened to WBT since it was revealed that TSMC put ReRAM in the iPhone 14. Perception is important.
Good point.

Another one: ChatGPT reached 1 million users in under a week.
After playing with it and seeing its adoption (I couldn't get onto it this morning because it was at full capacity), we shouldn't underestimate how good this is for AI's perception.
In the crypto market, AI-related coins like FET, AGIX and Render have boomed, led by ChatGPT. Google are now apparently going to release a competitor to it as well.
For Brainchip this is great for the industry we are in. Now everyday ordinary people use AI, and it is no longer a scary 'Terminator'-like idea. For a company creating 'essential' and 'ubiquitous' AI, this will ease customer adoption for the partners that use our IP as we grow, rather than having to break onto the AI scene.

1675734742761.png


Our 'near environment' has improved greatly already in 2023, IMO. How popular will it be? If ChatGPT can explode to 1 million users in a week, and our EAP/NDA partners have seen this, what will happen with our products' adoption? I imagine Chris Stevens has been quite busy since ChatGPT broke onto the scene.
 
Reactions: Like, Fire, Thinking · 18 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
The title of this article says it all.🥳




Tesla's FSD Can Be Outdriven By Mercedes

Feb. 06, 2023 10:04 AM ET | Mercedes-Benz Group AG (MBGAF, MBGYY), BAMXF, BMWYY, BYMOF, F, GOOG, GOOGL, HYMLF, HYMTF, NVDA, TM, TOYOF, TSLA, VLVLY, VOLAF, VOLVF

Summary

  • Tesla, Inc.'s Full Self-Driving capability is not really self-driving, nor is it unique.
  • Mercedes-Benz Group AG's Drive Pilot is much more advanced than Tesla's version.
  • Mercedes' valuation is attractive given its technological advancements and financials.
  • The main risk to Mercedes comes from the continued bear market.

Mercedes-Benz logo on a classic car. Mercedes-Benz is a German car manufacturer. The brand is used for luxury cars, buses, coaches and trucks.


RomanNerud/iStock Editorial via Getty Images

As my subscribers know, over the last few months I have analyzed electric vehicles ("EVs") extensively to find opportunities that can make money for me and my subscribers. So far, it has gone well. We managed to short a couple of names and take juicy profits. The more analysis I do, the clearer it becomes to me: the race is not about electric vehicles, it is about self-driving technology. Tesla, Inc. (TSLA) is considered the king of the industry because its Artificial Intelligence ("AI") and Full Self-Driving capability ("FSD") are perceived as the best. If and when Tesla manages to sell FSD as a service to other carmakers, its revenue and gross margins will look completely different.
And it's not just Tesla that considers Full Self-Driving a gold mine. Among the many companies working on autopilot are BMW (OTCPK:BAMXF), Volvo (OTCPK:VOLAF), Google (GOOG), and even Toyota (TM), which had been skeptical about autopilot back in 2015. Audi recently made aggressive moves into self-driving, and the Audi A8, A6, and Q8 support an autopilot function. In 2019, Hyundai (OTCPK:HYMLF) invested in the self-driving startup Aurora, but that technology hasn't made its way to a public release yet.
One of Tesla's main rivals in the self-driving field is Mercedes-Benz Group AG (OTCPK:MBGAF, OTCPK:MBGYY). It has achieved level 3 driving autonomy, while Tesla has reached only level 2. We take a detailed look at Mercedes and believe that the market underappreciates its story.

Level 5, level 3, level 2: what does it all mean?

To understand what Tesla’s and Mercedes' autopilots represent, it’s helpful first to understand the National Highway Traffic Safety Administration’s admittedly wonky, five-level classification system for vehicle automation.
The early levels are familiar in our lives. They help us park, warn us of obstacles, and play the role of an executive officer while we are still captains of the ship.
Levels 2 and 3 are a big leap in technology development. So far, you're still the driver and responsible for what is happening on the road, but most of the control can be entrusted to the car. At level 2, a vehicle can drive itself, but you need to be on constant alert and have both hands on the wheel. The car can still be confused by a traffic situation and hand you control at any moment. At level 3, a car drives itself, and you can relax a bit. You can even take your hands off the wheel in some areas. But the most important thing is that the car will alert you 10 seconds before you need to take over control, so you won't have a panic attack when you suddenly need to gain control of the vehicle. Of course, you still aren't allowed to use your phone or laptop while driving, but you can use all the media sources in the car. For example, you can watch movies on the screen embedded in the car. Why only there? Because you shouldn't miss the alert when the car can't cope with the situation without you. Yes, you're still the captain, even though the sailors are very smart.
If this sounds like science fiction, then just wait, because we've come to the most exciting development. Let me introduce you to levels 4 and 5. From this point on, the car is fully responsible for your comfort, safety, and transportation from point A to point B. Now the car, not you, is the key figure on the road. It adapts to weather conditions, controls the situation on the road, and doesn't need a driver at all.
Unfortunately, levels 4 and 5 are still in development, so today, autopilot really means "assisted driving" and not "self-driving" since the driver still has to be alert and attentive at all times. It won’t be until level 4 or 5 fully autonomous cars hit the roads that the promise of Full Self-Driving will be a reality.
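
To put the taxonomy in programmer's terms, here is a rough sketch of the levels as described above. It is only my own summary for illustration, not any carmaker's or regulator's actual software.

```python
from enum import IntEnum

# Rough summary of the five-level taxonomy described above.
# Illustration only, not any manufacturer's or regulator's software.
class AutonomyLevel(IntEnum):
    DRIVER_ASSIST = 1   # parking help, obstacle warnings
    PARTIAL = 2         # car steers and brakes; hands on wheel, instant takeover
    CONDITIONAL = 3     # hands off in approved zones; ~10 s warning before handover
    HIGH = 4            # no driver needed within a defined operating domain
    FULL = 5            # no driver needed anywhere

def driver_must_supervise(level: AutonomyLevel) -> bool:
    """Up to level 2 the human watches every moment; from level 3 the car
    commits to warning the driver before handing back control."""
    return level <= AutonomyLevel.PARTIAL

print(driver_must_supervise(AutonomyLevel.CONDITIONAL))  # False: watch the movie, await the alert
```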

Ok, but how do car makers achieve a really self-driving car?

There are three types of sensing devices: RADAR/Ultrasonics, LIDAR, and camera. All approaches have their own advantages and disadvantages.
RADAR sensors use radio waves to measure distances and make ranging calculations. It's like a cat that can see a moving target, even at night. But a problem appears when a person stands on the road: they become barely visible to RADAR. This means it's not very good at detecting pedestrians and stationary humans, which is vital on the roads. It's a very good sensor for increasing safety, but it's not ideal as a primary sensor.
An acronym of “Light Detection and Ranging,” a LiDAR sensor points a laser at objects in its field of view, measuring distance by the time it takes for the reflected light to return to the sensor. LiDAR can't detect lanes, but it detects humans reasonably well, though at a much higher cost.
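
Both radar and LiDAR turn an echo delay into a distance with the same round-trip time-of-flight arithmetic. A minimal sketch, with illustrative numbers rather than any vendor's firmware:

```python
# Round-trip time-of-flight ranging, common to radar and LiDAR.
# Illustrative sketch only; real sensors add filtering and calibration.
SPEED_OF_LIGHT = 299_792_458.0  # m/s, for radio waves and laser pulses alike

def range_from_echo(round_trip_s: float) -> float:
    """The pulse travels out and back, so distance = c * t / 2."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

# An echo arriving 667 nanoseconds after the pulse was emitted:
print(f"{range_from_echo(667e-9):.1f} m")  # roughly 100.0 m
```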
The last approach is the camera system. This is the primary sensor (in conjunction with a front-facing radar) used in Tesla vehicles. A camera system is a wide-angle camera mounted on the front or in a surround configuration on the car. Unlike RADAR or LIDAR, camera sensing equipment is only as good as the software processing the inputs, a primary component of which is a Deep Neural Network. I know that sounds scary, but we'll describe it simply. I promise.
Deep Neural Networks, or DNNs for short, were first conceived as a way to simulate the human and animal nervous system, in which a neuron (a nerve cell) fires for an object to be 'recognized.' A brain's reaction is fast enough, but our main problem is that our body can't always react quickly to danger. That is, we may see or feel it, but we don't have time to do anything about it. Now imagine a perfect car that can act as quickly as it recognizes the object.
The first DNN was created by Google. It really worked, except for one thing: the energy and resources spent on it were unimaginable. The role of the knight in golden armor was played by Nvidia Corporation (NVDA), which reduced costs by 150 times and became the market leader.
But how exactly does this technology function? Well, the human brain recognizes objects through their edges. It doesn't see pixels; it sees edges. That's why we are all afraid of the dark in childhood. In a dark, dark room, we can only see a dark, dark figure, and our brain doesn't need anything else. This is enough for us to remember all the horror stories told around the campfire and how the dark, dark figure turned into a monster (although it was likely just a chair). In a word, our brain saw the edge, and then everything went like clockwork.
So a DNN tries to recreate how a human brain functions by programming it only to recognize edges. A ton of code is added, and then begins the "machine learning" period, in which the DNN is fed material: images, videos, or data in any other form. It all ends up in a rather funny situation: several hundred highly qualified engineers and programmers somewhere in Silicon Valley circling ovals and squares so that what seems like science fiction to us now will one day become our reality.
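
Here is a toy sketch of that "edges first" idea: the first layer of a vision DNN ends up learning filters that behave much like this hand-written Sobel kernel. It illustrates the principle only; it is not Tesla's or Google's actual network.

```python
import numpy as np

# Toy sketch of the "recognize edges first" idea. The learned first-layer
# filters of a vision DNN behave much like this hand-written Sobel kernel.
SOBEL_X = np.array([[-1., 0., 1.],
                    [-2., 0., 2.],
                    [-1., 0., 1.]])

def filter2d(image, kernel):
    """Valid-mode 2D cross-correlation: large output where intensity jumps."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (image[i:i + kh, j:j + kw] * kernel).sum()
    return out

# A dark, dark room with a bright region on the right (the "figure"):
img = np.zeros((6, 6))
img[:, 3:] = 1.0
print(np.abs(filter2d(img, SOBEL_X)))  # responds only along the boundary
```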
One by one, the virtual neurons are created, each learning, without explicit programming, to recognize a specific edge. When enough time has passed, the network can distinguish whatever the DNN was told to look out for. The "intelligence" of this technology depends on its processing power and the time spent "learning." Admittedly, it's kind of depressing when recognition takes three days and a billion operations (as Google's did at first), and it's an entirely different matter when everything happens in a fraction of a second. But now that we have learned all the subtleties of this technical miracle, let's move on.

Enter Mercedes-Benz

Mercedes-Benz unveiled Drive Pilot, the world’s first level 3 driver assistance system approved by the German government, powered by Nvidia. Drive Pilot uses an extensive suite of radar, LIDAR, ultrasonic, and video camera sensors.

Drive Pilot Sensors

Mercedes' presentation

On suitable highway sections and where traffic density is high, Drive Pilot can offer to take over, initially up to the legally permitted speed of 37 mph (60 km/h). This limit will be increased to 81 mph (130 km/h) next year. However, you can still only use level 3 in certain areas and under good weather conditions. The controls needed for this are located in the steering wheel rim, on the left and right, above the thumb recesses. When the driver activates Drive Pilot, the system controls the speed and distance and effortlessly guides the vehicle within its lane. The route profile, events occurring on the route, and traffic signs are taken into account. The system also reacts to unexpected traffic situations and handles them independently.
This autopilot system can also be operated in level 2 mode when not all of the conditions are met, such as the highway speed limit. Features such as lane keeping and emergency braking remain available, but the driver is required to be ready to take control.
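
In effect, the system runs a checklist before offering level 3. A hypothetical sketch of that decision, illustrative only and not Mercedes' actual Drive Pilot code:

```python
# Hypothetical sketch of Drive Pilot's conditional activation as described
# above; not Mercedes' actual software. The 60 km/h cap is today's legal
# limit, with 130 km/h planned for next year.
L3_SPEED_CAP_KMH = 60.0

def available_mode(on_approved_highway: bool, dense_traffic: bool,
                   good_weather: bool, speed_kmh: float) -> str:
    """Offer level 3 only when every condition holds; otherwise fall back
    to level 2 features with the driver supervising."""
    if on_approved_highway and dense_traffic and good_weather and speed_kmh <= L3_SPEED_CAP_KMH:
        return "LEVEL_3_HANDS_OFF"
    return "LEVEL_2_DRIVER_SUPERVISED"

print(available_mode(True, True, True, 55.0))    # LEVEL_3_HANDS_OFF
print(available_mode(True, False, True, 95.0))   # LEVEL_2_DRIVER_SUPERVISED
```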
Of course, Mercedes' cooperation with Nvidia played a very strong hand in Drive Pilot's creation; as I have already mentioned, it was Nvidia that became the leader in the artificial intelligence market. What's more, Drive Pilot is powered by Nvidia's Orin system-on-chip, which eclipses Tesla's performance and is in production now.
This puts Mercedes well beyond Tesla's current level 2; despite Elon Musk's promises to reach level 5 by year's end, that is not possible.

Mercedes financials


Historic financials

Financial reports. Author's analysis

Mercedes has been focusing on high-margin products lately and reducing its product range. This strategic shift has paid off, as the company has seen an improvement in its financial performance. The gross margin, which measures the profit a company makes from its sales, has increased from 15% to 22%. In addition to the improvement in gross margin, Mercedes has also optimized its operating expenses (opex), which has increased EBIT. Unlevered cash flow over the last 12 months reached over $25 billion, an astonishing amount; in 2021 it was below $10 billion. The positive cash flow generated by these improvements has allowed the company to invest in the transition to electric vehicles.

Valuation

Carmakers' stocks have been extremely volatile recently. Initially, Tesla's stock plunged and provided a substantial investment opportunity a month ago, but its share price bounced back exceptionally quickly. At current levels, Tesla is again priced extremely high; its EV/2024 sales multiple is back at 4.5x, compared with 2.6x just recently. Mercedes' stock has been on the rise since November last year, and its current share price of $78 implies an $83 billion market cap with an EV/2024 sales multiple of about 0.4x, comparable with other established players such as Ford and General Motors.
Nobody expects Mercedes to accelerate its revenue the way Tesla will. But I believe the market still underappreciates the potential boost from Mercedes' self-driving platform. Otherwise, it should be more expensive than Ford, which is struggling with its EV transition.
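
The multiples above reduce to simple arithmetic. In the back-of-envelope sketch below, the 2024 sales figures are back-solved from the quoted multiples and market cap stands in for enterprise value, so treat the numbers as illustrative placeholders rather than consensus estimates.

```python
# Back-of-envelope check of the quoted EV/2024-sales multiples. Sales are
# back-solved from the article's numbers and market cap is used as a rough
# proxy for enterprise value, so these are illustrative placeholders only.
mbg_market_cap_bn = 83.0   # at a $78 share price, per the article
mbg_multiple = 0.4
print(f"MBG implied 2024 sales: ~${mbg_market_cap_bn / mbg_multiple:.0f}bn")  # ~$208bn

tsla_now, tsla_recent = 4.5, 2.6
print(f"TSLA re-rating: {tsla_now / tsla_recent:.2f}x")  # ~1.73x on the same sales base
```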

Risks

The main risks come from very high competitive pressure and recession fears. Although EV sales did not decrease during Covid-19, almost all of Mercedes' revenue is generated by combustion-engine cars, which were severely hit during the Covid-induced recession. Therefore, if the economy deteriorates, Mercedes' earnings will decrease, leading to lower generation of the cash flow that is required to finance the transition to electric vehicles.
Additional risks come from potential technological challenges. The market may come to appreciate Tesla's AI-centered approach much more, given the emerging AI hype. Tesla was one of the first to integrate cameras around its vehicles as standard equipment, allowing the car to see 360 degrees and up to 250 meters away. Tesla has also integrated powerful AI computing into its cars, enabling the vehicle to process camera images faster than ever before and, in Tesla's view, making other sensors such as radar and ultrasonics redundant, if not confusing, for the system to process. This has shocked many in the industry who feel quite the opposite and believe that more sensors such as LiDAR are needed, not fewer. It also raises the question of how autopilot will handle low-visibility situations, such as evening, dark, or foggy conditions in which radar still functions. Tesla's website previously proclaimed that its radar “Passes through fog, dust, rain, snow and under cars” and that “Radar plays an essential role in detecting and responding to forward objects,” something owners who drive in less-than-ideal conditions may now need to worry about.
Starting in mid-2021, Tesla no longer equipped vehicles with radar sensors and relied solely on cameras for Autopilot and Full Self-Driving capabilities. It did so because it believes that its Deep Neural Network can be trained so well that it will recognize any needed objects without the help of radars and LiDARs.

Conclusion

Mercedes-Benz Group AG is exceptionally well-positioned for the emerging self-driving race, and despite a recent share price surge, there is still a lot of potential for investors. Whether to buy Mercedes-Benz Group AG stock now or to wait depends on your recession probabilities. When macro uncertainty is as high as it is now, a deep, slow scaling into Mercedes-Benz Group AG stock can be a reasonable strategy.
Meanwhile, keep riding the cycle.
Editor's Note: This article discusses one or more securities that do not trade on a major U.S. exchange. Please be aware of the risks associated with these stocks.

This article was written by Georgy Shishkov.
Disclosure: I/we have no stock, option or similar derivative position in any of the companies mentioned, but may initiate a beneficial Short position through short-selling of the stock, or purchase of put options or similar derivatives in TSLA over the next 72 hours. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.

 
Reactions: Like, Fire, Love · 43 users
Being an Intel IFS member is not the same as them investing a measly (for them) $10M to come up with a chip that they can use/sell/own.

We went for the big play - IP only, and everyone will queue up. Except they didn't. What we were expecting was that Intel would license Akida IP and build their own version of Akida 1500. But they didn't. But they put our logo on a web page. Nice.

So now we're making the advanced chip that Intel won't own and didn't want to own.

So yes, becoming an Intel IFS member is a non-event. That's why there was no announcement.
There seems to be a timeline from first playing with our tech through to selling our tech. I would use Renesas as an example: first chip selling later this year. They have shown it to the world at CES, so 2 to 3 years seems about right. Quite a few of our new known and confirmed additions may need 2 more years before we get sales. The good thing is first sales start later this year and then, over the next couple of years, become more common. Then you have to wait for royalties to kick in, so the longer you hold the better it will be. Depends on one's investing strategy. I think 5 years from now we will all be extremely happy. Longer for divvies IMO.

SC
 
Reactions: Like, Fire, Wow · 18 users
There seems to be a timeline from first playing with our tech through to selling our tech. I would use Renesas as an example: first chip selling later this year. They have shown it to the world at CES, so 2 to 3 years seems about right. Quite a few of our new known and confirmed additions may need 2 more years before we get sales. The good thing is first sales start later this year and then, over the next couple of years, become more common. Then you have to wait for royalties to kick in, so the longer you hold the better it will be. Depends on one's investing strategy. I think 5 years from now we will all be extremely happy. Longer for divvies IMO.

SC
I just got hit with a wow from Rise from the Ashes, so I re-read my post. I am not saying that it will take 5 years for a good price; I think it will be a good price in 1 to 2 years. I just meant that in 5 years we will be extremely happy. Then onto divvies. Hope that is easier to understand.

SC
 
Reactions: Like, Love, Haha · 14 users

chapman89

Founding Member
Well, the question I want answered is: once Renesas tapes out, how long does testing roughly take after that? @Diogenese, help us out.
 
Reactions: Like, Fire · 8 users

alwaysgreen

Top 20
Well, the question I want answered is: once Renesas tapes out, how long does testing roughly take after that? @Diogenese, help us out.
Lololol. You have invested your hard-earned into a company and you don't know the production timelines. Surely you've got to be joking!!

The above was in jest, but how does it feel to be patronised the same way you patronised me?
 
Reactions: Like, Haha, Sad · 9 users
Interesting article on the Art of Holding.

Love this bit
“Successful investors can differentiate business performance from stock performance and can take advantage of those investors who can’t.”
 
Reactions: Like, Love, Fire · 23 users
Love this bit
“Successful investors can differentiate business performance from stock performance and can take advantage of those investors who can’t.”
Yep, that's the best bit, I thought. Over and over again I see it, I invest in it, and sometimes the market takes its damn time to catch up to the fundamentals.
 
Reactions: Like, Love · 6 users

Diogenese

Top 20
Maybe some future competition on the horizon? @Diogenese might be able to explain how analog neuromorphic computing compares? It says here that the "project, which started last month, aims to develop an analog neuromorphic computing approach based on oscillatory neural networks (ONNs) that seamlessly interfaces with sensors and processes analog data without any analog-to-digital conversion."

I remember that @TECH mentioned Simon Thorpe was working on a project with ONNs, but I don't know if it's this particular one.


[Quoted article: "Oscillating neural networks for low power analog edge AI" by Nick Flaherty, February 6, 2023; quoted in full earlier in the thread.]

1675739291090.png
 
Reactions: Haha, Like · 8 users

alwaysgreen

Top 20
Maybe some future competition on the horizon? @Diogenese might be able to explain how analog neuromorphic computing compares? It says here that the "project, which started last month, aims to develop an analog neuromorphic computing approach based on oscillatory neural networks (ONNs) that seamlessly interfaces with sensors and processes analog data without any analog-to-digital conversion."

I remember that @TECH mentioned Simon Thorpe was working on a project with ONNs, but I don't know if it's this particular one.


[Quoted article: "Oscillating neural networks for low power analog edge AI" by Nick Flaherty, February 6, 2023; quoted in full earlier in the thread.]

I'm always reluctant to send emails to Brainchip but is it worth running this past them?

We have been told by management in the past that we have a 3-5 year lead in the space so does this change anything?
 
Reactions: Like · 4 users

chapman89

Founding Member
Lololol. You have invested your hard-earned into a company and you don't know the production timelines. Surely you've got to be joking!!

The above was in jest, but how does it feel to be patronised the same way you patronised me?
Difference is, I'm not chucking a dummy spit and planning on selling my shares mid-2023, as my research tells me I'm invested in the right company in the right industry with the right people running it!

I'm asking productive questions, not telling people on a forum I'm selling my shares by mid-year!
 
Reactions: Like, Love, Fire · 48 users
Lololol. You have invested your hard-earned into a company and you don't know the production timelines. Surely you've got to be joking!!

The above was in jest, but how does it feel to be patronised the same way you patronised me?
😂 This is in jest also mate. But you made me laugh hard with that post.
1askjx.jpg
 
Reactions: Haha, Like, Love · 20 users

ndefries

Regular
This is a letter from TD published by @YngInvstr in response to his/her question on the crapper.

View attachment 28894
Explains the PM role mentioning 2.0. Good times ahead. Also good to read that 1.5 was aligned to a customer engagement! AKA a future royalty provider.
 
Reactions: Like · 16 users