BRN Discussion Ongoing

JK200SX

Regular
Back in January we were all contemplating AKIDA going into space with NASA, and there was also the comment by Anil Mankar about the 19nm or 90nm chip that might be used by them.
Artemis 1 launches in about an hour... possibility of an AKIDA chip onboard?
 
  • Like
  • Love
  • Thinking
Reactions: 15 users

wilzy123

Founding Member
There are two explanations for a post like this.

1. Somebody has access to pharmaceuticals that they shouldn't necessarily be indulging in

OR

2. Somebody is overdue for a session with their therapist

Either way, I value the enthusiasm.

 
  • Haha
  • Like
  • Fire
Reactions: 33 users

uiux

Regular
Back in January we were all contemplating AKIDA going into space with NASA, and there was also the comment by Anil Mankar about the 19nm or 90nm chip that might be used by them.
Artemis 1 launches in about an hour... possibility of an AKIDA chip onboard?

There are CubeSats onboard, apparently.

But NASA's mission plans are explained here:


"July 2023: PACE4 nanosat mission through Van Allen belts. TRL"

Always a possibility there are classified projects happening in parallel to the public ones. If that's the case, who's to say AKIDA isn't already orbital?
 
Last edited:
  • Like
  • Love
  • Thinking
Reactions: 23 users

Kozikan

Regular
  • Haha
  • Like
Reactions: 7 users

uiux

Regular
Ford research on event-based vision, e.g. Prophesee



EBSnoR: Event-Based Snow Removal by Optimal Dwell Time Thresholding

We propose an Event-Based Snow Removal algorithm called EBSnoR. We developed a technique to measure the dwell time of snowflakes on a pixel using event-based camera data, which is used to carry out a Neyman-Pearson hypothesis test to partition event stream into snowflake and background events. The effectiveness of the proposed EBSnoR was verified on a new dataset called UDayton22EBSnow, comprised of front-facing event-based camera in a car driving through snow with manually annotated bounding boxes around surrounding vehicles. Qualitatively, EBSnoR correctly identifies events corresponding to snowflakes; and quantitatively, EBSnoR-preprocessed event data improved the performance of event-based car detection algorithms.
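The dwell-time idea in the abstract can be sketched in a few lines. This is a rough illustration only, assuming a simplified event format and a fixed threshold; the paper derives its threshold from a Neyman-Pearson test, and every name below is my own invention, not the authors' code.

```python
# Illustrative sketch of dwell-time thresholding for event-based snow
# removal, loosely following the EBSnoR idea described above. The fixed
# tau_us threshold and all names are assumptions, not the paper's
# actual implementation.

from collections import namedtuple

Event = namedtuple("Event", ["x", "y", "t", "polarity"])  # t in microseconds

def partition_by_dwell_time(events, tau_us=2000):
    """Split an event stream into (background, snow) by per-pixel dwell time.

    A snowflake crossing a pixel fires an ON event followed quickly by an
    OFF event, so the ON->OFF gap (dwell time) is short. ON/OFF pairs whose
    dwell time falls below tau_us are labelled snow.
    """
    last_on = {}              # (x, y) -> most recent unmatched ON event
    background, snow = [], []
    for ev in sorted(events, key=lambda e: e.t):
        key = (ev.x, ev.y)
        if ev.polarity == 1:                 # ON event: remember it
            last_on[key] = ev
            background.append(ev)            # provisionally background
        else:                                # OFF event: close the pair
            on_ev = last_on.pop(key, None)
            if on_ev is not None and ev.t - on_ev.t < tau_us:
                background.remove(on_ev)     # short dwell: reclassify pair
                snow.extend([on_ev, ev])
            else:
                background.append(ev)
    return background, snow
```

Feeding the `snow`-stripped `background` stream into a downstream detector is the preprocessing step the abstract says improves car detection.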

---

This work was made possible in part by funding from Ford Motor Company University Research Program.
 
  • Like
  • Fire
  • Love
Reactions: 23 users

Proga

Regular
  • Like
Reactions: 4 users

uiux

Regular
 
  • Like
Reactions: 5 users

uiux

Regular
[Image attachment: 1661778212893.png]
 
  • Haha
  • Like
Reactions: 9 users

Learning

Learning to the Top 🕵‍♂️
The McKinsey Technology Trends Report.
It's great to be a shareholder.
 
  • Like
  • Love
Reactions: 13 users
I personally think you can argue it two ways:

1. As income is reported quarterly, and licence fees are reported on that basis, he could mean quarterly; or

2. As the Annual Report is the final arbiter of the year's performance, he could mean annually.

I think 2. because he is American and quarterly reporting is not a thing required by the SEC.

My wild speculation so DYOR
FF

AKIDA BALLISTA
Or if company X pays $2 million in licensing, BRN could be expected to be paid $8 to $10 million in royalties from that same company once implementation in their product has taken place.

That was my layman's interpretation of that interview
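A back-of-envelope sketch of that interpretation. The 4-5x royalty-to-licence multiple is just my layman's reading of the interview, not guidance, and all numbers are illustrative:

```python
# Illustrative only: the 4-5x royalty-to-licence ratio is one poster's
# interpretation of the interview, not company guidance.

def estimated_royalties(licence_fee, low_mult=4, high_mult=5):
    """Return a (low, high) royalty estimate from an upfront licence fee."""
    return licence_fee * low_mult, licence_fee * high_mult

low, high = estimated_royalties(2_000_000)  # company X's $2m licence
# -> roughly $8m to $10m in royalties once the product ships
```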
 
  • Like
Reactions: 7 users
What an amazing individual you are. Many thanks; I now have something to read at Spotlight while my darling wife takes hours to not make up her mind. LOL

FF

AKIDA BALLISTA
Amazing individual :ROFLMAO: You're too kind with your words. I hope the Spotlight shopping expedition was not too painful.
It brings me back to the days when I was a toddler going to Kmart with my mother. The dress aisles were so boring when I just wanted to hang out in the toy section and the bicycle section.
 
  • Like
  • Love
  • Fire
Reactions: 12 users
The risk is in the tail:

“ If the announcement is not capable of being drafted to meet these requirements without including the commercially sensitive information, then Listing Rule 3.1 will require the commercially sensitive information to be disclosed”

The problem with the ASX is that they will not give guidance or advice to companies in advance, and only step forward to judge after publication, when the ASX can direct that additional information be disclosed.

It is a legal nightmare for small growing technology companies on the ASX.

My opinion only DYOR
FF

AKIDA BALLISTA
Look up ASX Listing Rules Guidance Note 8

SC
 
  • Like
Reactions: 1 users

Proga

Regular
True, generally younger people are MORE tech savvy; however, some of the older generation like to read the car's user manual back to front first. There is also a reasonable number of the older generation here invested in BRN, so you could say they are also tech savvy.

Either way, like most things in life, people are adaptable and can learn!

If Merc are going to roll it out, they're going to capitalise and roll it out on all applicable models,
Sorry @AARONASX, I missed your post. Too busy defending myself from a couple of grubs.

If Merc are going to roll it out, they're going to capitalise and roll it out on all applicable models - that was my first thought until Bravo posted the article. Bravo and I had been discussing it for a few weeks, and my thinking, not dissimilar to yours, was that Merc was going to wow the market and roll it out in all models. I posted that a couple of times in our discussions. But it looks like they are taking the conservative route to make sure they get it right before unleashing it on their more valuable, higher-margin clients. Interesting to note, the C and E class are MB's highest-selling models.

When reading it I had the same thoughts about us older-ish tech savvy posters in here. It was disappointing as hell and made me re-evaluate my own retirement plans. Bravo and I were hoping for it to be released in 2023 for the 2024 model, but as Bravo reposted today, it is going to be released in 2024 for the 2025 models, and then only in the MMA-platform electric versions. I knew they weren't going to electrify the B-class, and if they did it would have been produced on the MMA platform, so I didn't include it in my original post. I read in another article some months back that they were still thinking about producing ICE vehicles after 2030 for non-western countries if the demand was there. Obviously those markets will be slower to adopt and implement the infrastructure required for EVs and to start legislating the banning of ICE vehicles.

from the article:
Mercedes-Benz's plan is to build mb.OS for the first time in compact and medium-sized electric vehicles based on the MMA platform, and to adopt a self-developed operating system in all subsequent models - I'm assuming/praying this includes ICE vehicles as well, not just their electric range

The reason for choosing to launch MB.OS on the MMA platform model, Mercedes-Benz's consideration is that the users of the entry-level model are younger, more receptive to new things, and can also propose improvements to the system.

I hope this helps
 
Last edited:
  • Like
Reactions: 5 users

Cardpro

Regular
I DON'T KNOW ABOUT ANYONE ELSE, BUT I'M GETTING VERY EXCITED!!!!!!!

The new Mercedes EQE SUV will make its official world debut on October 16, 2022!

So, we all know that Mercedes-Benz plans to officially launch MB.OS in 2024. But what is less well known is that from 2022 to 2023, Mercedes-Benz will equip the next generation of new E-Class models with a lightweight version of the MB.OS operating system, before the full version launches in 2024.

Here are some snippets from various articles about the EQE SUV which are making me feel obligated to dash off to Dan Murphy's to top-up my diminishing supply of champers!

🍾🥂



[View attachments 15305, 15306, 15307, 15308]
EQE and the MBUX Hyper screen
 
  • Like
  • Fire
Reactions: 10 users
  • Like
  • Fire
Reactions: 3 users

Sirod69

bavarian girl ;-)

Event-Based Vision Systems Market Study Reveals Growth Factors Size, Share, and Competitive Outlook for Future 2021-2030​

Latest AI-driven advancements in computer vision focus on emulating the characteristics of the human eye in a vision sensor system.


Competitive Intelligence

The section provides a detailed description of established companies, startups, and research institutes working on event-based cameras. Different parameters, including company overview, technology stack, partnerships, key personnel, future roadmap, and limitations have been considered for a comprehensive competitive profiling.

A key highlight to emerge from this analysis is that several European startups are directly competing against Samsung in the event-based vision technology domain.

Further, a benchmarking matrix of the commercialized and in-pipeline products has also been included for an in-depth analysis.

Companies mentioned in the report

1. Prophesee
2. iniVation
3. Insightness
4. Qelzal
5. MindTrace
6. CelePixel
7. Sunia
8. Australian Institute of Technology
9. Samsung
10. Sony

 
  • Like
  • Love
  • Fire
Reactions: 17 users
[GIF: Dance Show Off by TLC Europe]


Saying how many shares you bought... subtle flexes going on.
These posts you are referring to always give me the green-eyed FOMO feeling. I must resist the urge to search the couches of friends and family.
I wonder if there is any coin in that couch. 🤔
 
  • Haha
  • Like
Reactions: 9 users

equanimous

Norse clairvoyant shapeshifter goddess

Event-Based Vision Systems Market Study Reveals Growth Factors Size, Share, and Competitive Outlook for Future 2021-2030​

Latest AI-driven advancements in computer vision focus on emulating the characteristics of the human eye in a vision sensor system.


Competitive Intelligence

The section provides a detailed description of established companies, startups, and research institutes working on event-based cameras. Different parameters, including company overview, technology stack, partnerships, key personnel, future roadmap, and limitations have been considered for a comprehensive competitive profiling.

A key highlight to emerge from this analysis is that several European startups are directly competing against Samsung in the event-based vision technology domain.

Further, a benchmarking matrix of the commercialized and in-pipeline products has also been included for an in-depth analysis.

Companies mentioned in the report

1. Prophesee
2. iniVation
3. Insightness
4. Qelzal
5. MindTrace
6. CelePixel
7. Sunia
8. Australian Institute of Technology
9. Samsung
10. Sony

8. Australian Institute of Technology/Universities


What we do​

The Australian Artificial Intelligence Institute is internationally recognised as one of the top AI research hubs through the Institute's excellence in research, collaboration and education.
Distinguished Professor Jie Lu

Distinguished Professor Jie Lu
Director, Australian Artificial Intelligence Institute, University of Technology Sydney

VISION
To achieve excellence and innovation in sustainable and comprehensible artificial intelligence by developing powerful theoretical foundations, innovative technologies and application systems and by leading knowledge advancement which translates into significant social and economic impacts.
The Australian Artificial Intelligence Institute (AAII), led by Distinguished Professor Jie Lu, is Australia's largest research hub in the field of artificial intelligence. AAII, previously known as the Centre for Artificial Intelligence, was established in March 2017 at the UTS School of Computer Science in the Faculty of Engineering and IT. The research entity became an institute in August 2020, in recognition of its high-quality HDR outputs, broad research scope, and significant local and international collaboration.
Boasting eight research labs, AAII currently has 34 academic staff (including four distinguished professors, one FTSE, two FIEEE and one highly cited researcher), 10 postdoctoral associates, and more than 190 PhD students. AAII core members have won 21 Australian Research Council (ARC) projects (including ARC Discovery, Linkage, Future Fellow and Discovery Early Career Research Awards) and 50 national and international industry projects. Since its inception, AAII staff have published over 1000 papers, with 450 of these in highly reputable international journals. Furthermore, AAII core members have delivered more than 20 keynote presentations at national and international conferences, and AAII students have received more than ten best paper awards from leading journals and conferences, including national and international awards. The UTS School of Computer Science is ranked in the top 29 worldwide (ARWU), and AAII is a key contributor to this outstanding achievement.
As the biggest centre for Artificial Intelligence in Australia, AAII has a team of world-class researchers undertaking programs on major fronts of Artificial Intelligence.

Distinguished Professor Jie Lu​

AAII consists of eight key research laboratories with three main research areas: fundamental research, technology transfer research, and applied research.

Fundamental Research​

  • Computational Intelligence
  • Deep Learning
  • Transfer Learning
  • Large-scale Graph Processing
  • Concept Drift
  • Reinforcement Learning
  • Pattern Recognition
  • Probabilistic Machine Learning
  • Big Dimensionality
  • Neuromorphic Computing
  • AI-Driven Software Security Analysis
  • Computer Vision
  • Explainable AI

Technology Transfer Research​

  • Brain-Computer Interface
  • Recommendation Systems
  • Social Networks
  • Social Robotics
  • Decision Support Systems
  • Cloud Computation
  • Blockchain
  • Human Autonomy Team
  • Bioinformatics
  • Data Science and Visualisation
  • Text Mining
  • AI Privacy & Security
  • Network Analytics

Applied Research​

  • Health Care
  • Financial Services
  • Internet of Things
  • Business Intelligence
  • Logistics
  • Transportation
  • Education
  • Defence
  • Marine Safety
  • Property
  • Food
  • Weather prediction

The Jericho Smart Sensing Lab at the University of Sydney Nano Institute has developed a prototype sensor system for the RAAF that mimics the brain’s neural architecture to deliver high-tech sensing technology.​

Dubbed MANTIS, the system under development integrates a traditional camera with a visual system inspired by neurobiological architectures. This ‘neuromorphic’ sensor works at 'incredibly high' speeds, which will allow aircraft, ships and vehicles to identify fast moving objects, such as drones.


An international team of researchers involving Monash University has demonstrated the world’s fastest and most powerful optical neuromorphic processor for artificial intelligence (AI), which operates faster than 10 trillion operations per second (TeraOPs/s) and is capable of processing ultra-large scale data.

 
  • Like
  • Love
  • Fire
Reactions: 14 users