BRN Discussion Ongoing

wilzy123

Founding Member
Have we looked at Dryad Networks?
Early forest fire detection. Didn't register for the whitepaper, but some familiar buzzy terms are used in their capability statements and marketing material.


We'll probably dramatically dilute discussion about BRN (and the internet will run out of hard drive space) if we start posting links whose only connection to BRN is the broad terms you've highlighted...
 
  • Like
Reactions: 3 users

robsmark

Regular
@Realinfo, to answer your question from last week: yes, I did mention January 2025 again, and you are basically correct in your assumption. I personally believe that revenue will have started to show a repeating pattern over the next six quarters, that is, growing and heading northwards, and I also believe the share price will reflect that growth coming through. That said, not north of around $1.75, taking out the stag spike or overshoot that will probably occur in all the excitement, with fence sitters and the herd all jumping aboard, giving us a market cap of around 3 billion AUD, which would be at my top end. Not advice for new investors, just a believer's private views on display.

Some may see that valuation (guess) as either too light or way over the top, but that's what makes the markets so interesting: they're full of bullshitters, crystal balls and, in real terms, winners and losers.

Cheers and good evening... Tech 🍷
Hey Tech… This post? What question of mine were you answering?
 
  • Haha
  • Like
Reactions: 7 users

MDhere

Regular
I got busy looking for a house in Melbourne today. Did I miss something? A little birdie said I missed something... 🤣 But I managed to log on and saw nearly 40, so maybe I'll be busy again tomorrow. I was curious to know what I missed, if anyone can PM me 🤣🤣
 

keyeat

Regular
I got busy looking for a house in Melbourne today. Did I miss something? A little birdie said I missed something... 🤣 But I managed to log on and saw nearly 40, so maybe I'll be busy again tomorrow. I was curious to know what I missed, if anyone can PM me 🤣🤣
SP went up and then back down .....


Season 4 Episode 13 GIF by The Simpsons
 
  • Haha
Reactions: 8 users

Straw

Guest
  • Like
Reactions: 5 users

IloveLamp

Top 20
  • Haha
  • Sad
Reactions: 14 users

IloveLamp

Top 20
This might help to satisfy the doubters of our likely association with BMW

Is it just me, or are Rob's likes beginning to make a lot more sense...?

Pay attention folks, greatness unfolding at Brainchip.
Screenshot_20230802_221522_LinkedIn.jpg
 
  • Like
  • Fire
  • Love
Reactions: 44 users

Learning

Learning to the Top 🕵‍♂️

Attachment: Screenshot_20230802_222936_LinkedIn.jpg
  • Like
  • Love
  • Fire
Reactions: 42 users

Pmel

Regular
Until we sign a good IP contract or see significant revenue, we aren't going anywhere; we'll either stay at the same level or drop even more. It is what it is. I know a few here will give me a hard time for this, and some have done so in the past, sending me private messages telling me how silly I am. But the truth is the truth: the SP will reflect what the company achieves.
 
  • Like
  • Fire
  • Love
Reactions: 19 users

Pmel

Regular
Until we sign a good IP contract or see significant revenue, we aren't going anywhere; we'll either stay at the same level or drop even more. It is what it is. I know a few here will give me a hard time for this, and some have done so in the past, sending me private messages telling me how silly I am. But the truth is the truth: the SP will reflect what the company achieves.
I'm happy to post the nasty comments a few people send from time to time just for expressing your concerns. Some here are just out to put you down for it.
 
  • Like
  • Thinking
Reactions: 5 users

Krustor

Regular
I'm happy to post the nasty comments a few people send from time to time just for expressing your concerns. Some here are just out to put you down for it.
Just forgot to change accounts? :unsure:
 
  • Haha
  • Like
Reactions: 24 users

robsmark

Regular
I'm happy to post the nasty comments a few people send from time to time just for expressing your concerns. Some here are just out to put you down for it.
Just forgot to change accounts? :unsure:
I’m laughing but I shouldn’t be. I’m guessing he just replied to the wrong person.

I personally know Pmel and can assure you he isn’t a manipulator or a downramper. Just a regular guy watching his significant investment getting smashed.
 
  • Like
  • Sad
Reactions: 13 users

wilzy123

Founding Member
  • Haha
  • Like
Reactions: 10 users

Frangipani

Regular
395545C7-BCB8-4197-BF51-EF222DCF8E0E.jpeg


I am aware that the above article on neuromorphic computing, which Gabriel Rubio, CEO of SecuRED (a small business “specialising in innovative security and privacy technology solutions including AI”), shared on LinkedIn, has been posted here a couple of times before, but check out the comment section.

Not surprisingly, there is a post by Nick Brown promoting Brainchip 😊 - both he and @chapman89 should really think about changing their profile pictures to something along the lines of

15E7F492-F6FB-4EDE-947B-2721203E8245.jpeg

or

12A2FA95-A748-4557-B97C-A3B644EA2B48.jpeg


🤣🤣🤣

But now have a look at Robert Moore’s comment. While not referring to Brainchip specifically, his enthusiastic assessment of the disruptive nature of neuromorphic technology is yet another validation by someone with an intriguing professional background.

E6CE9F74-50F6-474C-9C51-E84DFB38AF88.jpeg

1F4AC819-434B-4BA8-BCF7-BD3D19F22402.jpeg


D6288500-A479-434C-BDCF-91A54904136D.png




According to Wikipedia, Booz Allen Hamilton is “an American government and military contractor, specializing in intelligence... The company's stated core business is to provide consulting, analysis and engineering services to public and private sector organizations and nonprofits.”

However, it should also be noted that “Booz Allen has particularly come under scrutiny for its ties to the government of Saudi Arabia and the support it provides to the Saudi armed forces. Alongside competitors McKinsey & Company and Boston Consulting Group, Booz Allen are seen as important factors in Crown Prince Mohammed bin Salman’s drive to consolidate power in the Kingdom.[89] On the military side, Booz Allen is employing dozens of retired American military personnel to train and advise the Royal Saudi Navy and provide logistics for the Saudi Army, but denies its expertise is used by Saudi Arabia in its war against Yemen. Additionally, it also entered an agreement with the Saudi government that involves the protection and cyber-security of government ministries,[90] with experts arguing that these defensive maneuvers could easily be used to target dissidents.”

This connection to Saudi Arabia reminded me of the following slide in the moonbeam Emerging Technology Assessment presentation @Rise from the ashes shared with us yesterday:

FAE66273-8B34-4995-BAF0-C995688D996F.jpeg


Intel has been open about collaborating with the Kingdom of Saudi Arabia on the US$500 billion NEOM desert megacity project, which some view as the world’s first futuristic smart city and an ecological prestige project, and others as a repressive ruler’s megalomaniac fantasy and an ecological disaster “being built on forcible evictions, state violence and death sentences” (https://www.dw.com/en/saudi-arabias-neom-a-prestigious-project-with-a-dark-side/a-65664704).


C8BDAE91-71AC-42B7-9A17-0AB8BDE049A9.jpeg

(And of course “Jesse was here”… 😂)

It is obviously an ethical question whether or not to do business with a government such as Saudi Arabia’s. I wonder whether Brainchip will clearly position itself.

Some (admittedly hard to digest) food for thought:
 
  • Like
  • Love
  • Wow
Reactions: 22 users

Frangipani

Regular

Can AI Continue To Scale?

Peter van der Made, Forbes Councils Member, Forbes Technology Council
COUNCIL POST | Membership (Fee-Based)
Aug 2, 2023, 09:30am EDT
Peter van der Made is the founder and CTO of BrainChip Ltd. BrainChip produces advanced AI processors in digital neuromorphic technologies.




Artificial intelligence is rapidly being deployed within all aspects of business and finance. Some exciting successes are putting pressure on the industry to embrace this new technology. No one wants to be left behind.

The core technologies behind AI are neural network models, deep learning algorithms and massive data sets for training. The model is constructed for a specific purpose such as object recognition, speech recognition and object tracking. A “model” describes how the neural network is constructed, how many parameters the network has and how many layers.
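As a rough, illustrative sketch of what "model", "layers" and "parameters" mean in this context (assuming PyTorch is available; the layer sizes here are arbitrary and not taken from the article):

```python
import torch.nn as nn

# A tiny toy "model": three weight-bearing layers, each with its own parameters.
model = nn.Sequential(
    nn.Linear(784, 128),  # layer 1: 784 inputs -> 128 units
    nn.ReLU(),
    nn.Linear(128, 64),   # layer 2
    nn.ReLU(),
    nn.Linear(64, 10),    # layer 3: 10 output classes
)

# The total parameter count is one of the numbers the article refers to.
n_params = sum(p.numel() for p in model.parameters())
print(f"modules: {len(model)}, parameters: {n_params:,}")  # roughly 109,000 parameters
```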

The overall accuracy of the neural network is a function of the quality and size of the training data set, the number of parameters and the training procedure. This is not an exact science. Too much training, and the model will respond well to the training set but not to real-world situations. This is “overfitting” the model. Too little training, and the model will not be able to respond to all known situations.
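A minimal sketch of how over- and under-fitting is typically caught in practice: watch a held-out validation loss alongside the training loss and stop once validation stops improving. The loss values below are invented purely for illustration:

```python
# Synthetic loss curves: training keeps improving, validation turns upward (overfitting).
train_loss = [0.90, 0.60, 0.42, 0.31, 0.24, 0.19, 0.15, 0.12]
val_loss   = [0.95, 0.68, 0.50, 0.41, 0.38, 0.39, 0.43, 0.48]

best, patience, wait = float("inf"), 2, 0
for epoch, v in enumerate(val_loss):
    if v < best:
        best, wait = v, 0   # validation still improving: keep training
    else:
        wait += 1           # validation getting worse: possible overfitting
        if wait >= patience:
            print(f"stop at epoch {epoch}: validation loss rising while "
                  f"training loss is {train_loss[epoch]:.2f}")
            break
```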

No model is perfect. There is always a margin of error and the occurrence of outlier conditions for which the model has no parameters. Over the last 10 years, models have become more complex as capabilities and accuracy have increased.

The models used for large language models such as Bard and GPT-4 use hundreds of billions of parameters and need massive data sets to train on. Even the most powerful personal computers cannot handle large models that require considerable computational power and memory resources. The computing is done via the internet (the cloud) on large data center computers—a server farm.

Server farms are used in applications such as natural language processing, generating text and images, classifying video streams, and IoT process control and monitoring. Wired estimates that training a large model like GPT-4 costs $100 million, using as many as 10,000 systems with powerful A100 GPU processor arrays over 11 months. The largest known model is Google GLaM, with more than 1 trillion parameters. Models are getting larger and larger, but can these systems continue to scale?

According to SemiAnalysis chief analyst Dylan Patel (via Insider), the cost of running ChatGPT is estimated to be as high as $700,000 daily. This cost is broken down into maintenance, depreciation on the computer resources, and electricity consumption of the servers and cooling systems. In a study published jointly by Google and UC Berkeley (via Scientific American), the amount of power used by GPT-3 is 1,287 megawatt hours.

This is of great concern when multiplied by the number of server farms worldwide and the increase in AI processing. The power consumption of server farms will likely increase as more people start to access online AI. Server farms could consume more than 20% of the world’s electricity by 2025.

Server farms use large racks with powerful computers and GPUs. They contain thousands of processing cores that can be used as parallel processing units to compute the function of a neural network. The power used by a single GPU can be as high as 400 watts, and a server may use up to 32 of those GPUs. A company’s cluster of large data centers may deploy as many as 2.5 million servers. Even if only half of the servers contain GPUs, a worst-case calculation will reach 16,000 megawatt hours. That is a lot of greenhouse gases.
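The article's worst-case figure can be reproduced with simple arithmetic using its own stated assumptions; note that, strictly speaking, the result is a power draw of 16,000 MW, which corresponds to 16,000 MWh for each hour of operation:

```python
watts_per_gpu = 400                 # article's upper bound per GPU
gpus_per_server = 32                # article's upper bound per server
servers = 2_500_000                 # a large cluster of data centers
servers_with_gpus = servers // 2    # "only half of the servers contain GPUs"

power_watts = watts_per_gpu * gpus_per_server * servers_with_gpus
print(power_watts / 1e6, "MW of continuous draw")        # 16000.0 MW
print(power_watts / 1e6, "MWh per hour of operation")
```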

There are several ways to reduce the environmental impact of server farms. One part of the solution is more efficient hardware, together with the use of renewable energy. Another is to use hybrid solutions that perform much of the processing distributed at the edge in specialized, low-power but high-performance neuromorphic hardware. Neuromorphic processing takes inspiration from the energy-efficient methods of the brain.

The human brain contains approximately 86 billion neuron cells (about 80 times that of GLaM, the largest of the large language models) with an estimated 100 trillion connections (roughly 100 times that of GLaM). Each cell has a variable amount of electrochemical memory. The information stored in this biological memory can be considered equivalent to the parameters in a neural network model.

The brain model is dynamic in contrast to artificial neural networks. It creates new connections and more memory as we learn, and it prunes redundant connections when we sleep. The human brain neural network, even though larger than the largest AI model, consumes only the energy equivalent of 20 watts—less than a light bulb. The brain’s structure is vastly different from the neural network models used in today’s AI systems, notwithstanding the successes we have seen over the last few years.

Neuromorphic processing borrows from the efficient processing techniques of the brain by copying its behavior into digital circuits. While digital circuits may not be as power-efficient as analog circuits, stability, interchangeability and speed outweigh the slight power advantage. Using neuromorphic computing engines is transparent to the developer and the user because of an event-driven convolution shell.
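As a toy illustration of the general idea behind event-driven (sparse) processing (a generic NumPy sketch, not BrainChip's actual implementation), only the non-zero "events" in a mostly-silent activation map do work, rather than every element of the dense array:

```python
import numpy as np

rng = np.random.default_rng(0)
# Activation map that is ~90% zeros, mimicking sparse, event-like activity.
activations = rng.random((64, 64)) * (rng.random((64, 64)) > 0.9)
weights = rng.standard_normal(64)

# Dense approach: every element is multiplied, zero or not.
dense_out = activations @ weights

# Event-driven approach: only the non-zero entries ("events") are processed.
rows, cols = np.nonzero(activations)
event_out = np.zeros(64)
for r, c in zip(rows, cols):
    event_out[r] += activations[r, c] * weights[c]

print(np.allclose(dense_out, event_out), f"{len(rows)} events vs {64 * 64} dense multiplies")
```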

Neuromorphic processing can run convolutional neural networks (CNN) and can run image classification on ImageNet1000, real-time video classification, odor and taste recognition, vibration analysis, voice and speech recognition, and disease and anomaly detection. Using these functions in portable and battery-powered tools is possible because of its low power consumption.

It is possible to reduce the excessive power consumption of data centers by using distributed AI processing in fast neuromorphic computing devices, which reduces operating costs and increases the functionality and responsiveness of edge products. Neuromorphic processing can help compensate for AI’s expected negative environmental impact.

Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives. Do I qualify?



Follow me on LinkedIn. Check out my website.

Peter van der Made is the founder and CTO of BrainChip Ltd. BrainChip produces advanced AI processors in digital neuromorphic technologies. Read Peter van der Made's full executive profile here.
 
  • Like
  • Love
  • Fire
Reactions: 82 users

cosors

👀

Can AI Continue To Scale?

(Forbes Technology Council article by Peter van der Made, quoted in full in the post above)
I cannot decide, therefore ❤️‍🔥
 
  • Like
  • Haha
Reactions: 9 users

Krustor

Regular
I’m laughing but I shouldn’t be. I’m guessing he just replied to the wrong person.

I personally know Pmel and can assure you he isn’t a manipulator or a downramper. Just a regular guy watching his significant investment getting smashed.
I know why I called him "Pumuckl" a few months ago...

Nevertheless, let's just move on.
 
  • Like
Reactions: 1 user

Frangipani

Regular
We know that Brainchip and the AFRL are an item (via ISL) - to me, the article below seems to suggest that this relationship may have evolved into another ménage à trois? 🤔 Well, as long as all three partners are happy with this arrangement, they have my blessing… My speculation only - DYOR.



Sarcos Awarded Artificial Intelligence Contract by Air Force Research Laboratory

July 26, 2023 by Sarcos
Bolstered by multiple, multi-million dollar, multi-year DoD contracts — New Advanced Technologies Division formed to advance generalized AI for unstructured, dynamic environments

Division pioneering innovative AI approach applicable to myriad industrial robotics and autonomous vehicles

SALT LAKE CITY—July 26, 2023—
Sarcos Technology and Robotics Corporation (“Sarcos”) (NASDAQ: STRC and STRCW), a leader in the design, development, and manufacture of advanced robotic systems, solutions, and software that redefine human possibilities, announced today the award of an expanded contract (FA8750-22-C-1005) from the Air Force Research Laboratory (AFRL) for continued development of artificial intelligence (AI)-driven methods and techniques that autonomously control a Heterogeneous Sensing Network (HSN).

As part of the AFRL contract, Sarcos is developing a collaborative sensing solution that enables its Department of Defense (DoD) partners to quickly, accurately, and safely identify, track, and classify time-critical objects using autonomous, heterogeneous sensor networks and AI to improve the operations, safety, data collection, and communication of autonomous platforms, such as Unmanned Aircraft Systems (UASs) and Unmanned Aerial Vehicles (UAVs).

“Sarcos’ unique approach to reinforced learning and AI enables autonomous systems to work together more effectively,” said Dr. Peter Zulch, AFRL. “Sarcos utilizes advanced AI-driven methods and techniques to improve the operations, safety, data collection, and communication of our autonomous platforms.”

The methods developed will harness the power of a myriad of sensor data to create more robust data sets that enable accurate autonomous operations in dynamic and unstructured environments, such as subsea operations and solar panel installations over diverse terrain, and the application for air force systems. The approach models how humans detect and adapt to their surroundings – using the multiple senses of sight, sound, and feel – to make real-time decisions and adjustments to operate effectively in real-world environments.
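As a generic, hedged illustration of the kind of multi-sensor fusion the release describes (not Sarcos' actual method), inverse-variance weighting is the textbook way to combine two noisy estimates of the same quantity; the sensor values below are invented:

```python
# Two hypothetical sensors estimating the same target position (metres),
# each with its own noise level (variance). Values are made up for the sketch.
est_camera, var_camera = 10.4, 0.25   # accurate but occasionally occluded
est_radar,  var_radar  = 10.9, 1.00   # noisier but always available

# Inverse-variance (precision-weighted) fusion: trust the less noisy sensor more.
w_cam, w_rad = 1 / var_camera, 1 / var_radar
fused = (w_cam * est_camera + w_rad * est_radar) / (w_cam + w_rad)
fused_var = 1 / (w_cam + w_rad)

print(f"fused estimate: {fused:.2f} m (variance {fused_var:.2f})")  # 10.50 m, variance 0.20
```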


“This continued work is critical to advancing our AI platform to benefit customers across industries,” said Dr. Denis Garagić, chief technology officer, Sarcos. “The ability to harness and unify the power of sensors to adapt to dynamic inputs in unstructured environments allows for increased accuracy and continued operations despite changing conditions.”

Sarcos Forms New AI-Focused Business Unit to Meet Increased Demand
Software as a service (SaaS) and AI applications for robotics systems are emerging as expected growth drivers for Sarcos. As a result of the demand for autonomous solutions and building on the work derived from multiple, multi-million dollar, multi-year, AI-focused DoD contracts, Sarcos is also announcing the formation of a new Advanced Technologies division to be led by Dr. Garagić, Sarcos’ chief technology officer.

The Advanced Technologies division will work to progress the development and productization of Sarcos’ AI and machine learning (ML) software platform for generalizable autonomy. The AI and ML software platform will focus on enabling robots to learn from experience using a success-based learning approach. The AI and ML platform will be designed to be usable across a variety of autonomous systems, including factory robots and drones. Additionally, the division will continue to pioneer new algorithms, models, and techniques to unlock new possibilities in the field of robotics, with a focus on initiatives aimed at pushing the boundaries of AI capabilities for robotics operating in dynamic, unstructured environments which pose unique challenges due to their complex and unpredictable nature.

“Dynamic and unstructured environments present an exciting opportunity for AI innovation,” said Laura Peterson, interim president and CEO, Sarcos. “With Dr. Garagić’s expertise and leadership, our Advanced Technology division is well positioned to develop cutting-edge AI solutions that can excel in unstructured environments, enabling industries to achieve unprecedented efficiency and effectiveness.”

The newly formed division will also collaborate with industry partners, leveraging their domain knowledge and expertise to address market-specific challenges. Sarcos believes this market-led approach will accelerate the adoption of the technology and drive real-world applications fostering growth and innovation within these industries and beyond.

“I am honored to lead this newly formed division at Sarcos alongside some of the brightest minds in AI and ML,” said Dr. Garagic. “I look forward to continued collaboration with my colleagues, peers, industry partners, and others to solve some of the biggest AI challenges and provide solutions that will change the future of work. Together we will push the boundaries of AI in dynamic and challenging unstructured environments and revolutionize the way work gets done.”

For more information on Sarcos and its product portfolio, including solutions benefitting the U.S. Department of Defense, please visit www.sarcos.com.
###

About Sarcos Technology and Robotics Corporation

Sarcos Technology and Robotics Corporation (NASDAQ: STRC and STRCW) designs, develops, and manufactures a broad range of advanced mobile robotic systems, solutions, and software that redefine human possibilities and are designed to enable the safest most productive workforce in the world. Sarcos robotic systems operate in challenging, unstructured, industrial environments and include teleoperated robotic systems, a powered robotic exoskeleton, and software solutions that enable task autonomy. For more information, please visit www.sarcos.com and connect with us on LinkedIn at www.linkedin.com/company/sarcos.

About Sarcos Technology and Robotics Corporation’s AI and ML Platform
Sarcos’ AI and ML platform is being designed to enable robotic systems to learn from experience using a success-based learning approach. The goal is to mimic the core domains of human-level concept/task learning, allowing robots to understand their environment, exhibit reasonable behavior in unforeseen situations, and quickly acquire new skills from new experiences. Unlike existing machine learning algorithms that often require numerous examples to achieve high accuracy, Sarcos’ AI framework seeks to emulate how people can generalize successfully from just a single example. By leveraging what the robot has previously learned, the framework fills in the gaps of unknown scenarios by generating robust and reliable predictions on how the robot should act in any given situation. The advanced AI software platform will revolutionize how businesses operate in dynamic and unstructured operating environments and will empower customers with cutting-edge capabilities and unlock new possibilities for efficiency and innovation.

Forward-Looking Statements
This press release contains forward-looking statements within the meaning of the Private Securities Litigation Reform Act of 1995, including future collaboration with the U.S. Air Force, future abilities of Sarcos’ AI/ML software, the expected benefits and performance of such software (including the impact of Sarcos’ software on how work is done), market adoption of and demand for the software and its applicability to robotic systems generally. Forward-looking statements are inherently subject to risks, uncertainties, and assumptions. Generally, statements that are not historical facts, including statements concerning possible or assumed future actions, business strategies, events, or results of operations, are forward-looking statements. These statements may be preceded by, followed by, or include the words “believes,” “estimates,” “expects,” “projects,” “forecasts,” “may,” “will,” “should,” “seeks,” “plans,” “scheduled,” “anticipates,” “intends” or “continue” or similar expressions. Such forward-looking statements involve risks and uncertainties that may cause actual events, results, or performance to differ materially from those indicated by such statements. These forward-looking statements are based on Sarcos’ management’s current expectations and beliefs, as well as a number of assumptions concerning future events. However, there can be no assurance that the events, results, or trends identified in these forward-looking statements will occur or be achieved. Forward-looking statements speak only as of the date they are made, and Sarcos is not under any obligation and expressly disclaims any obligation, to update, alter or otherwise revise any forward-looking statement, whether as a result of new information, future events, or otherwise, except as required by law.

Readers should carefully review the statements set forth in the reports which Sarcos has filed or will file from time to time with the Securities and Exchange Commission (the “SEC”), in particular the risks and uncertainties set forth in the sections of those reports entitled “Risk Factors” and “Cautionary Note Regarding Forward-Looking Statements,” for a description of risks facing Sarcos and that could cause actual events, results or performance to differ from those indicated in the forward-looking statements contained herein. The documents filed by Sarcos with the SEC may be obtained free of charge at the SEC’s website at www.sec.gov.
Sarcos PR and Investor Contacts:
mediarelations@sarcos.com
ir@sarcos.com
 
  • Like
  • Love
  • Fire
Reactions: 28 users