BRN Discussion Ongoing

Mt09

Regular
1699519328237.jpeg
 
  • Haha
  • Like
Reactions: 7 users

Getupthere

Regular

Morgan Stanley: AI on the 'edge' plays into Apple's strengths


Nov 8, 2023 | 7:32 AM


"We believe 2024 will be a catalyst year for 'Edge AI', and see Apple as one of 6 key beneficiaries." — Analyst Erik Woodring


From a note to Morgan Stanley clients that landed on my desktop Wednesday:


As AI's impact continues to diffuse across the technology stack, AI at 'The Edge' is emerging as a theme we believe will gain more traction in 2024, as new products come to market and new 'killer apps' emerge.


This is because 50% of all enterprise data will be created at the edge, according to Gartner, leaving open an opportunity for hardware OEMs to come to market with a new generation of smart devices to power AI at the edge.


We believe Apple will emerge as one of the key winners - or 'Edge AI Enablers' - in this race given the unique data from Apple's 2 billion+ devices and 1.2 billion+ users, Apple's focus on data privacy, and Apple's leading hardware, software, silicon and services vertical integration.


Ultimately, we see Edge AI as a multi-year tailwind that can improve Product and Services monetization and drive Apple's user base LTV higher, a key driver of long-term valuation...


Why 'The Edge', and why now? Gen AI's impact on cloud vendors has become clearer in 2023, but the impact at the edge has yet to materialize. However, as AI permeates into new consumer use cases, we expect the edge to become an emerging enabler of AI inferencing in 2024 given the benefits of lower query costs, improved latency, greater personalization, better data security/privacy, and easier accessibility (2). While constraints at the edge also exist - namely power consumption and processing power - we believe they are being addressed with more complex and powerful silicon, including Apple's A17 Pro SoC, which can process 35 trillion operations per second, giving it the capability to power - locally on the device - LLMs up to high single digit billions of parameters. We expect new battery tech, silicon, and edge devices to emerge in 2024 (and beyond), helping to spark investor interest in this theme.
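As a sanity check on that parameter claim, here's a back-of-envelope sketch (my own arithmetic, not Morgan Stanley's: it assumes roughly 2 operations per parameter per generated token and ignores memory bandwidth, which is usually the real limit):

```python
# Back-of-envelope: peak token throughput of an LLM on a 35 TOPS accelerator.
# Assumes ~2 ops per parameter per generated token (one multiply-accumulate
# per weight), and a peak -- not sustained -- compute rate.

def peak_tokens_per_second(tops: float, params_billions: float) -> float:
    ops_per_token = 2 * params_billions * 1e9
    return (tops * 1e12) / ops_per_token

for b in (3, 7, 9):
    print(f"{b}B params: ~{peak_tokens_per_second(35, b):,.0f} tokens/s peak")
```

Even at the "high single digit billions" end of the range, the raw compute budget leaves headroom; in practice memory bandwidth, not TOPS, usually gates on-device LLM inference.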


Maintains Overweight rating and $210 price target.


Cue the six-company "enabler" chart:


My take: AI built into the devices — that's what Tim Cook has been saying all along.
 
  • Like
  • Fire
Reactions: 19 users

Mt09

Regular
  • Haha
  • Like
Reactions: 6 users

wilzy123

Founding Member
You are right! I saw the line above for a different company. That is a good sign.

119,664 means that it will likely have been a little pissant retail shorter that got roasted LMAO...
 
  • Like
  • Fire
Reactions: 2 users
Never a dull moment BRN after the last couple of days huh :LOL:

I see our friends at CMU want someone.

Snipped a couple bits I liked...hopefully presents another channel at some point into US Govt.


Associate Researcher - Advanced Computing Lab​

Employer: Carnegie Mellon University | Location: Pennsylvania, United States | Salary: Not Specified | Posted: Nov 1, 2023

What We Do

At the SEI AI Division, we conduct research in applied artificial intelligence and the engineering challenges related to the practical design and implementation of AI technologies and systems. We currently lead a community-wide movement to mature the discipline of AI Engineering for Defense and National Security.

Requirements

Knowledge of Department of Defense and Intelligence Community.
You have a basic understanding of DoD or Intelligence Community software systems acquisition practices.

Duties

Solution development.
You’ll work with interdisciplinary teams to turn research results into prototype operational capabilities for government customers and stakeholders.

Build and demonstrate. You'll participate in the design and development of solutions that provide needed capabilities to the government, building on state-of-the-art research in high-performance computing and edge architectures, data analytics, information architectures, machine learning, artificial intelligence, software architectures, and human-machine interaction.

Technical Experimentation. You'll experiment with emerging technologies such as augmented and virtual reality, computer vision, and deep learning, and explore the latest in advanced hardware, including FPGAs, CGRAs, TPUs, and emergent technologies such as neuromorphic and analog processors.
 
  • Like
  • Fire
  • Love
Reactions: 16 users

GpHiggsBoson

Regular
How’s the losers at ASX.
What a bunch of useless eaters.

Lots going on now with BRN . AKIDA 2.0 more diverse and open to options for customers. This really can be a game changer. Let’s see where that takes us.
Good luck to all! And especially to us bolted/rusted on individuals. ❤️
 
  • Like
  • Love
  • Fire
Reactions: 32 users

GStocks123

Regular
  • Like
  • Fire
Reactions: 6 users
get those votes in ladies n gents (other w akida) 21h remaining!


3% to go, which equates to another 19 votes. Assuming the infamously misspelt "Assistent" gets another 4 votes, the BrainChip camp should be aiming for another 25 votes within the next 20 hours!
 
  • Like
  • Fire
Reactions: 6 users

GpHiggsBoson

Regular
Ok. I know this is a BRN forum and I have lots invested - BUT in the meantime you best get your heads into IMU too.
It’s gone ballistic the last 5 days…. so be warned, it might take a rest and pull back the covers.
Just keep your eyes on it. 2024 will be huge!
Sorry my mates here on BRN.
Carry on!
Moderators.. do your thing. ❤️
 
  • Like
  • Love
  • Fire
Reactions: 19 users

robsmark

Regular
  • Haha
  • Like
Reactions: 11 users
2052 will be our year
 
  • Haha
  • Like
  • Wow
Reactions: 15 users

cosors

👀
It has already been mentioned here. Now it has been published by Wevolver:

"The huge potential of sequential data analysis at the edge​

BrainChip Team
from BrainChip
09 Nov, 2023

1699533578014.png

How BrainChip's Akida technology is revolutionizing time series data analysis.​

Tags: artificial intelligence, big data, neural network
This article is based on "BrainChip Sees AI Gold in Sequential Data Analysis at the Edge" from the Cambrian AI website.
Amidst the whirlwind of excitement surrounding Large Language Model (LLM) generative AI, it's easy to lose sight of other AI domains that have seemingly dissolved into the shadows cast by ChatGPT's prominence. One underappreciated domain is the analysis of sequential data streams, which includes monitoring fluctuating stock prices and interpreting video feeds.
BrainChip has identified the adept handling of sequential data as a pivotal niche for deploying its Akida technology. Akida stands out for its prowess in Event-Based Neuromorphic computing, adeptly handling various neural network architectures like Vision Transformers (ViT), Convolutional Neural Networks (CNN), Temporal Event-based Neural Networks (TENN), and Recurrent Neural Networks (RNN).
This article explores the need for efficient edge AI for time-series data.

Sequential data analysis​

Sequential analysis refers to the process of analyzing and extracting insights from data that is collected and organized in chronological order. This data type typically involves measurements or observations taken at regular intervals over time. Time-series analysis techniques aim to understand data patterns, trends, and dependencies and make predictions or forecasts based on historical patterns.
A few use cases of sequential data analysis include:
1. Financial Analysis: Time-series analysis is extensively used in finance to study stock market trends, analyze economic indicators, and forecast future market conditions. It helps model asset prices, trends, risk assessment, and portfolio optimization.
2. Demand Forecasting: Sequential analysis is crucial in demand forecasting for retail, supply chain management, and manufacturing industries. By analyzing historical sales or demand data, businesses can predict future demand patterns and optimize their production, inventory, and supply chain accordingly.
3. Predictive Maintenance: Predictive maintenance can monitor equipment and machinery in real-time. Analyzing sensor data and historical patterns can help detect anomalies and predict potential failures, enabling proactive maintenance and minimizing downtime.

As well as applications in Energy Consumption Analysis and IoT Sensor Data Analysis.
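To make the predictive-maintenance case concrete, here is a minimal sketch of rolling z-score anomaly detection on a sensor stream (pure standard library; the window size and 3-sigma threshold are arbitrary illustrative choices):

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=20, threshold=3.0):
    """Flag readings more than `threshold` std-devs from the rolling mean."""
    history = deque(maxlen=window)
    anomalies = []
    for i, x in enumerate(readings):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(x - mu) / sigma > threshold:
                anomalies.append(i)
        history.append(x)
    return anomalies

# A mildly noisy vibration signal with one spike at index 30:
signal = [1.0, 1.1] * 30
signal[30] = 10.0
print(detect_anomalies(signal))  # flags index 30
```

A real deployment would learn the normal operating envelope rather than use a fixed threshold, but the principle is the same: model the recent past, flag what deviates from it.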

The market size for applying AI in time-series data analysis is continually growing as organizations recognize the value of extracting insights and making accurate predictions from temporal data. While specific market size figures for this realm are not readily available, the broader AI market, including applications in time-series analysis, is expected to grow substantially. According to a report by Grand View Research, the global AI market size was valued at USD 62.35 billion in 2020 and is projected to expand at a compound annual growth rate (CAGR) of 40.2% from 2021 to 2028. This growth encompasses various AI applications, including time-series analysis, across multiple industries.

Introducing the Dimension of Time to Neural Networks​

Traditional CNNs have been around for 30+ years and combine multiple hidden layers trained in a supervised manner. These are sequential and hence referred to as feed-forward neural networks. Recurrent neural networks (RNNs), developed around the turn of the century, added capability for more complex learning, such as language modeling. But for time-series applications, a machine learning engineer needed a combination of CNNs and a temporal network for spatial-temporal analysis. While academia developed networks that perform temporal convolution, these have so far been neither power-efficient nor easy enough to train to make it to the far Edge.

Temporal Event-based Neural Networks​

Temporal Event-based Neural Networks (TENNs) bring the dimension of time into the network itself. This approach simplifies training and reduces model size while maintaining accuracy, resulting in improved performance and efficiency for Edge AI devices.

Using TENNs for data analysis offers several advantages, such as the ability to learn the temporal structure of the data, which is crucial for tasks such as forecasting and anomaly detection. TENNs can make predictions for future time steps, and they are capable of learning from large datasets of time series data. Overall, these features make TENNs a valuable tool for data analysis in various domains.
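The article doesn't show the TENN math, but one-step-ahead forecasting on learned temporal structure can be illustrated generically — here with simple exponential smoothing, a deliberately simpler stand-in:

```python
def exp_smoothing_forecast(xs, alpha=0.5):
    """One-step-ahead forecasts via simple exponential smoothing:
    level = alpha * observation + (1 - alpha) * previous level.
    The forecast for each step is the level *before* seeing that step."""
    level = xs[0]
    forecasts = []
    for x in xs[1:]:
        forecasts.append(level)  # forecast made before observing x
        level = alpha * x + (1 - alpha) * level
    return forecasts

print(exp_smoothing_forecast([2.0, 4.0, 6.0]))  # [2.0, 3.0]
```

A TENN learns a far richer temporal model than a single smoothing coefficient, but the contract is the same: consume the series in order, carry compact state forward, and emit a prediction for the next step.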

1699533635331.png


Beyond traditional Edge time series applications, BrainChip suggests that TENNs could minimize or eliminate the need for DSP filtering of raw audio signals and vital signs in health monitoring. Compact solutions for wearable, hearing, and implantable devices with minimal power consumption would be a significant advance for preventative healthcare.

TENNs also excel at treating streaming inputs as a time series of frames, performing 3D convolutions composed of a temporal convolution on the time axis and a spatial convolution on the XY plane. This is done efficiently to improve the detection of higher-resolution video objects in low-power scenarios.
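The factorized "3D = temporal + spatial" convolution described above can be sketched in a few lines of NumPy (an illustrative toy, not BrainChip's implementation; single channel and "valid" padding are assumed):

```python
import numpy as np

def temporal_then_spatial(video, k_t, k_s):
    """Factorized spatiotemporal convolution: a 1-D convolution along the
    time axis, then a 2-D convolution over each frame's XY plane.
    video: (T, H, W), k_t: (Kt,), k_s: (Kh, Kw). 'Valid' padding."""
    T, H, W = video.shape
    Kt = len(k_t)
    # Temporal pass: weighted sum of Kt consecutive frames.
    temporal = np.stack([
        sum(k_t[j] * video[t + j] for j in range(Kt))
        for t in range(T - Kt + 1)
    ])
    # Spatial pass: slide the 2-D kernel over every smoothed frame.
    Kh, Kw = k_s.shape
    out = np.zeros((temporal.shape[0], H - Kh + 1, W - Kw + 1))
    for t, frame in enumerate(temporal):
        for i in range(H - Kh + 1):
            for j in range(W - Kw + 1):
                out[t, i, j] = np.sum(frame[i:i + Kh, j:j + Kw] * k_s)
    return out
```

The payoff of the factorization is cost: a full Kt x Kh x Kw kernel needs Kt*Kh*Kw multiplies per output, while the factorized form needs only Kt + Kh*Kw — the same economy that makes spatial-temporal convolutions attractive at the edge.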

1699533671675.png


TENNs offer developers the flexibility to configure them in either buffered temporal convolution or recurrent modes, enabling them to adapt the network to their specific application requirements. Furthermore, TENNs can be efficiently trained on parallel hardware, such as GPUs and TPUs in the cloud, similar to convolutional networks, while retaining the compactness of RNNs for efficient inference at the edge. This approach helps to minimize the exponentially growing cost of training, which is a constant concern.
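The buffered-versus-recurrent duality is easy to demonstrate for a plain causal temporal convolution (a toy sketch, not the actual TENN formulation): both modes produce identical outputs, but the recurrent mode keeps only a kernel-sized buffer of state.

```python
from collections import deque

def buffered_conv(xs, kernel):
    """Buffered mode: convolve the whole recorded sequence at once
    (causal, 'valid' -- output starts once the kernel is fully covered)."""
    k = len(kernel)
    return [sum(kernel[j] * xs[t - k + 1 + j] for j in range(k))
            for t in range(k - 1, len(xs))]

def recurrent_conv(xs, kernel):
    """Recurrent mode: stream samples one at a time, keeping only a
    kernel-sized buffer of state -- same outputs, O(k) memory."""
    k = len(kernel)
    state = deque(maxlen=k)  # oldest sample first, matching kernel[0]
    out = []
    for x in xs:
        state.append(x)
        if len(state) == k:
            out.append(sum(w * s for w, s in zip(kernel, state)))
    return out

xs, kernel = [1.0, 2.0, 3.0, 4.0, 5.0], [0.25, 0.5, 0.25]
assert buffered_conv(xs, kernel) == recurrent_conv(xs, kernel)
```

This is the trade the article describes: buffered mode parallelizes well for training on GPUs/TPUs, while recurrent mode gives the small memory footprint needed for inference at the edge.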

Akida 2nd Generation Processor and TENNs​

The BrainChip Akida processor is a digital portable processor IP inspired by the energy-efficient functionality of the human brain. Unlike traditional neuromorphic approaches, which are analog, it utilizes fully digital technology and offers a range of capabilities, including image classification, semantic segmentation, odor recognition, and time-series analysis. It supports various neural network architectures, including TENNs.

The Akida processor utilizes highly parallel event-based neural processing cores, merging neuromorphic processing with native support for traditional convolutional capabilities and functions, along with hardware support for TENN networks. Its neuromorphic processing cores communicate using sparse, asynchronous events, making it ideal for efficient time-series data analysis and managing high-speed, asynchronous, and continuous data streams.

1699533706975.png


BrainChip’s Akida allows the analysis of vision, video, and three-dimensional data as a time series, improving video object detection. Akida's support for spatial-temporal convolutions enhances speed and reduces energy consumption. This ability is possible because the processor features a high-speed, low-power digital design optimized for edge computing applications, offering real-time processing and low-latency analysis. It can handle structured and unstructured data, learning and recognizing patterns from streaming data, making it suitable for time-series data analysis.

1699533748058.png


Additionally, Akida includes a traditional CNN accelerator, as well as TENN and ViT logic, providing a comprehensive solution for sequential processing. The Akida processor is particularly effective for real-time data classification, anomaly detection, and predictive analytics. Consequently, the second-generation Akida processor, with TENN support, extends its efficiency and accelerated hardware solutions to multi-dimensional applications like video object detection and vision using event-based processing paradigms.

Conclusion​

The rapid expansion of the AI field has primarily focused on image processing and LLMs, leaving a significant gap in the development of sequential data analysis. BrainChip's Akida technology, featuring TENNs, simplifies training, reduces model size, and enhances performance and efficiency in time-series data analysis.

This innovative approach can be applied in different industries, including finance, demand forecasting, predictive maintenance, energy consumption analysis, and IoT data analysis, meeting the growing demand for effective AI solutions in these domains. The Akida processor extends its efficiency to multi-dimensional applications improving real-time data analysis, predictive analytics, and video object detection for event-based processing."
https://www.wevolver.com/article/the-huge-potential-of-sequential-data-analysis-at-the-edge
 
  • Like
  • Love
  • Fire
Reactions: 75 users

Diogenese

Top 20
I think it's pretty interesting that Qualcomm invested in SiFive to drive rapid growth in RISC-V. And SiFive and BrainChip are partners. And everyone who is anyone is talking about WEARABLES!!!!

I've never said "wearables" so many times in my life as I have over the last couple of days. When you repeat it over and over it sounds really funny! Wearables, wearables, wearables...warble, warble..
Hi cosors,

@Bravo was just banging on about wearables, and then your post contains this:

Beyond traditional Edge time series applications, BrainChip suggests that TENNs could minimize or eliminate the need for DSP filtering of raw audio signals and vital signs in health monitoring. Thus, offering compact solutions for wearable, hearing, and implantable devices with minimal power consumption is a significant advance for preventative healthcare.
 
  • Like
  • Fire
  • Love
Reactions: 54 users
Good to see companies like Lenovo also starting to acknowledge neuromorphic computing and its potential.

Seems like an eternity since the "wancas" were front and centre, and neuromorphic is now becoming more commonplace in industry vernacular.

IMG_20231109_213415.jpg


 
  • Like
  • Fire
Reactions: 26 users

Tothemoon24

Top 20

indie Semiconductor Launches Breakthrough Computer Vision Processor

  • Expands indie’s Growing and Class-leading Camera Video Processor Portfolio
  • Meets OEM Demands for Scalable ADAS Vision Architectures for Mass-market Deployments
  • Provides both Viewing and Sensing Functions via an Unparalleled Solution
October 24, 2023 09:00 AM Eastern Daylight Time
ALISO VIEJO, Calif.--(BUSINESS WIRE)--indie Semiconductor (Nasdaq: INDI), an Autotech solutions innovator, has expanded its automotive camera video processor portfolio with the commercial release of iND87540, a highly integrated system-on-chip (SoC) that enables viewing and sensing capability at the vehicle’s edge.
“By incorporating real-time video processing with object detection into a single SoC, indie is paving the way for multiple vision-enabled safety and convenience use cases across OEMs’ vehicle classes.”
As government regulators, new car safety assessors and consumers demand higher-performance driver and road user safety features, automakers are increasingly seeking camera-based Advanced Driver Assistance System (ADAS) solutions that enable volume scalability across their vehicle classes. This demands a 'distributed intelligence' architectural approach to vision sensing, and high levels of integration coupled with low power consumption to meet the needs of mass market deployments. indie's iND87540 was developed to address these challenging design requirements. iND87540 is also a preferred solution by end customers as a pre-processor for powerful central compute and provides real-time image processing for optimal detection performance.
Integrating real-time, on-chip image signal processing (ISP), digital signal processing (DSP) and customized hardware, the iND87540 enables viewing and sensing capabilities within the stringent power, latency and compact form factor needed for scalable vision architectures. The SoC provides computer vision processing that can run a range of algorithms, enabling ADAS functions including pedestrian detection, object detection, blind spot detection, cross traffic alerts as well as driver and occupant monitoring (DMS/OMS). indie complements this class-leading SoC hardware with value-add proprietary high-performance embedded algorithms such as auto calibration (AutoCAL®) and dirty lens detection.
“As OEMs strive to deploy vision-based viewing and sensing across their model ranges, distributed intelligence is emerging as a critical enabler for the proliferation of vision-based ADAS applications. Our launch of indie’s iND87540 is capitalizing on this industry dynamic: delivering high-performance vision processing, without sacrificing the power, cost and size demands of the volume market,” said Abhay Rai, EVP and GM of indie’s Vision Business Unit. “By incorporating real-time video processing with object detection into a single SoC, indie is paving the way for multiple vision-enabled safety and convenience use cases across OEMs’ vehicle classes.”
According to S&P Global, shipments of automotive electronic control units (ECUs) incorporating vision-based processing are expected to grow from 232 million units in 2022, to nearly 400 million units by 2027.
The iND87540 has been developed to ISO 26262 ASIL-B requirements, is AEC-Q100 Grade 2 qualified and hosts the AutoSAR software stack. The SoC is available now for vehicle integration and is currently in advanced development with select Tier 1 customers.
About indie
indie is empowering the Autotech revolution with next generation automotive semiconductors and software platforms. We focus on developing innovative, high-performance and energy-efficient technology for ADAS, user experience and electrification applications. Our mixed-signal SoCs enable edge sensors spanning Radar, LiDAR, Ultrasound, and Computer Vision, while our embedded system control, power management and interfacing solutions transform the in-cabin experience and accelerate increasingly automated and electrified vehicles. We are an approved vendor to Tier 1 partners and our solutions can be found in marquee automotive OEMs worldwide. Headquartered in Aliso Viejo, CA, indie has design centers and regional support offices across the United States, Canada, Argentina, Scotland, England, Germany, Hungary, Morocco, Israel, Japan, South Korea, Switzerland and China.
 
  • Like
Reactions: 6 users

Tothemoon24

Top 20
IMG_7777.jpeg


See what traditional frame-based cameras can’t with Metavision® neuromorphic sensing for AR/VR/XR, wearables, IoT and more.

Build your next consumer product with applications that redefine user experience:
👁️ Eye tracking
👋 Gesture recognition
🚶 Fall detection
🌍 Inside-out tracking (user localization and environment reconstruction)
🚗 Driver Monitoring Systems
✨ Constellation tracking (Ultra-fast LED tracking)

Ready to give your application an edge?
Get inspired by watching Event-based Metavision application videos: https://bit.ly/3RMwwap

#machinevision #neuromorphic #IoT #AR #VR #XR #wearables #eventbasedvision #UX #userexperience
 
  • Like
  • Love
  • Thinking
Reactions: 19 users

cosors

👀
After a longer time I went to another forum because the ASX announcements don't work here at the moment. Interesting posts:


"You have hit the nail on the head xxx with 'it is hard to know what they are doing'. Most who have read my posts know my position with regard to shorters, but I would like to throw another issue into the ring which questions the legality of the practice. Shorters know exactly when they are going to dump and control the price and there is NO requirement for them to notify the ASX. A company is required by law to release information that might influence the price of its stock or face up to ASIC. ASX when issuing speeding notices and put the question:

Does XXX consider the Information to be information that a reasonable person would expect to have a material effect on the price or value of its securities?

My point here is that lenders and shorters are aware of when there is going to be a 'dump' and when the cover is going to happen. That is not information to which 'a reasonable person' has access. I am also very sceptical as to the information the shorters / lenders have access to as being the same information that 'a reasonable person' would have. So, how does lending and shorting differ from insider trading? How can authorities condone such practises? Why does the ASX not make this information available to the 'reasonable person' until at least 2 days after it happens? Why is there a 'window of grace' before a substantial holder has to notify the ASX?

I know I am pissing into the wind but us reasonable persons can raise our voices in unison perhaps?"

and

"Only the 'trading participants' on the ASX will get access to those clearing house rates. Everyone else has to trade via a broker who will be a trading participant. Trading participants evolved from stock exchange 'members' who were the brokers who mutually owned the various state stock exchanges that were amalgamated to form the ASX. After the ASX was demutualised (and listed on its own exchange) the members became trading participants, for which privilege they need to be a 'fit and proper person' to act in this capacity, and meet minimum requirements of capital and liquidity. Presumably they pay a big sub, and I imagine they will have to lodge a large bond.

The brokerages and other financial institutions are not just using the services of the ASX, they are actually part of the system. They have had a big hand in creating the rules of the game we play by. Naturally these do not favour the little guy. Our only advantages are that we can think independently and be more nimble."

A S X
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 30 users

Esq.111

Fascinatingly Intuitive.
Morning Chippers,

Check this new mob out.

Read down the article and click on the 10-minute launch video demo, amazing, particularly where the device projects onto one's palm whilst being controlled by hand movements.

Apparently uses the Snapdragon Chip.


Regards,
Esq.
 
Last edited:
  • Like
  • Fire
  • Wow
Reactions: 25 users
Renesas just released a message saying MCUs with AI accelerators..
 
  • Like
  • Fire
  • Thinking
Reactions: 12 users