BRN Discussion Ongoing

davidfitz

Regular
Hi all,
Curious to hear how many of us have registered for the Virtual Roadshow tomorrow and if you have submitted any questions you would like answered.

I have registered

AKIDA BALLISTA
I am unable to attend, however wouldn't it be nice to know how many chips are available for the Edge Box? They are taking pre-orders and payment is required upfront in full. If they have 2,000 chips available (I think there are 2 in every box?) then it is not unreasonable to expect sales for this quarter to be approx. $800k. I know Sean/the Board don't like to make such forecasts, but what's the harm in asking ;)
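A rough back-of-the-envelope version of that estimate, just to show the arithmetic (the chip count, the two-chips-per-box figure and the ~US$799 pre-order price are all assumptions from the post and the public listing, not company guidance):

# Rough Edge Box pre-order revenue estimate -- all inputs are assumptions, not company figures.
chips_available = 2000          # assumed chips on hand
chips_per_box = 2               # assumed two Akida chips per Edge Box
price_per_box_usd = 799         # assumed pre-order list price

boxes = chips_available // chips_per_box        # 1,000 boxes
revenue_usd = boxes * price_per_box_usd         # ~US$799,000
print(f"{boxes} boxes -> ~US${revenue_usd:,}")  # roughly the $800k figure above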
 
  • Like
  • Fire
Reactions: 9 users

schuey

Regular
Someone's trying hard to push us down. It seems the psychology of the shorters is to build up the momentum to Friday, only to manipulate on Monday so retail holders panic due to the price fall.
Bunch of low-life bottom feeders.
Ass monkeys
 
  • Fire
  • Like
  • Haha
Reactions: 5 users

mrgds

Regular
Hi all,
Curious to hear how many of us have registered for the Virtual Roadshow tomorrow and if you have submitted any questions you would like answered.

I have registered

AKIDA BALLISTA
I've just had confirmation from TD that Dr Tony Lewis (CTO) will be presenting on tomorrow's Virtual Roadshow alongside SH (CEO).

I've been wanting to hear from our new CTO for a while now. (y)
 
  • Like
  • Fire
  • Love
Reactions: 81 users

Newk R

Regular
Lots of mini sales now. And the sellers seem to be falling for the trap.
 
  • Like
  • Fire
Reactions: 6 users

Tothemoon24

Top 20
Telco? Special Sauce?





Optus, Ericsson join forces on energy savings


While results reveal the cost of the November outage to the carrier.


Optus expects to save as much as 26 percent of the energy consumed by its radio access network, after deploying a range of energy efficiency solutions in conjunction with Ericsson.


The technologies are software tools called Massive MIMO Sleep mode and Booster Carrier Sleep.


Massive MIMO Sleep lets radios hibernate when there’s no traffic, saving around 70 percent of the energy that would be consumed if the radios were fully active.


Booster Carrier Sleep allows the radio carriers to be switched on and off dynamically based on traffic.


The two companies are showing off the solutions at Mobile World Congress in Barcelona.


The software has been deployed at trial sites in Sydney and Melbourne, the companies said in a statement.


They said Massive MIMO Sleep mode is saving 2.5kWh per day, per site, while the Booster Carrier Sleep feature saves 2.23kWh per day, per site.


Combined with modernisation and enhancements completed during 2023, Optus and Ericsson said, daily energy savings at radio sites could reach between 24 percent and 26 percent.


“The enhancements made possible by the Ericsson Massive MIMO Sleep mode and Booster Carrier Sleep solutions are helping Optus take a step closer to our sustainability commitment to reduce our 2025 emissions by 25 percent, from a 2015 baseline and ensure network capacity and performance is maintained, even as power needs are minimised," managing director of networks Lambo Kanagaratnam said.
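To put the per-site figures in context, here is a quick sketch of what the two sleep features alone add up to (the daily figures are from the article; the annualisation is my own illustrative arithmetic, and note the quoted 24-26 percent also includes the 2023 modernisation work):

# Per-site daily savings quoted for the two Ericsson software features.
massive_mimo_sleep_kwh = 2.5       # kWh per day per site (from the article)
booster_carrier_sleep_kwh = 2.23   # kWh per day per site (from the article)

daily_per_site = massive_mimo_sleep_kwh + booster_carrier_sleep_kwh   # 4.73 kWh/day
annual_per_site = daily_per_site * 365                                # ~1,726 kWh/year

print(f"Per site: {daily_per_site:.2f} kWh/day, ~{annual_per_site:,.0f} kWh/year")
# The 24-26% total saving in the article also counts modernisation done during 2023,
# so these two features are only part of that headline figure.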
 
  • Like
  • Fire
  • Thinking
Reactions: 15 users
Whilst we don't get a direct mention (and I get it, being the "new kid" on the block in their world), I do like some of the comments and can see where we could piggyback and add value in the ecosystem of IFS and ARM in particular.



Intel Launches World’s First Systems Foundry Designed for the AI Era

February 22, 2024 | BUSINESS WIRE
Estimated reading time: 4 minutes

Intel Corp. (INTC) launched Intel Foundry as a more sustainable systems foundry business designed for the AI era and announced an expanded process roadmap designed to establish leadership into the latter part of this decade. The company also highlighted customer momentum and support from ecosystem partners – including Synopsys, Cadence, Siemens and Ansys – who outlined their readiness to accelerate Intel Foundry customers’ chip designs with tools, design flows and IP portfolios validated for Intel’s advanced packaging and Intel 18A process technologies.

The announcements were made at Intel’s first foundry event, Intel Foundry Direct Connect, where the company gathered customers, ecosystem companies and leaders from across the industry. Among the participants and speakers were U.S. Secretary of Commerce Gina Raimondo, Arm CEO Rene Haas, Microsoft CEO Satya Nadella, OpenAI CEO Sam Altman and others.

More: Intel Foundry Direct Connect (Press Kit)

“AI is profoundly transforming the world and how we think about technology and the silicon that powers it,” said Intel CEO Pat Gelsinger. “This is creating an unprecedented opportunity for the world’s most innovative chip designers and for Intel Foundry, the world’s first systems foundry for the AI era. Together, we can create new markets and revolutionize how the world uses technology to improve people’s lives.”

Process Roadmap Expands Beyond 5N4Y


Intel’s extended process technology roadmap adds Intel 14A to the company’s leading-edge node plan, in addition to several specialized node evolutions. Intel also affirmed that its ambitious five-nodes-in-four-years (5N4Y) process roadmap remains on track and will deliver the industry’s first backside power solution. Company leaders expect Intel will regain process leadership with Intel 18A in 2025.

The new roadmap includes evolutions for Intel 3, Intel 18A and Intel 14A process technologies. It includes Intel 3-T, which is optimized with through-silicon vias for 3D advanced packaging designs and will soon reach manufacturing readiness.

Also highlighted are mature process nodes, including new 12 nanometer nodes expected through the joint development with UMC announced last month. These evolutions are designed to enable customers to develop and deliver products tailored to their specific needs. Intel Foundry plans a new node every two years and node evolutions along the way, giving customers a path to continuously evolve their offerings on Intel’s leading process technology.

Intel also announced the addition of Intel Foundry FCBGA 2D+ to its comprehensive suite of ASAT offerings, which already include FCBGA 2D, EMIB, Foveros and Foveros Direct.

Microsoft Design on Intel 18A Headlines Customer Momentum

Customers are supporting Intel’s long-term systems foundry approach. During Pat Gelsinger’s keynote, Microsoft Chairman and CEO Satya Nadella stated that Microsoft has chosen a chip design it plans to produce on the Intel 18A process.

“We are in the midst of a very exciting platform shift that will fundamentally transform productivity for every individual organization and the entire industry,” Nadella said. “To achieve this vision, we need a reliable supply of the most advanced, high-performance and high-quality semiconductors. That’s why we are so excited to work with Intel Foundry, and why we have chosen a chip design that we plan to produce on Intel 18A process.”

Intel Foundry has design wins across foundry process generations, including Intel 18A, Intel 16 and Intel 3, along with significant customer volume on Intel Foundry ASAT capabilities, including advanced packaging.

In total, across wafer and advanced packaging, Intel Foundry’s expected lifetime deal value is greater than $15 billion.

IP and EDA Vendors Declare Readiness for Intel Process and Packaging Designs

Intellectual property and electronic design automation (EDA) partners Synopsys, Cadence, Siemens, Ansys, Lorentz and Keysight disclosed tool qualification and IP readiness to enable foundry customers to accelerate advanced chip designs on Intel 18A, which offers the foundry industry’s first backside power solution.

These companies also affirmed EDA and IP enablement across Intel node families.
At the same time, several vendors announced plans to collaborate on assembly technology and design flows for Intel’s embedded multi-die interconnect bridge (EMIB) 2.5D packaging technology. These EDA solutions will ensure faster development and delivery of advanced packaging solutions for foundry customers.

Intel also unveiled an "Emerging Business Initiative" that showcases a collaboration with Arm to provide cutting-edge foundry services for Arm-based system-on-chips (SoCs). This initiative presents an important opportunity for Arm and Intel to support startups in developing Arm-based technology and offering essential IP, manufacturing support and financial assistance to foster innovation and growth.

Systems Approach Differentiates Intel Foundry in the AI Era


Intel’s systems foundry approach offers full-stack optimization from the factory network to software. Intel and its ecosystem empower customers to innovate across the entire system through continuous technology improvements, reference designs and new standards.

Stuart Pann, senior vice president of Intel Foundry at Intel said, “We are offering a world-class foundry, delivered from a resilient, more sustainable and secure source of supply, and complemented by unparalleled systems of chips capabilities. Bringing these strengths together gives customers everything they need to engineer and deliver solutions for the most demanding applications.”

Global, Resilient, More Sustainable and Trusted Systems Foundry

Resilient supply chains must also be increasingly sustainable, and today Intel shared its goal of becoming the industry’s most sustainable foundry. In 2023, preliminary estimates show that Intel used 99% renewable electricity in its factories worldwide. Today, the company redoubled its commitment to achieving 100% renewable electricity worldwide, net-positive water and zero waste to landfills by 2030. Intel also reinforced its commitment to net-zero Scope 1 and Scope 2 GHG emissions by 2040 and net-zero upstream Scope 3 emissions by 2050.
 
  • Like
  • Love
  • Fire
Reactions: 33 users
I've just had confirmation from TD that Dr Tony Lewis (CTO) will be presenting on tomorrow's Virtual Roadshow alongside SH (CEO).

I've been wanting to hear from our new CTO for a while now. (y)
Brilliant
 
  • Like
  • Fire
Reactions: 10 users

IloveLamp

Top 20
(image attachment: 1000013583.jpg)
 
  • Like
  • Fire
  • Love
Reactions: 18 users

Diogenese

Top 20
Hi ILL,

This is an exciting find.

Fact Finder recently reminded us that Akida 2P was expanded from 80 nodes to 256 nodes maximum.

I think that this will be useful for LLMs and, in addition, for AI acceleration in cloud servers.

I've picked the eyes out of the article for support for the idea of Akida 2P in the cloud, but the whole article is well worth reading:


https://www.eetimes.com/arm-updates-css-designs-for-hyperscalers-custom-chips/


CSSes are Arm’s oven-ready designs that combine key SoC elements to give customers a head start when designing custom SoCs. Arm also has an ecosystem of design partners to help with implementation, if required. The overall aim is to make the path to custom silicon faster and more accessible. The recently announced Microsoft Cobalt 100 is based on second-gen CSS (specifically, CSS-N2).

...
Awad said hyperscalers choose Arm because the availability of CSSes means custom solutions can be created quickly and combined with Arm’s robust ecosystem.
“What we’re hearing from everywhere is that, generally speaking for hyperscalers and many of these OEMs, general-purpose compute is just not keeping up, meaning an off-the-shelf SoC is not keeping up,” he said. “We’re really optimistic about [CSSes] and we’ve seen tremendous traction with these platforms.”
The driver for hyperscalers’ desire to build their own chips is undoubtedly AI.
Awad said Arm has customers running AI inference at scale on Arm-based CPUs, in part down to the cost of custom accelerators, and in part down to their availability. (The market-leading data center GPU, Nvidia H100, is in notoriously short supply.) CPUs are widely available and very affordable compared with other options, he said.


...
Arm also expects its CSS designs to be used in tightly-coupled CPU-plus–accelerator designs, analogous to Nvidia Grace Hopper, which is optimized for memory capacity and bandwidth, Awad said.
CSSes don’t just work for hyperscalers; they can also support smaller companies, particularly through the Arm Total Design ecosystem of design partners, he said.
“[Smaller] companies are looking to get to market as quickly as possible to launch their solutions to capture market share, to establish themselves,” he said. “They’re also looking for a level of flexibility so that they can focus their innovation, and then they obviously need the performance to run some of these workloads.”
Collaborative relationship
With CSS, Arm takes responsibility for configuring, optimizing and validating a compute subsystem so the hyperscaler can focus on system-level workload-specific differentiation they care about, whether that’s software tuning, custom acceleration, or something else, said Dermot O’Driscoll, vice president of product solutions for Arm’s infrastructure line of business.
“They get faster time to market, they reduce the cost of engineering, and yet they take advantage of the same leading edge processor technology,” he said. “We created the CSS program to give customers the same kind of control of the silicon stack as they have over their software and system stacks today. This is a close collaborative relationship and our partners push us really hard to raise our game.”

...
The CSS-N3 offers a 20% performance-per-Watt improvement per core over the CSS-N2. This CSS design comes with between 8 and 32 cores, with the 32-core version using as little as 40 W. It’s intended for telecoms, networking, DPU, and cloud applications and can be used with on-chip or separate AI accelerators. The new N3 core is based on Armv9.2 and includes 2 MB private L2 cache per core. It supports the latest versions of PCIe, CXL and UCIe.
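A quick sanity check on those CSS-N3 numbers (the per-core power is just the quoted 40 W spread over 32 cores; the equal-power throughput comparison against CSS-N2 is an illustrative reading of the 20% perf/W claim, not a figure from the article):

# CSS-N3 headline figures from the article, plus some illustrative arithmetic.
n3_cores = 32
n3_power_w = 40.0              # "as little as 40 W" for the 32-core configuration
perf_per_watt_gain = 0.20      # 20% perf-per-watt improvement per core over CSS-N2

watts_per_core = n3_power_w / n3_cores     # 1.25 W per core at that operating point
print(f"~{watts_per_core:.2f} W per core")

# Illustrative only: at equal power, a 20% perf/W gain implies roughly 20% more
# throughput than an equivalent CSS-N2 setup (assumed baseline, not from the article).
print(f"Relative throughput vs CSS-N2 at equal power: ~{1 + perf_per_watt_gain:.2f}x")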
 
  • Like
  • Fire
  • Love
Reactions: 38 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
We're trying hard but not really getting anywhere today.

(animated GIF: 1274087544_cats-on-a-slide.gif)
 
  • Haha
  • Like
  • Love
Reactions: 35 users

7für7

Top 20
Possible price-sensitive announcement later? 🫠
 
  • Like
Reactions: 5 users

mrgds

Regular
NIIIIIIIICCCCCEEEE .......................... 49 MILLION TRADED AND WE HELD
SUCK ON THAT YOU SCUM SHORTING F#*KS


GREY BABY ................................................................YEAH !!!!!!!!!!!
(screenshot attachment: Screenshot (17).png)
 
  • Haha
  • Like
  • Fire
Reactions: 49 users

7für7

Top 20
NIIIIIIIICCCCCEEEE .......................... 49 MILLION TRADED AND WE HELD
SUCK ON THAT YOU SCUM SHORTING F#*KS


GREY BABY ................................................................YEAH !!!!!!!!!!!
View attachment 57894
I hope you wouldn't spend your money on such a cheap-quality car. But the colour is acceptable today.
 
  • Haha
  • Like
Reactions: 4 users

AARONASX

Holding onto what I've got
That last cross-trade would have cost them more to execute than the share itself was worth.

(screenshot: 1708924811319.png)
 
  • Like
  • Haha
Reactions: 9 users

gilti

Regular
The cut-off time in the closing auction is supposed to be set by a random number generator, between 1 and 60 seconds after 4:10. It is astounding how often the final transaction in the auction is just one share that happens to drop the price by half a cent. The timing is fantastic. Level playing field, my arse.
 
  • Like
  • Haha
  • Fire
Reactions: 27 users

mrgds

Regular
I hope you wouldn't spend your money on such a cheap-quality car. But the colour is acceptable today.
🤣 ....................... Quality will be the same, as Mercedes-Benz will just put their badge on a Chinese-built EV in the years to come.
Apart from Tesla, all European/American brands will be Chinese EVs badged as their own brands over the next 5 years. Sad reality unfolding now.
 
Last edited:
  • Like
  • Sad
  • Fire
Reactions: 9 users

Teach22

Regular
The cut-off time in the closing auction is supposed to be set by a random number generator, between 1 and 60 seconds after 4:10. It is astounding how often the final transaction in the auction is just one share that happens to drop the price by half a cent. The timing is fantastic. Level playing field, my arse.
It’s even more astounding the number of posts we see on this matter…..
 
  • Haha
  • Like
Reactions: 5 users

7für7

Top 20
🤣 ....................... Quality will be the same, as Mercedes-Benz will just put their badge on a Chinese-built EV in the years to come.
Apart from Tesla, all European/American brands will be Chinese EVs badged in their own colours. Sad reality unfolding now.
Yeah… for sure… and Mullen will buy them all, right?
 
  • Like
Reactions: 1 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Surely this has got to be good for us?

(animated GIF: cats-what.gif)





The edge-native software platform simplifies development, deployment and management of edge AI applications.
BARCELONA, Spain, February 26, 2024
--(BUSINESS WIRE)--Intel (Nasdaq: INTC):
What’s New: At MWC 2024, Intel announced its new Edge Platform, a modular, open software platform enabling enterprises to develop, deploy, run, secure, and manage edge and AI applications at scale with cloud-like simplicity. Together, these capabilities will accelerate time-to-scale deployment for enterprises, contributing to improved total cost of ownership (TCO).
"The edge is the next frontier of digital transformation, being further fueled by AI. We are building on our strong customer base in the market and consolidating our years of software initiatives to the next level in delivering a complete edge-native platform, which is needed to enable infrastructure, applications and efficient AI deployments at scale. Our modular platform is exactly that, driving optimal edge infrastructure performance and streamlining application management for enterprises, giving them both improved competitiveness and improved total cost of ownership."

– Pallavi Mahajan, Intel corporate vice president and general manager of Network and Edge Group Software
Why It Matters: The amount of compute happening at the edge is growing fast because that is where data is generated. In addition, many edge computing deployments are incorporating AI. At the edge, businesses need to automate for many reasons: to achieve pricing competitiveness, to relieve the effects of labor shortages, to expand innovation, to add efficiency, to improve time to market and to deliver new services.
However, working at the edge is often complex and challenging for a variety of reasons:
  • Difficulty building performant edge AI solutions with high return on investment (ROI) across a range of use cases in a specific industry.
  • The diversity of hardware, software and even power requirements at the edge.
  • Lack of secure and cost-effective methods to move and utilize high data volumes required by AI at the edge while maintaining low latency.
  • Increasingly complex operations management of distributed edge devices and applications at scale.
Use cases – with examples including defect detection and preventive maintenance in industrial facilities, frictionless checkout and inventory management in retail, and traffic management and emergency safety in smart cities/transportation – typically require advanced networking and AI analytics at the edge with low latency, locality and cost requirements to meet stringent real-world needs. Additionally, a mix of on-premises analytics with aggregation and placement of some AI processing in the cloud to manage global deployment locations is common. These hybrid AI scenarios require a software platform built to handle them.
And while custom solutions to challenges are available today, they are often built on closed systems and specialized hardware. This makes integrating legacy systems and adding new use cases both costly and time-consuming.
How Intel’s Edge Platform Empowers Enterprises: The open, modular platform will enable ready-made solutions across industries. By leveraging Intel’s edge experience and broad ecosystem to make the most in-demand edge use cases available, enterprises can purchase a complete solution or build their own in existing environments. Enterprise developers can build edge-native AI applications on new or existing infrastructure, and they can manage edge solutions end-to-end for their specific use cases.
The platform provides infrastructure management and AI application development capabilities that can integrate into existing software stacks via open standards.
About Edge-Native Infrastructure: The platform’s edge infrastructure has built-in OpenVINO™ AI inference runtime for edge AI as well as a secure, policy-based automation of IT and OT management tasks. Intel’s OpenVINO has evolved over the past five years to help developers optimize applications for low latency, low power and deployment on existing hardware specifically at the edge, enabling standard hardware already deployed to run AI applications efficiently without costly upgrades or refactoring.
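For a feel of what that built-in runtime does, here is a minimal OpenVINO inference sketch using the generic OpenVINO 2.0 Python API (the model file, device choice and input shape are placeholders of mine; this is not the Edge Platform's own tooling):

# Minimal OpenVINO inference sketch -- model path and input shape are hypothetical.
import numpy as np
from openvino.runtime import Core   # OpenVINO 2.0 Python API

core = Core()
model = core.read_model("model.xml")          # IR file exported for whatever model you run
compiled = core.compile_model(model, "CPU")   # target the CPU already deployed at the edge

dummy_input = np.zeros((1, 3, 224, 224), dtype=np.float32)  # placeholder image tensor
results = compiled([dummy_input])             # dict keyed by the model's output ports
print(results[compiled.output(0)].shape)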
The platform has a single dashboard that enables IT and DevOps personnel to provision, onboard and manage a fleet of edge nodes, including edge servers, industrial controls, HMI devices and others. This is accomplished securely and remotely with zero touch, across day 0/1/2 operations.
Furthermore, closed-loop automation enables operators to leverage policies and observability to trigger business logic from operational alerts at the edge, optimizing operations across the network and improving TCO.
Deep, heterogeneous hardware awareness gives best-in-class capabilities to allocate resources for optimal efficiency, as well as zero-trust security features co-developed for Intel architecture.
About Edge AI + Applications Capabilities: The platform will provide enterprise developers with access to powerful AI capabilities and tools, including:
  • Finely tunable application orchestration for remotely placing latency-sensitive workloads on exactly the right device for best application performance.
  • Powerful low-code to high-code AI model/app development with hybrid AI capabilities from the edge to cloud.
  • A range of horizontal edge services like data annotation services that leverage Intel® Geti™ to build AI models, as well as vertical industry-specific edge services to improve results in common industrial use cases using video and time series information and digital twin capabilities to track and manage environments.
About Intel’s Role in Proven Partner Ecosystem: Intel’s Edge Platform will come to market with industry leaders and broad ecosystem support that includes Amazon Web Services, Lenovo, L&T Technology Services, Red Hat, SAP, Vericast, Verizon Business and Wipro.
Customers on Intel’s Edge Platform
Lenovo:
"Intel’s new Edge Platform integrated with Lenovo Open Cloud Automation, Lenovo XClarity suite of solutions, and deployed on Lenovo’s ThinkEdge servers enable enterprises to develop, deploy, run and manage edge applications at scale with cloud-like simplicity," said Charles Ferland, vice president and general manager, ThinkEdge and Communication Service Providers. "The integrated solution delivers a seamless experience combining truly edge-native capabilities for security, near zero-touch provisioning and management, with Intel and Lenovo’s deep industry experience and unrivaled ecosystems. And, with built-in OpenVINO runtime, it enables businesses to adapt edge and hybrid AI solutions across industry verticals – from finance to healthcare to smart cities and retail."
L&T Technology Services: "LTTS is delighted to partner with Intel on the launch of their new Edge Platform, which promises to democratize access to edge-AI solutions. By running seamlessly on standard hardware and featuring built-in edge-native AI runtime powered by OpenVINO for inferencing, this platform embodies innovation and efficiency," said Abhishek Sinha, chief operating officer and board member at L&T Technology Services (LTTS). "With deep-root hardware optimization at its core, our enterprise customers can trust Intel's Edge Platform to propel them into a future of unparalleled performance and possibilities."
SAP: "By partnering with Intel on the new Edge Platform, we are able to bring the transformational capabilities of SAP Business Technology Platform® (SAP BTP) and SAP business applications together with Intel to make edge-AI computing more accessible for our customers," said Drew Leblanc, head of Strategic Alliances, SAP SE. "This effort is a testament to our ongoing commitment to deliver value for our customers, and we look forward to working with Intel in delivering new use cases for the edge."
Vericast: "Physical retail is on the verge of a major transformation by merging digital media with physical experiences. Intel's new Edge Platform is a key piece of our value chain bringing this exciting trend to market," said Hans Fischmann, vice president of Digital Product Management, Vericast. "Together, we're revolutionizing the digital advertising landscape, seamlessly integrating edge AI capabilities with immersive customer experiences that run on standard hardware. We’re able to run a zero-trust security profile in highly public environments with a scalable, modular platform that can work from a single store to the largest retail chains."
Wipro: "The partnership between Intel and Wipro centers on making edge and AI solutions more accessible for our customers," said Ashish Khare, general manager and global head for IoT, 5G, and Smart Cities at Wipro. "Intel’s new Edge Platform helps us solve the challenges of edge complexity on standard hardware and enables Wipro to deliver the most compelling use cases to drive business results."
More Context: Intel’s Edge Platform is an evolution of the solution first introduced in late September 2023 at Intel Innovation under the code name Project Strata.
 
  • Like
  • Fire
  • Love
Reactions: 52 users