Dell

Dell Technologies, Internal use - confidential, copyright 2021. Don't worry, your secret is safe with me, I won't show anyone…




Nice @Doz

Ties in as well with the prev Dell post I did in the Gen Article thread mid-Mar.

We get our mention about 3/4 of the way down, in red.





The future of artificial intelligence

Dell Technologies (Verified account)

Author: Dr. Jia Zhen, Director of Dell Technologies China Research Institute​

Artificial intelligence (AI) is already ubiquitous. In the era of digital transformation, as people turn to ever more frequent digital interactions, massive data from the digital virtual world merges seamlessly with the real physical world. As the volume, variety, and velocity of data generation increase, AI becomes a critical step in extracting insights from massive amounts of data and advancing other emerging technologies.
AI algorithms and hardware-accelerated systems are improving the efficiency of business decision-making, streamlining business processes, and delivering smarter, more real-time data analysis at scale. AI is fundamentally changing the way businesses operate, redefining the way people work, and transforming industries on a global scale. In the era of digital transformation, our society and enterprises need to make greater use of intelligent information system architectures, application software and algorithms, and data-first strategies to fully realize their business potential.

Here I will briefly list some key figures reflecting the booming development of artificial intelligence: 62% of global enterprises have invested in artificial intelligence to some extent [1]; 53% of global data and analytics decision makers say they are planning to implement some form of artificial intelligence [2]; and by 2022, 75% of enterprises will embed intelligent automation into technology and process development [3].
As mentioned above, artificial intelligence has made great progress in recent years, but we still have many problems that need to be solved urgently. In this article, I will first analyze the core problems that remain to be solved in the current development of artificial intelligence, and then propose some ideas for our key development directions in the field of artificial intelligence.

Problems that need to be solved by artificial intelligence

  • Algorithmic complexity of artificial intelligence: Today's mainstream artificial intelligence algorithms are based on the deep neural networks [13] of machine learning. As artificial intelligence technology develops, the structure of deep neural networks grows more and more complex, with more and more hyperparameters. Sophisticated deep neural networks improve the accuracy of machine learning models, but configuring and debugging such complex networks can be prohibitive for ordinary users of artificial intelligence. Making deep neural network algorithms and applications easy to develop, debug, and deploy is therefore an increasingly urgent need.
  • Data scarcity of artificial intelligence: The efficient inference and recognition of today's deep neural networks depends mainly on large amounts of training data. Open databases such as ImageNet [9] provide thousands of images, videos, and corresponding annotations. Trained on enough data, a machine learning model can cover almost all the variations of its inference and recognition scenarios. However, if the data is insufficient in quantity or not comprehensive enough in variety, the model's performance is bound to be limited. In industrial applications of artificial intelligence, the problem of data shortage is particularly prominent. Unlike traditional inference and recognition applications for ordinary consumers, industrial artificial intelligence applications often address unique, business-specific problems (such as intelligent manufacturing or remote system debugging and maintenance), for which the corresponding data (especially negative samples) is very scarce. How to improve artificial intelligence models so that they still work efficiently in specific scenarios with limited training data is another new and urgent task.
  • High computational consumption of artificial intelligence: As the previous two points suggest, the complexity of deep neural networks and the diversity of big data lead to high consumption of computing resources in current artificial intelligence applications. Training today's most advanced machine learning models, such as GPT-3, takes months even on high-performance clusters [10]. Ordinary machine learning models can take hours or even days to train on traditional x86 high-performance servers when the data volume is large. At the same time, because a trained model has a complex structure, many hyperparameters, and heavy computation, inference and recognition also place high demands on the computing resources of the terminal devices that process the data. For example, lightweight IoT devices cannot run complex machine learning inference and recognition models at all, while on smart terminal devices such as smartphones, running complex machine learning models drains the battery. How to better optimize computing resources to support machine learning training and inference is yet another urgent task.
  • Interpretability of artificial intelligence: Because of the complexity of deep neural networks, artificial intelligence systems built on them are often treated as a "black box". The user inputs the data to be recognized, and the deep neural network produces an inference and recognition result through a series of "complex and unknown" mathematical operations. However, we cannot intuitively analyze why a given input yields the corresponding result from the complex neural network. In some critical AI areas, such as autonomous driving, the interpretability of AI decisions is essential. Why does an automated driving system make a particular driving decision in a safety-critical scenario? Why is its inference and recognition of road conditions sometimes wrong? These conclusions from the "black box" must be interpretable and traceable. Only when artificial intelligence can be explained can we find the basis for its decisions and judgments and identify the cause of inference and recognition errors. Working "from effect to cause", we can then improve the performance of deep neural networks so that they support artificial intelligence applications more efficiently, safely, and reliably in different settings.
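One common entry point for opening up the "black box" is feature attribution: ranking input features by the gradient of the model's output with respect to each input. The sketch below is a minimal illustration of that idea in plain NumPy, not a production explainability tool; the tiny two-layer network and its randomly initialized weights stand in for a real model.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny two-layer network standing in for a "black box" (illustrative weights).
W1 = rng.normal(size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)       # hidden activations
    return (h @ W2 + b2).item()    # scalar score

def saliency(x):
    """Gradient of the score w.r.t. each input feature, computed by hand
    for this small network; a large |gradient| marks an influential feature."""
    h = np.tanh(x @ W1 + b1)
    dh = (1 - h**2) * W2[:, 0]     # backprop through output layer and tanh
    return W1 @ dh                 # d(score)/d(x), shape (4,)

x = np.array([0.5, -1.0, 0.2, 0.8])
g = saliency(x)
ranking = np.argsort(-np.abs(g))   # input features ordered by influence
```

For a real deep network the same gradient comes from the framework's automatic differentiation; more elaborate attribution methods (e.g. integrated gradients) build on this quantity.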
Of course, beyond the four major problems above, AI has other limitations, such as privacy, generality, the scarcity of AI development talent, and the lack of legal constraints on AI; I will not elaborate on them here. In this article, I will focus on the four main issues listed above and explore the way forward.

The future of artificial intelligence

In view of the four major problems that artificial intelligence needs to solve urgently listed above, I will briefly describe the main technical directions that we need to pay attention to for future development:
  • First, we need to be facilitators of the "3rd Wave AI", preparing companies and society for the coming AI revolution. These changes will drive our data management, artificial intelligence algorithms, and hardware accelerators to flourish. We need to actively develop new models of collaboration with clients and research entities driving the "third wave of AI". So, what is the "third wave of artificial intelligence"?
    • From an algorithmic point of view, we summarize it as the concept of Contextual Adaptation. Specifically, we need to pay more attention to the following algorithm development trends:
      • We need to establish reliable decision-making capabilities in artificial intelligence systems, so that people can understand or analyze why the "black box" machine learning model makes a given inference and recognition decision. Specifically, safe and reliable artificial intelligence must solve three problems: the boundary problem, the backtracking (traceability) problem, and the verifiability problem. We call this capability "AI explainability" [5].
      • How to build AI systems that can train machine learning models from one example (One-Shot Learning [6]) or very few examples (Few-Shot Learning [7]). As mentioned above, data is relatively scarce in real industrial application scenarios. Effectively constructing and training machine learning models with extremely limited data is currently a hot research direction.
      • In contrast to traditional, open-loop offline learning (Offline Learning), online learning (Online Learning) [20] is an emerging, closed-loop approach: the machine learning model sends inference and recognition results to the user based on its current parameters and architecture, collects user feedback, and uses that feedback to update and optimize itself, completing a loop that continuously receives information and iterates. In other words, machine learning models need to dynamically accept sequential data and update themselves to optimize performance.
      • Multi-Task Learning [21] refers to a learning method in which the training data contains samples from multiple different scenes, and the scene information is used during learning to improve the performance of the machine learning tasks. The scene adaptation methods of traditional transfer learning usually only realize knowledge transfer between a single original scene and a target scene, while multi-scene task learning encourages bidirectional knowledge transfer among multiple scenes.
      • Machine learning models are trained on contextual information. As time passes and scenes shift, the artificial intelligence system will gradually learn to construct updated models autonomously [11]. Machine learning models derived from contextual learning (Contextual Learning [15]) will perceive the world better and help humans make inference decisions more intelligently.
      • With the rapid development of artificial intelligence technology, knowledge representation and knowledge reasoning based on deep neural networks have received growing attention, and scene knowledge graphs for different scenarios have appeared one after another [22]. As a semantic network, a scene knowledge graph depicts scene knowledge and provides the basis for inference and recognition tasks within the scene. As an application of knowledge reasoning, question answering systems based on knowledge graphs have made great progress.
      • Machine learning models derived from contextual learning can also help us better abstract our data and the world we need to perceive [16], thereby making our artificial intelligence systems more general and adaptable, able to solve all kinds of complex problems.
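The closed loop described in the online-learning bullet above can be sketched in a few lines: predict with the current parameters, receive the true value as feedback, update immediately, repeat. The example below is an illustrative least-mean-squares learner on an invented linear target, not any specific production system.

```python
import numpy as np

def online_sgd(stream, lr=0.1):
    """Closed-loop online learning sketch: after each prediction the true
    value arrives as feedback and the weights are updated immediately."""
    w = np.zeros(2)
    errors = []
    for x, y in stream:          # sequential data, one example at a time
        pred = w @ x             # predict with current parameters
        err = pred - y           # feedback: observed error
        w -= lr * err * x        # gradient step on squared loss
        errors.append(abs(err))
    return w, errors

# Stream generated from a fixed linear rule y = 2*x0 - x1 (invented for the demo)
rng = np.random.default_rng(1)
xs = rng.normal(size=(200, 2))
stream = [(x, 2 * x[0] - x[1]) for x in xs]
w, errors = online_sgd(stream)
# w moves toward [2, -1] as feedback accumulates
```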
In conclusion, the advanced algorithms of the "third wave of artificial intelligence" can not only extract valuable information (Learn) from data in the environment (Perceive), but also create new meaning (Abstract), assist human planning and decision-making (Reasoning), and at the same time meet human needs (Integration) and concerns (Ethics, Security).
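As a toy illustration of the one-shot/few-shot idea above, the snippet below implements a prototypical-network-style classifier [7]: each class is represented by the mean of its few support examples, and queries are assigned to the nearest prototype. The hand-made 2-D "embeddings" stand in for the output of a learned encoder, which is where the real work of few-shot methods happens.

```python
import numpy as np

def prototypes(support_x, support_y):
    """Mean embedding per class from a handful of labelled examples."""
    classes = np.unique(support_y)
    return classes, np.stack([support_x[support_y == c].mean(axis=0) for c in classes])

def classify(query_x, classes, protos):
    """Assign each query to the nearest class prototype (Euclidean distance)."""
    d = np.linalg.norm(query_x[:, None, :] - protos[None, :, :], axis=-1)
    return classes[d.argmin(axis=1)]

# 2-way, 3-shot example with hand-made 2-D "embeddings"
support_x = np.array([[0.0, 0.1], [0.2, 0.0], [0.1, 0.2],   # class 0
                      [1.0, 1.1], [0.9, 1.0], [1.1, 0.9]])  # class 1
support_y = np.array([0, 0, 0, 1, 1, 1])
classes, protos = prototypes(support_x, support_y)
preds = classify(np.array([[0.1, 0.1], [1.0, 1.0]]), classes, protos)
# preds → [0, 1]
```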
  • From a hardware perspective, accelerators with Domain Specific Architectures (DSA) [12] enable third-wave AI algorithms to run anywhere in a hybrid ecosystem consisting of Edge, Core, and Cloud. Examples of such domain-specific accelerators include Nvidia's GPU, Xilinx's FPGA, Google's TPU, and artificial intelligence acceleration chips such as BrainChip's Akida Neural Processor, GraphCore's Intelligence Processing Unit (IPU), Cambricon's Machine Learning Unit (MLU), and more. By requiring less training data and operating at lower power when needed, these domain-specific accelerators will be integrated into more information devices, architectures, and ecosystems. In response to this trend, the area we need to focus on is a unified heterogeneous architecture approach that lets information systems easily integrate and configure many different types of domain-specific hardware accelerators. For Dell Technologies, we can leverage Dell's vast global supply chain and sales network to attract domain-specific accelerator suppliers to adhere to standard interfaces defined by Dell, achieving a unified heterogeneous architecture.
To sum up, the hardware of the "third wave of artificial intelligence" should not only be more powerful (Powerful), but also smarter (Strategic) and more efficient (Efficient).

In addition to the algorithm and hardware developments driving the "third wave of artificial intelligence" described above, another direction that deserves more attention is artificial intelligence automation (AutoML) [12]. As mentioned above, artificial intelligence development is becoming more and more complex, and the professional skills threshold for ordinary users keeps rising. We urgently need to provide a complete information system architecture solution that "makes artificial intelligence simple".
  • We need to better operate and manage AI workloads, driving the simplification and optimization of information system architectures. Within the entire software stack of AI applications, we need to define "Easy Buttons" for future AI workloads. Specifically, we have the following technical directions to focus on:
    • Develop simpler, easier-to-use common APIs (Application Programming Interfaces) for advanced artificial intelligence algorithm frameworks, so that the information system architecture can integrate and use more advanced and complex algorithms.
    • For artificial intelligence algorithms, we need to provide adaptive (Adaptive) selection and tuning (Tuning) strategies for machine learning model parameters: according to the user's needs, automatically select the most suitable algorithm and optimize its parameters to achieve the best performance.
    • For the artificial intelligence data processing pipeline (Pipeline), we need to establish functions for process tracking, analysis, and reuse, such as the MLOps (Machine Learning Operations) described in [14]. MLOps is the practice of creating new machine learning (ML) and deep learning (DL) models and deploying them into production through repeatable, automated workflows. When a new artificial intelligence application problem arises, we can learn from existing data processing pipelines and, after a little analysis and modification, reuse mature artificial intelligence software and hardware solutions to meet the new need, thereby reducing the resources wasted on repeated development.
    • Once our artificial intelligence system is deployed, the algorithm model still needs the ability to evolve: to self-update, self-learn, and self-tune. As inference and recognition scenarios and tasks change and algorithm accuracy decays, we use edge and cloud information system architectures to fully mobilize different computing resources to update, optimize, and redeploy our models. When updating and deploying artificial intelligence models, we also use the latest techniques such as model compression [17], data distillation [19], and knowledge distillation [18] to make full use of limited computing resources.
    • We need to consider integrating the above AI automation services in multi-cloud and hybrid cloud environments, together with Data Management and Orchestration, to create a complete, intelligent AI service platform.
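At its very simplest, the adaptive selection-and-tuning strategy mentioned in the bullets above can be approximated by random search over a configuration space. The sketch below is only an illustration: the search space and toy evaluation function are invented for the demo, and real AutoML systems use far more sophisticated strategies (Bayesian optimization, early stopping, and so on).

```python
import random

def random_search(train_eval, space, trials=20, seed=0):
    """Pick the best hyperparameter configuration by random sampling --
    a minimal stand-in for adaptive selection-and-tuning strategies."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(trials):
        cfg = {name: rng.choice(values) for name, values in space.items()}
        score = train_eval(cfg)   # train a model with cfg, return its accuracy
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

# Toy objective: pretend accuracy peaks at lr=0.01, depth=3 (invented for the demo)
space = {"lr": [0.1, 0.01, 0.001], "depth": [1, 2, 3, 4]}
def fake_eval(cfg):
    return 1.0 - abs(cfg["lr"] - 0.01) - 0.05 * abs(cfg["depth"] - 3)

best_cfg, best_score = random_search(fake_eval, space)
```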
In conclusion, the automation of artificial intelligence should not only be easier to use (Easy to Use), but also more flexible (Adapt) and more capable of self-learning and growth (Evolve).
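Of the model-update techniques listed above, knowledge distillation [18] is the easiest to sketch: a small student model is trained to match the temperature-softened output distribution of a large teacher. The snippet below shows only the distillation loss itself; the temperature value is illustrative, and this is a sketch of the general technique rather than any specific Dell implementation.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T produces a softer distribution."""
    z = logits / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between softened teacher and student outputs,
    scaled by T^2 as in the usual knowledge-distillation formulation."""
    p = softmax(teacher_logits, T)   # teacher's soft targets
    q = softmax(student_logits, T)   # student's soft predictions
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T * T)

teacher = np.array([[2.0, 0.5, -1.0]])   # logits from a large trained model
student = np.array([[1.5, 0.7, -0.8]])   # logits from a small model in training
loss = distillation_loss(student, teacher)
```

In practice this term is combined with the ordinary cross-entropy loss on the true labels and minimized with respect to the student's parameters.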

Technological innovation at Dell Technologies never stops. Our mission is to advance human progress, promote technological innovation, and become the most important technology company in the data age. Our AI solutions will help our clients free themselves from today's complex processes of large-scale data processing, analysis, and insights (Insights). The research division of our Office of the CTO is also actively exploring the AI development directions outlined above. We are committed to helping our clients make better use of state-of-the-art information system architectures, understand their data efficiently and in a timely manner, and bring greater value to their commercial business innovations.
Acknowledgments: I would like to thank the artificial intelligence research team of Dell Technologies China Research Institute (Li Sanping, Ni Jiacheng, Chen Qiang, Wang Zijia, Yang Wenbin, etc.) for their excellent research in the field of artificial intelligence. Their work results strongly support the content of this article.
 
Reactions: 40 users
Deleted member 118 (Guest):
Always good hearing from the CEO (not able to watch)

 
Reactions: 5 users
Dell drops a hint at 4:20 when asked what's next for Dell. Answer: Telco, edge.
Sean recently stated they are currently negotiating with a Telco! I think this is where Dell is utilising Akida😋
 
Reactions: 16 users
Deleted member 118 (Guest):

Edge: Telecom’s New Battleground​

Communication service providers find themselves in a unique position to capitalize on this growth opportunity, and we are primed and ready to support and accelerate that process.
By Bernd Kaponig | August 19, 2021
The network edge is poised to transform the way communication service providers (CSPs) generate revenue—and it couldn’t have arrived at a better time. As income from connectivity services continues to decline and investment in 5G soars, edge computing represents a new—and hotly contended—growth opportunity for network operators.
The telecom edge with Multi-Access Edge Computing (MEC) connects distributed cloud technologies, digital platforms, and new business models to form a multi-tenant, distributed edge cloud ecosystem. MEC enables low-latency and high-bandwidth use cases that aren't possible with centralized cloud architectures. These use cases range from autonomous transportation, AR/VR, and gaming to real-time sensor and image processing.
Such growth potential is being propelled forward by the rapid expansion of IoT, the unparalleled promise of 5G, and more. Edge computing is not only complementary to these technologies, in many cases it’s essential to realizing their true value.

5G and edge work hand-in-hand​

5G stands to completely transform the mobile communications network, delivering new levels of coverage, reliability, unprecedented low latency and high-speed data transfer. As the rollout of 5G accelerates, network disaggregation is uprooting traditional ‘black box’ models, enabling more agility and significantly lower total cost of ownership for operators. CSPs now find themselves in a unique position to build new partnerships and solutions using hardware and software previously unavailable to them.
While 5G’s most transformative use cases are yet to be discovered, it is certain that they will be born at the edge. By bringing compute power into the network and enabling applications to be hosted closer to end-users, data can be analyzed closer to its source without needing to travel all the way back to a centralized cloud or data center. In essence, edge computing delivers the ultra-low latency required to unleash the benefits of 5G at scale.
Enterprise decision-makers are already waking up to the possibilities of some of 5G’s many applications. Take 5G private mobility for example, which will deliver critical wireless communication, connectivity and security to inner city offices, offshore oil rigs and everything in between.

Edge computing is key to sustaining IoT’s growing momentum​

IDC estimates that 41.6 billion IoT connected devices will be in use by 2025. As these devices proliferate, there is an increasing need for compute, storage and networking infrastructure to exist closer to where applications are consumed. The telecom edge can service that need, reducing dependency on centralized clouds while facilitating the collation and analysis of data locally.
CSPs that leverage edge computing can facilitate IoT deployments with reduced latency, improved throughput, enhanced security, as well as context and location awareness.

The race to conquer the edge has already begun​

Operators are already hard at work building the network infrastructure required to harness this new ecosystem of lucrative technologies, but monetizing that investment is not a simple process. Overcoming the technical, organizational and commercial challenges of venturing into a new market can be difficult, and with both telecom (and non-telecom) competitors moving fast, those that lag behind risk losing their share of the prize.
Operators should now consider partnerships and co-creation to quickly develop the solutions enterprises are looking for and tap into new revenue streams that will come to define the next chapter of telecommunications.

Enter Dell Technologies Solutions Co-Creation Services​

Combining technology, expertise and sales reach, Dell Technologies Solutions Co-Creation Services (SCS) function as a joint venture engine for the conception, realization and commercialization of co-created solutions. We specialize in supporting CSPs where telecom and cloud converge on edge, 5G and more.
Whether you want to kick-start conversations around creating a new solution, have a desire to co-create with Dell Technologies but are unsure of exactly what to build, or you have already developed an idea and simply need support in making it a reality—we are at your service.
Together, we can conceptualize, build and test joint solutions in state-of-the-art lab facilities. Through deep collaboration, combined expertise, and a proven methodology, we bring together everything necessary to deliver a successful go-to-market launch. And that's just the start. Solutions built with SCS can tap into Dell's 43,000-strong sales force that reaches almost every enterprise on the planet.

How it works​

Ideation—We start by identifying new opportunities through collaborative, interactive and structured ideation workshops. We then jointly develop ideas that show clear market potential, as well as business models to realize that potential.
Business Case—Once a specific solution has been chosen, we work out the business case to determine the commercial viability of the solution for both parties before committing to the build process.
Planning—In order to ensure a smooth execution of the solution creation project, we take time to intelligently map the development process together, determining if any new equipment needs to be acquired, what laboratory resources are needed etc. as well as what sales activities we will deliver during the campaign.
Execution—Our combined experts build, test and review the solution, and create all the supporting sales and marketing collateral. When everyone is satisfied, we launch.
Campaign—We provide post-launch sales enablement and demand generation, leveraging the sales collateral and the strength of Dell’s sales force.

Start building the solutions enterprises are looking for today​

Edge computing’s market worth is expected to reach $61.14 billion by 2028, exhibiting a compound annual growth rate of 38.4%. Communication service providers find themselves in a unique position to capitalize on this growth opportunity, and the Dell Technologies SCS team is primed and ready to support and accelerate that process.
 
Reactions: 8 users
Deleted member 118 (Guest):
Here is the latest; the above was from last year.



Edge Compute and Private 5G: Ready for Digital Transformation​

Providing a cloud native edge compute platform capable of services across many enterprise industries.
By Hanna Tong | March 22, 2022
2022 will be a pivotal year for enterprise digital transformation as modular cloud native edge compute and private 5G network equipment, devices and spectrum become more available. Digital transformation is not an option for enterprises. It is table stakes for enterprises to stay ahead of their competition, drive continuous cost reductions, while creating new offerings to generate new revenue streams. Dell Technologies Services Edge 1.2 release offers edge computing and private 5G, ready now for enterprise digital transformation.
According to IDC, manufacturing and retail businesses are among the top in edge spending forecasts. Retailers understand consumers demand an omni-channel experience. Manufacturing and resources companies are constantly looking for new ways to automate and to reduce the total cost of ownership (TCO). Based on a survey conducted by Analysys Mason and Accedian, 76% of manufacturers plan to adopt private 5G by 2024. Automating defect detection using sensors or robotics, predictive maintenance to reduce downtime, robotics such as automated guided vehicles (AGVs) to reduce labor costs, and computer vision to monitor safety compliance are just some examples of use cases where fast computation with milliseconds of latency is critical. Private 5G networks provide low-latency connectivity for sensors and devices, generating volumes of data to be processed at the on-premises edge, thereby providing data sovereignty, privacy, control and security to enterprises.
Private 5G provides a stable, reliable network for mission-critical industrial applications such as powering digital mines, closing connectivity gaps in remote mining areas, keeping employees safe and increasing productivity. As a key infrastructure and technology partner for the Rogers Business 5G wireless private network (WPN) solution launched at Kirkland Lake Gold's Detour Lake Mine, the second-largest gold-producing mine in Canada, Dell Technologies provides scalable, modular, virtualized edge infrastructure enabling an end-to-end reliable, redundant, low-latency network.
Communications service providers (CSPs) play an important role in the success of enterprise digital transformation. TBR reported CSPs will represent over 77% of the total edge infrastructure market, which is projected to total $100B in 2025. An Analysys Mason and Accedian survey reported 68% of manufacturers are interested in working with managed service providers with the expertise to deploy private 5G network. CSPs can bundle cloud and edge infrastructure, SD-WAN and network security and transport in an as-a-service offering to simplify edge deployment and help enterprises scale their investment according to the growth of the business.
How does an enterprise jump-start the edge transformation?
  • Look for cloud native modular infrastructure that supports multiple catalogs of workloads. In today’s rapidly changing competitive landscape, a flexible platform for innovation is required that is simple to deploy with confidence, open, composable and disaggregated so IT can mix and match for optimum TCO and investment protection.
  • Adopt virtualization and flexible hardware options versus purpose-built kit.
  • Use a hybrid cloud approach by moving applications requiring low latency to the user or device generating data.
  • Develop an open ecosystem with security built into every layer of the stack.
At Dell Technologies, our Services Edge offering is designed for CSPs and enterprises to succeed in digital edge transformation. CSPs are in the best position to lead, combining their vast array of services, from SD-WAN, MPLS and mobile networking, with their deep services expertise.
Dell Technologies Services Edge 1.2 powered by Intel® Smart Edge and Red Hat OpenShift provides a cloud native edge compute platform able to deliver platform services across many enterprise industries. An integrated orchestration and management console providing a single pane of glass can be deployed in a remote central location supporting multiple sites and tenancy. Dell Services Edge 1.2 security capabilities use a zero-trust model for all connections, users and applications, even its own internal microservices, thereby reducing attack opportunities and supporting your efforts to assemble a zero-trust network access (ZTNA) architecture.
Highlights of this release:
  • PowerEdge R750 and Rugged XR11/12 servers powered by 3rd generation Intel Xeon processors.
  • End-to-end solution validation of the Dell edge compute platform with Airspan Networks' OpenRange AirVelocity 5G radios and virtualized open RAN software, plus pre-integrated validation of manufacturing use cases with the Litmus Industrial IoT edge platform.
  • Centralized management and orchestration of compute, connectivity and IoT edge platform.
  • VMware SD-WAN access support on the edge node.
  • Intrinsic security at every layer and IPSec connectivity.
Airspan Networks is a pioneer in end-to-end open RAN architecture solutions with innovative 5G indoor/outdoor technology, private networks for enterprise customers and industrial use applications, fixed wireless access and CBRS solutions. Airspan OpenRANGE 5G RAN software vDU and vCU are validated on the Dell Technologies Services Edge platform and Dell Technologies PowerEdge servers, which are designed to accelerate application performance with scalable business architecture and security integrated into the full server lifecycle.
Litmus Edge is an all-in-one Industrial IoT (IIoT) platform designed to unify all OT data, put that data intelligence to work to power applications and scale seamlessly with centralized management. The flexible Litmus Edge architecture lays the foundation for complete, secure, enterprise-scale data flow to enable any IIoT use case and to improve operations at scale.
Dell Technologies believes the Services Edge modular approach is the foundational infrastructure solution for mission-critical manufacturing use cases, as well as for other industries that want to transform their edge for lower TCO and to implement low-latency applications for monetization.
 
Reactions: 9 users
Deleted member 118 (Guest):
So is Dell the telecommunications company?
 
Reactions: 7 users


IloveLamp (Top 20):
Could possibly hear more from DELL later this year / early next year…


Dell Technologies is working with industry standards groups and partners to further evolve computational storage technologies and deliver integrated solutions for customers. Architectures for federated data processing will further evolve in 2022 and pave the way for the next evolution in data-centric computing.

We will see the "data storage" systems of today evolve into the "data-aware" systems of the future. These systems will be able to auto-discover, classify and transform data based on policies, and enable organizations to move from digital-first to data-first. Application-specific data processing will federate closer to the data and optimize the overall economics of data center and edge-to-cloud architectures. Stay tuned for more on this later in 2022 and 2023.
 
Reactions: 14 users