I’ve always just assumed neuromorphic computing was mainly going to benefit EVs, but here’s a research paper using SNNs to improve the efficiency of ICE vehicles cheaply and quickly.
Source: The U.S. Department of Energy's Office of Scientific and Technical Information (www.osti.gov)
INTRODUCTION
Neuromorphic computers provide an intriguing platform for low power artificial intelligence (AI) at the edge [1]. However, there are several key hurdles to developing neuromorphic computing solutions to real-world applications, including finding sufficiently low power neuromorphic hardware, implementing the appropriate algorithms used to train a spiking neural network (SNN) to deploy onto the hardware platform, and ensuring that the trained SNN fits within the constraints of the pre-determined neuromorphic hardware system.
In this work, we present a complete pipeline for training and deploying a low size, weight, and power (SWaP) neuromorphic hardware solution for a real-world application. The real-world application of interest is an engine control unit (ECU) of a spark-ignition internal combustion engine. To improve engine efficiency and reduce greenhouse gas emissions, transportation researchers have developed advanced combustion strategies that minimize the amount of fuel needed to run the engine. However, as the engine efficiency is pushed to the practical limit, instabilities in the combustion process cause sporadic misfires and partial burns. Such events increase the cycle-to-cycle variability (CCV) of the combustion process, which causes undesired levels of noise, vibration, and harshness (NVH). To mitigate combustion instabilities, the ECU should command a higher amount of fuel during combustion cycles where misfires or partial burns would otherwise occur. In this case, we would like to utilize a neuromorphic hardware system that takes as input information about the engine's combustion state at each cycle and provides as output the amount of fuel to inject during the next cycle. The goal is to keep the engine running with few or no misfires and partial burns in order to minimize combustion CCV and keep the engine running smoothly, but also to minimize the amount of extra fuel needed to stabilize the combustion events.
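To make the paper's control interface concrete, here is a minimal sketch of the per-cycle loop it describes: combustion-state information comes in for the current cycle, and a fuel command goes out for the next one. Everything here (the function names, the heat-release threshold, the fuel values, and the `simulate_cycle` callable) is an illustrative placeholder, not the paper's trained SNN or its engine simulator.

```python
# Minimal sketch of the per-cycle control interface described above.
# The names, units, and the simple threshold rule are illustrative
# placeholders, NOT the paper's trained SNN or engine simulator.

def fuel_command(heat_release, nominal_fuel=1.0, extra_fuel=0.1,
                 misfire_threshold=0.5):
    """Map this cycle's measured combustion state to the fuel quantity
    to inject on the next cycle.

    In the paper this mapping is performed by a trained spiking neural
    network; a threshold rule stands in for it here.
    """
    if heat_release < misfire_threshold:
        # Combustion looks weak: command extra fuel to stabilise the next cycle.
        return nominal_fuel + extra_fuel
    return nominal_fuel


def run_closed_loop(simulate_cycle, n_cycles=1000):
    """Closed-loop operation: observe each cycle, command the next one.

    `simulate_cycle(fuel)` is any callable returning a heat-release-like
    metric for a cycle run with the given fuel quantity.
    """
    fuel = 1.0  # start at the nominal fuel quantity (arbitrary units)
    history = []
    for _ in range(n_cycles):
        q = simulate_cycle(fuel)   # combustion state for this cycle
        history.append((fuel, q))
        fuel = fuel_command(q)     # fuel to inject on the next cycle
    return history
```

In the paper, the threshold rule would be replaced by the trained SNN running on the neuromorphic or FPGA hardware, with the same cycle-in, fuel-out interface.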
FUTURE WORK AND CONCLUSIONS
In this work, we have demonstrated a complete neuromorphic workflow, from application to hardware, for training and deploying low size, weight, and power neuromorphic solutions for real-world applications. We demonstrated the results of this workflow on a control task to improve fuel efficiency in spark-ignition internal combustion engines. By utilizing low power AI hardware such as neuromorphic systems, we can potentially enable more efficient engines and reduce greenhouse gas emissions. We show that it is feasible to utilize an SNN approach as deployed on neuromorphic hardware to control an engine in simulation. We show that even this preliminary SNN approach can outperform current, open-loop control strategies. Moreover, we demonstrate that the resulting SNNs are very small and sparse and can be deployed onto an inexpensive and low size, weight, and power commercial FPGA, providing the opportunity for rapid deployment.
There are several avenues that we intend to pursue for future work. First, the engine simulators we use in this work are based on a single set of engine operating conditions. We intend to apply this same workflow to train SNNs for other engine characteristic settings.
Hi TopCat.
Have a look at this list of trending tech.
I can see the mighty chip ticking a few boxes.
Top 19 New Technology Trends for 2022 - 2023
By Archer Charles, 24-Jan-2023
https://www.koenig-solutions.com/blog/top-new-technology-trends#
We live in an era called the information age. New technology emerges every day to make life simpler, more advanced and better for everyone. The rate at which technology is evolving today is almost exponential. For business organisations, new technology helps to reduce costs, enhance customer experiences and increase profits.
Nearly 50 billion devices will have internet connectivity by 2030. With the COVID-19 pandemic accelerating businesses' digital transformation journeys, this milestone is closer than anticipated. Familiarity with the latest IT technology holds the key to advancing your career and exploring new opportunities.
Here are some of the Latest Top Technology Trends for 2022 - 2023:
- Artificial Intelligence (AI) and Machine Learning (ML)
- Robotic Process Automation (RPA)
- Edge Computing
- Quantum Computing
- Virtual Reality (VR) and Augmented Reality (AR)
- Blockchain
- Internet of Things (IoT)
- 5G
- Cybersecurity
- Full Stack Development
- Computing Power
- Datafication
- Digital Trust
- Internet of Behaviours
- Predictive analytics
- DevOps
- 3D Printing
- AI-as-a-Service
- Genomics
Now, Let's discuss the top technology trends-
1. Artificial Intelligence (AI) and Machine Learning (ML)
Artificial Intelligence, also known as AI, started gaining popularity a decade ago. It has still not slowed down and continues to be one of the leading technologies in 2022-2023. AI is constantly evolving, and new applications of this emerging technology keep appearing on the scene. Today's most popular AI applications include image and speech recognition, navigation programs, voice assistants like Siri and Alexa, and much more.
Organisations are looking to use AI to analyse customer and business interactions, derive insights and identify triggers. It will help them predict demand for services in sectors such as healthcare or tourism and improve resource allocation across projects.
Machine Learning (ML) is a subset of AI in which systems learn patterns and functions from data rather than being explicitly programmed. It has seen a massive surge in demand for skilled professionals, making it an attractive trend to watch. According to Forrester, AI and Machine Learning will be responsible for 9% of all new jobs in the US by 2025.
2. Robotic Process Automation (RPA)
Robotic Process Automation (RPA) uses software and applications to automate business processes such as data collection and analysis, customer service and other repetitive tasks previously managed manually.
Like AI and Machine Learning, RPA is a rapidly advancing technology that automates many jobs across different industries. McKinsey has found that fewer than 5% of jobs today can be entirely automated, but nearly 60% can be automated at least partially.
RPA offers several new career options and trajectories, such as programmer, project manager, business analyst or consultant. It also opens doors to high-paying jobs with a moderate learning curve in leading organisations. Choosing this emerging technology as a career move can benefit you immensely.
3. Edge Computing
Today, millions of data points are collecting user information from sources such as social media, websites, emails, and web searches. As the amount of data collected increases exponentially, technologies like cloud computing fall short in several situations.
Until a decade ago, Cloud Computing was one of the fastest-growing technologies. However, it has become fairly mainstream, with the market dominated by major players such as Amazon Web Services (AWS), Microsoft Azure and Google Cloud Platform.
As more organisations adopted Cloud Computing, they found several limitations in the technology. Edge Computing helps bypass the latency involved in moving data to a central data centre for processing: computation happens 'on the edge', closer to where the data is generated and where the results are needed. Edge Computing is therefore used to process time-sensitive data in far-off locations with limited or no connectivity.
Edge Computing adoption will keep growing as IoT devices increase. Its market is set to reach $6.72 billion by 2022.
4. Quantum Computing
Quantum Computing is a type of computing that focuses on developing computer technology based on the principles of quantum theory, which explains the behaviour of energy and matter at atomic and subatomic levels. In other words, it performs calculations based on the probability of an object's state before measurement, rather than on just 0s and 1s.
Quantum Computing can query, analyse and act on data regardless of its source. It has played a role in efforts to manage the spread of COVID-19 and to develop new vaccines. For certain problems, these computers can be dramatically faster than conventional computers. Revenue for the Quantum Computing market is projected to cross $2.5 billion by 2029.
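To give a rough flavour of the "probability of a state before measurement" idea mentioned above, here is a toy single-qubit sketch in Python. The amplitudes and the sampling loop are illustrative only; real quantum programming is done with dedicated SDKs and hardware.

```python
# Toy illustration of a single qubit in superposition.
# Purely illustrative; not how real quantum hardware or SDKs are programmed.
import random

# A qubit state is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measuring it yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
alpha, beta = 1 / 2 ** 0.5, 1 / 2 ** 0.5   # equal superposition of 0 and 1

def measure(alpha, beta):
    """Collapse the superposition to a classical 0 or 1."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

samples = [measure(alpha, beta) for _ in range(10_000)]
print("fraction of 1s:", sum(samples) / len(samples))   # roughly 0.5 for this state
```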
You need experience with quantum mechanics, linear algebra, machine learning, and information theory to enter this field.
5. Virtual Reality (VR) and Augmented Reality (AR)
VR and AR have been popular for almost a decade now.
Virtual Reality immerses the user in a new environment, while Augmented Reality enhances the user's existing environment. While their applications so far have mainly been linked to gaming and social media filters, simulation software such as Virtual Ship is also used to train U.S. Navy, Army and Coast Guard ship captains.
A whopping 14 million AR and VR devices were sold in 2019. The global market for this trending technology is predicted to reach $209.2 billion by 2022, which means more job opportunities for professionals in this field.
By 2022, AR and VR are expected to be integrated into our everyday lives much more deeply than they are today. They have huge potential and possible applications in training, entertainment, education, marketing and therapy or post-injury rehabilitation. They are also widely used by advertisers and brands to create new immersive experiences for their customers.
Starting a career in VR or AR doesn't require much specialisation. Basic programming skills and a forward-thinking mindset, coupled with a knowledge of optics, can help you secure a job in this field.
6. Blockchain
Blockchain was popularised in the context of cryptocurrency and Bitcoin and the security it provides. However, the security it offers is useful in several other ways as well. A blockchain can be described as data that you can only add to, not take away or alter: each new block of data is appended to the previous ones, forming a 'chain', hence the name.
The fact that existing data cannot be altered or removed makes Blockchain a highly secure technology. Blockchains are consensus-driven, which means no single person or organisation can take control of the data, and there is no need for a third party to oversee transactions.
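As a rough illustration of the append-only, linked-by-hashes idea described above, here is a toy hash chain in Python. It is only a sketch: there is no consensus mechanism, networking or mining, and the block structure is purely illustrative.

```python
# Toy append-only hash chain illustrating the "data you can only add to" idea.
# A minimal sketch, not a real blockchain: no consensus, networking or mining.
import hashlib
import json

def make_block(data, prev_hash):
    """Create a block whose hash covers its data and the previous block's hash."""
    body = {"data": data, "prev_hash": prev_hash}
    body["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    return body

def verify(chain):
    """Recompute every hash; any altered block breaks the chain."""
    for i, block in enumerate(chain):
        body = {"data": block["data"], "prev_hash": block["prev_hash"]}
        if block["hash"] != hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest():
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block("genesis", prev_hash="0")]
chain.append(make_block({"from": "A", "to": "B", "amount": 5}, chain[-1]["hash"]))
assert verify(chain)
chain[1]["data"] = {"from": "A", "to": "B", "amount": 500}   # tampering...
assert not verify(chain)                                      # ...is detected
```

Because each block's hash covers the previous block's hash, altering any earlier block invalidates everything after it, which is the property that makes the recorded data effectively unalterable.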
As more industries adopt and implement blockchains, the demand for skilled blockchain developers has also increased. The role requires hands-on experience with programming languages, a basic knowledge of OOP, flat and relational databases, data structures, networking and web application development.
7. Internet of Things (IoT)
It is one of the most promising technologies of the decade. Multiple devices or ‘things’ today are wifi-enabled, which means they can be connected to the internet. The Internet of Things is a network of diverse connected devices. Devices within the network can communicate with each other, collect data and transfer it across the network without human intervention.
There are hundreds of real-life Internet of Things (IoT) applications, from tracking activity using smart devices that connect to your phone, to remotely monitoring home doors or switching appliances on and off. Businesses also use IoT for many things, such as monitoring activity in remote locations from a central hub and predicting when a device will malfunction so that corrective measures can be taken before it's too late.
It is predicted that by 2030, over 50 billion devices will be connected via the Internet of Things. Global spending on this latest technology will reach an estimated $1.1 trillion in the next two years. IoT is currently in its initial stages and will advance rapidly in the near future. It requires knowledge of AI and Machine Learning fundamentals, as well as information security and data analytics.
8. 5G Technology
5G technology has the potential to change the way we see the online world. 3G and 4G technology transformed how we interacted with mobile devices, enabling faster internet browsing, using data-driven services, and increasing bandwidth for live streaming.
5G aims to revolutionise our virtual interactions by enabling richer AR and VR experiences and better cloud-based gaming. It will also be used in factories and enterprises for monitoring and streamlining operations. 5G also has applications in road safety and traffic enforcement through live high-definition cameras, as well as in smart grid control and smart retail experiences.
Telecom companies around the world are working on 5G-ready services and devices. The technology was announced and rolled out in select places in 2020, with a worldwide launch expected in 2022. The rollout of 5G has been delayed in places, but it is set to reach the whole world and become a part of everyday life.
9. Cybersecurity
Since the dawn of computers, cybersecurity has played a major role in ensuring safer user experiences. It is not a new trend, but given how rapidly technology is evolving, cybersecurity measures need to be constantly upgraded and improved. Threats and hacking attempts are growing in number and intensity, which calls for improved security protocols and systems hardened against malicious attacks.
Data is the most valuable asset today, and hackers are constantly trying to steal it. This is why cybersecurity will always be a trending technology and will need to evolve constantly to stay ahead of attackers. Demand for cybersecurity professionals is growing three times faster than for other tech jobs today. As more businesses recognise its importance, spending on cybersecurity is expected to reach about $6 trillion by 2022.
Cybersecurity job roles range from ethical hacker to security engineer to Chief Security Officer. The pay is significantly higher than in other technology roles because of its importance in ensuring a secure user experience.
10. Full Stack Development
This is one of the newest industry trends in the software domain to gain momentum, and it continues to rise as IoT becomes a more mainstream technology with diverse applications. Full stack development covers both the front-end and back-end development of applications and websites.
Organisations are working on developing more user-friendly and comprehensive apps for their target audience. For this, a full stack developer must have a deep understanding of server-side programming and web development. If you develop the skills needed to create a website, your services will always have a place in the industry. More and more businesses are moving to digital every day, increasing the demand for web developers and designers.
11. Computing Power
The digital era has computerised every mobile device and application, firmly establishing computing power in this generation. Data scientists predict that the infrastructure used to harness this computing power will only keep evolving in the coming years. Computing power gives us more advanced technology to make our lives better and is also creating more jobs in the tech industry. Fields like data science, data analytics, IT management and robotics have the potential to create the largest share of employment in the country. Many international brands and companies hire from India because of the specialised training widely available there. The more computing power our devices need, the more specialised professionals will be required, and the economy will flourish as a result.
12. Datafication
Datafication is the conversion of aspects of our lives into data handled by software and devices. In this process, data-driven technology takes over human chores and tasks. Smartphones, office applications, industrial machines and even AI devices all use data to interact with us and improve our lifestyles. Data has been a part of our lives for longer than you might imagine. Storing this data securely has increased the demand for data security specialists. IT professionals, data professionals, engineers, managers and technicians all work in this field. Careers in data depend more on practical skills than on advanced formal qualifications, and these skills can be acquired through courses that teach you how the world of data works.
13. Digital Trust
As the world becomes increasingly intertwined with technology and mobile devices, people are developing greater trust in these technologies, and that trust is paving the way for further innovation. With various data security measures in place, people believe that technology can help us build a reliable, secure and safe digital world. This also allows companies to invent and innovate without having to worry as much about data security. Cybersecurity and ethical hacking are a few specialisations that can be used to enter this field, and there is an array of jobs available nationally and internationally. Professional certifications and standard courses are available, both of which can lead to high-paying job roles.
14. Internet of Behaviours
This is one of the fastest-growing industry trends and has found applications in nearly every industry. IoB is a technology that uses data collected from user devices connected to the internet. The large volumes of data collected are then analysed and tracked to understand human behaviour.
With more devices connecting to the internet and more users adopting digital technology, this new technology is going to play a significant role in the Big Data, analytics, development and predictive analytics domains. You may have heard of IoT, a network of physical devices connected to the internet in which each device can communicate with the others. In the future, more gadgets such as automobiles and home appliances are going to become 'smart' and join the network.
15. Predictive analytics
Predictive analytics is the software domain that helps businesses predict future customer behaviours and trends. Some of its most common applications are in risk management, marketing and operations. Along with data science, it is one of the most promising areas in the data field. If you are looking to enter the field of data, you can choose data science, big data or predictive analytics; that way, you will have future-proof skills that command some of the highest salaries in the industry today.
16. DevOps
DevOps combines the development and operations departments within an organisation to help enhance and automate the process of software development. DevOps has two leading applications within an organisation:
- Shortening software delivery cycles
- Improving product standards overall
By promoting collaboration between operations and development teams, enterprises can deliver software upgrades and features to customers faster. It also means fewer errors and higher product quality.
17. 3D Printing
3D printing has become a key technology for creating prototypes, especially in the industrial and biomedical fields. 3D printers turn digital designs into real, physical objects. Needless to say, this cost-effective innovation is here to stay. Brands and companies in the healthcare, data and industrial sectors need 3D printing for their products, which has boosted demand for people specialising in machine learning, 3D printing, modelling and AI.
18. AI-as-a-Service
Several as-a-service offerings exist today, such as PaaS, IaaS and SaaS. AI-as-a-Service is a cloud-based offering that provides AI capabilities on demand. Providers offer this service across many fields and industries, giving organisations access to advanced AI features and capabilities without having to invest in costly hardware or its maintenance.
19. Genomics
Can technology study your DNA and help you fight diseases, improve your health and more? Yes! The field of genomics uses technology to study the make-up of your DNA and genes, map them, and so on. It can also quantify your genes, making it easier for doctors to spot potential health issues early. There are various technical and non-technical roles available in this field: tech jobs include analysis, design and diagnostics, while non-tech jobs include theoretical analysis and research.
The world of technology evolves rapidly and trends change quickly. As an IT professional, you need to stay on top of these trends to build a successful career. What’s more, if you are looking for one area of expertise to direct your career toward, understanding how each of these technologies works, together and by itself, will give you an edge over the other applicants applying for the same job as you.
The biggest pro and con of living in a fast-changing world is that you must keep yourself updated and learn for life to stay relevant. Upskilling and growing have become exponentially more rewarding as the demand and remuneration for such professionals are skyrocketing. Boost your career in tech by learning from the best at Koenig.
What Next?
You now know which technologies are going to make the greatest impact in the years to come. Keeping in mind the potential and promising career options they offer and your personal goals, start training in a technology you like. Get the early bird advantage and sharpen your skills so that you’re among the first to embrace these technologies.