BRN Discussion Ongoing

JK200SX

Regular
Afternoon VictorG,

Certainly an interesting response from ChatGPT.

Below is an extract from BrainChip's last Annual Report (2021).
No mention of AIZip under subsidiary companies, so yes... a strange answer.

Regards,
Esq.


BaconLover

Founding Member
Not looking to get caught in the crossfire here, but just wanted to point out that Tony is talking about his OPINION on what the market thinks. That neither confirms it is what is required, nor does it implicate him or BrainChip regarding revenue figures.
He is making the same observation that many here have made, but that doesn't mean it's required. I personally don't require it; we are a growth company for now with disruptive tech that I, as an early investor, get to share the spoils of for the risk I am taking. Many here also still hold the same outlook.
I believe you should edit your post and remove or modify your terminology surrounding what he acknowledges, as the way you have phrased it is putting words in his mouth, which I'm sure you aren't meaning to do.

Well, that is his opinion Damo, and I do agree with him.
He understands that is what we need to convince markets about BRN's potential. What I am saying is, he acknowledges this is what the market wants. Happy to edit it if it reads otherwise.

I am glad that the company sees it from a shareholder perspective too, because being a public company, they do have a duty of care to look after the shareholders.
 

M_C

Founding Member

Future computer systems, said Hinton, will take a different approach: they will be "neuromorphic," and they will be "mortal," meaning that every computer will be a close bond of the software that represents neural nets with hardware that is messy, in the sense of having analog rather than digital elements, which can incorporate elements of uncertainty and can develop over time.

"It'll be used for putting something like GPT-3 in your toaster for one dollar, so running on a few watts, you can have a conversation with your toaster."
 

Damo4

Regular
Well, that is his opinion Damo, and I do agree with him.
He understands that is what we need to convince markets about BRN's potential. What I am saying is, he acknowledges this is what the market wants. Happy to edit it if it reads otherwise.

I am glad that the company sees it from a shareholder perspective too, because being a public company, they do have a duty of care to look after the shareholders.
"even Tony Dawe acknowledged that Brainchip company needs sales and revenue"

This is misleading and you know it, I've realized.
 

BaconLover

Founding Member
"even Tony Dawe acknowledged that Brainchip company needs sales and revenue"

This is misleading and you know it, I've realized.
Edited it to make more sense.

But Tony Dawe has acknowledged that BrainChip needs sales and revenue.

How is this misleading? That is exactly what he has said in the email.
 

FJ-215

Regular

Tothemoon24

Top 20
So many dots joining up; going with GlobalFoundries is certainly looking like a masterstroke.

Supply Chain Dive

DIVE BRIEF

GM signs exclusive chip supplier agreement with GlobalFoundries

Published Feb. 15, 2023
Kate Magill, Editor

A Chevrolet Volt extended-range electric vehicle. GM and GlobalFoundries signed an exclusive semiconductor supplier deal as the automaker looks to expand its EV production capacity. Bill Pugliano via Getty Images

Dive Brief:​

  • GM signed an exclusive supplier agreement with semiconductor maker GlobalFoundries, another arm of the automaker’s growing electric vehicle supply chain network.
  • Through the agreement, the chip manufacturer aims to reduce the number of unique chips needed to power GM’s EVs by producing higher quality supply in greater quantity and with more predictability, according to the release.
  • GlobalFoundries will produce the chips at its manufacturing facility in Malta, New York.


Dive Insight:​

GM has spent the past year fortifying its EV supplier network, looking to secure ample supply of electric vehicle and battery components as it races to produce 1 million EVs annually by 2025.
“We see our semiconductor requirements more than doubling over the next several years as vehicles become technology platforms,” Doug Parks, GM executive vice president of global product development, purchasing and supply chain, said in a statement. “The supply agreement with GlobalFoundries will help establish a strong, resilient supply of critical technology in the U.S. that will help GM meet this demand.”
GM signed three supplier deals last summer to secure components needed for battery production, including lithium, nickel, cobalt and cathode active material. And in January, the automaker announced plans to invest $650 million in a Nevada lithium mine.
GlobalFoundries, meanwhile, has been investing in its manufacturing capacity. The company announced the extension of its partnership with QualComm Technologies in August, including investing $4.2 billion in chip manufacturing and expanding capacity at GlobalFoundries’ Malta facility.
For New York, the GM-GlobalFoundries news builds on its efforts to become a hub of semiconductor manufacturing activity. The state passed the Green CHIPS legislation in August, aimed at attracting chip investment and job creation in the state.
“We’re making New York State not only the semiconductor capital of the country — but of the globe,” New York Gov. Kathy Hochul said in a statement.

 

Tothemoon24

Top 20
Did someone say KFC
 
Doing a bit of digging on AIZIP, I came up with this

View attachment 30211

The above sounds familiar, and considering they will be demonstrating AKIDA, there's perhaps a good chance that this "Volume shipped" product contains AKIDA.

(But I haven't been able to find who the shareholders are?)
Just further to AIZip.

Does look like the MAX78000.

Maxim, Aizip Partnering to Develop IoT Person Detection Device - News

April 22, 2021
By Perry Cohen | Maxim Integrated, Aizip
Maxim Integrated and Aizip announced that the MAX78000 neural-network microcontroller detects people in an image using Aizip’s Visual Wake Words (VWW) model at just 0.7 millijoules (mJ) of energy per inference.
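For a rough sense of what 0.7 mJ per inference means in practice, here's a back-of-the-envelope sketch. The battery figures are my own assumptions, not from the article: an ideal CR2032 coin cell (~225 mAh at a nominal 3 V) with all energy going to inference, ignoring the rest of the board.

```python
# Back-of-the-envelope energy budget for the MAX78000 figure quoted above.
# Assumptions (mine, not from the article): an ideal CR2032 coin cell,
# ~225 mAh at a nominal 3 V, with all energy going to inference.

ENERGY_PER_INFERENCE_J = 0.7e-3      # 0.7 mJ per inference, from the article
battery_j = 0.225 * 3600 * 3.0       # 225 mAh * 3 V ~= 2430 J

total_inferences = battery_j / ENERGY_PER_INFERENCE_J
days_at_1hz = total_inferences / (60 * 60 * 24)

print(f"{total_inferences:.2e} inferences")        # ~3.47e+06
print(f"{days_at_1hz:.0f} days at 1 inference/s")  # ~40 days
```

Even allowing generous losses elsewhere in a real system, that kind of budget is what makes always-on person detection on a small battery plausible.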
 

Boab

I wish I could paint like Vincent
I saw a personalised number plate today and wished I'd thought of that

BRN2SHINE
 

Taproot

Regular
Doing a bit of digging on AIZIP, I came up with this

View attachment 30211

The above sounds familiar, and considering they will be demonstrating AKIDA, there's perhaps a good chance that this "Volume shipped" product contains AKIDA.

(But I haven't been able to find who the shareholders are?)
Sorry if I missed the connection along the way, but can someone fill me in on when/how Aizip are doing an Akida demo?
 

Slade

Top 20
I am now officially not going to criticize anyone, and I am asking for a truce: that the focus of this thread once again becomes dedicated to great research, and that those wanting to voice their fears or complaints about management, lack of sales, etc. use the other thread that was created for that purpose; it seemed to be working well until things got derailed. In fairness, I think it is reasonable to allow the next 24 hours as a time when anyone who wants to can throw their insults and accusations at me, and I guess keep fighting each other if you want to. I will accept whatever anyone dishes out, and I promise I won’t respond. Or, alternatively, screw it: let’s keep the status quo of this thread and accept the infighting. It would, however, be a shame if we lose any members, which has happened before.
 
https://leat.univ-cotedazur.fr/2022...d-object-detection-with-bio-inspired-retinas/

Development of a prototype HW platform for embedded object detection with bio-inspired retinas​

Context
The LEAT lab is the leader of the national ANR project DeepSee, in collaboration with Renault, Prophesee and two other labs in neuroscience (CERCO) and computer science (I3S). This project aims at exploring a bio-inspired approach to developing energy-efficient solutions for image processing in automotive applications (ADAS), as explored by [3]. The main mechanisms used to follow this approach are event-based cameras (EBCs, considered as artificial retinas) and spiking neural networks (SNNs).
The first is a type of sensor that detects changes in luminosity at very high temporal resolution and low power consumption; the second is a type of artificial neural network that mimics the way information is encoded in the brain. The LEAT has developed the first SNN model able to perform object detection on event-based data [1] and the related hardware accelerator on FPGA [2]. The goal of this internship project is to deploy this spike-based AI solution onto an embedded smart camera provided by the Prophesee company [4]. The camera is composed of an event-based sensor and an FPGA. The work will mainly consist of deploying the existing software code (in C) on the embedded CPU, integrating the HW accelerator (VHDL) onto the FPGA, and connecting them through an AXI-Stream bus. The last part of the project will consist of experiments with the resulting smart camera to evaluate real-time performance and energy consumption before validation on a driving vehicle.
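For anyone unfamiliar with how those two pieces fit together, here's a toy sketch of the event-driven idea: a single leaky integrate-and-fire (LIF) neuron consuming (timestamp, weight) events, the kind of sparse stream an event-based camera produces. This is a minimal hypothetical model of my own, not the LEAT model or any Prophesee/Akida API.

```python
# Toy leaky integrate-and-fire neuron driven by sparse events.
# Hypothetical minimal sketch; parameter values are arbitrary.

def lif_run(events, leak=0.9, threshold=1.0):
    """events: list of (timestep, weight) input spikes; returns spike times."""
    v, spikes, t_now = 0.0, [], 0
    for t, w in sorted(events):
        v *= leak ** (t - t_now)   # membrane potential decays between events
        t_now = t
        v += w                     # integrate the incoming event
        if v >= threshold:         # fire and reset
            spikes.append(t)
            v = 0.0
    return spikes

print(lif_run([(0, 0.6), (1, 0.6), (10, 0.6), (11, 0.6)]))  # [1, 11]
```

The point is that computation only happens when an event arrives; between events there is nothing to do, which is where the power savings of SNNs on event-based data come from.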
I know Renault struck up a partnership with Chronocam, which went on to become Prophesee.

Just saw this paper from mid last year with the LEAT / Renault team.

They speak of Loihi, but from what I found earlier they are obviously now looking at testing Akida's CNN2SNN capabilities, and I'm tying that in with us going to GF's 22nm FD-SOI, which, from that NASA solicitation I found, is the process often used in automotive.

I'm starting to wonder if the tape out could be part of supplying some gear to LEAT as well for this upcoming project :unsure:

 

Attachments

  • 2205.04339.pdf

Future computer systems, said Hinton, will take a different approach: they will be "neuromorphic," and they will be "mortal," meaning that every computer will be a close bond of the software that represents neural nets with hardware that is messy, in the sense of having analog rather than digital elements, which can incorporate elements of uncertainty and can develop over time.

"It'll be used for putting something like GPT-3 in your toaster for one dollar, so running on a few watts, you can have a conversation with your toaster."

 

TopCat

Regular
I’ve always just assumed neuromorphic computing was mainly going to benefit EVs, but here’s a research paper using SNNs to improve the efficiency of ICE vehicles cheaply and quickly.


INTRODUCTION
Neuromorphic computers provide an intriguing platform for low power artificial intelligence (AI) at the edge [1]. However, there are several key hurdles to developing neuromorphic computing solutions to real-world applications, including finding sufficiently low power neuromorphic hardware, implementing the appropriate algorithms used to train a spiking neural network (SNN) to deploy onto the hardware platform, and ensuring that the trained SNN fits within the constraints of the pre-determined neuromorphic hardware system.
In this work, we present a complete pipeline for training and deploying a low size, weight, and power (SWaP) neuromorphic hardware solution for a real-world application. The real-world application of interest is an engine control unit (ECU) of a spark-ignition internal combustion engine. To improve engine efficiency and reduce greenhouse gas emissions, transportation researchers have developed advanced combustion strategies that minimize the amount of fuel needed to run the engine. However, as the engine efficiency is pushed to the practical limit, instabilities in the combustion process cause sporadic misfires and partial burns. Such events increase the cycle-to-cycle variability (CCV) of the combustion process which causes undesired levels of noise, vibration, and harshness (NVH). In order to mitigate combustion instabilities, the ECU should command a higher amount of fuel during combustion cycles where misfires or partial burns would otherwise occur. In this case, we would like to utilize a neuromorphic hardware system that will take as input information about the engine combustion’s state at each cycle and provide as output the amount of fuel to inject during the next cycle. The goal is to keep the engine running with few or no engine misfires and partial burns in order to minimize combustion CCV and keep the engine running smoothly, but also to minimize the amount of extra fuel needed to stabilize the combustion events.



FUTURE WORK AND CONCLUSIONS
In this work, we have demonstrated a complete neuromorphic workflow, from application to hardware, for training and deploying low size, weight, and power neuromorphic solutions for real-world applications. We demonstrated the results of this workflow on a control task to improve fuel efficiency in spark-ignition internal combustion engines. By utilizing low power AI hardware such as neuromorphic systems, we can potentially enable more efficient engines and reduce greenhouse gas emissions. We show that it is feasible to utilize an SNN approach as deployed on neuromorphic hardware to control an engine in simulation. We show that even this preliminary SNN approach can outperform current, open-loop control strategies. Moreover, we demonstrate the resulting SNNs are very small and sparse and can be deployed onto an inexpensive and low size, weight, and power commercial FPGA, providing the opportunity for rapid deployment.
There are several avenues that we intend to pursue for future work. First, the engine simulators we use in this work are based on a single set of engine operating conditions. We intend to apply this same workflow to train SNNs for other engine characteristic settings.
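The control loop the paper describes (combustion state of each cycle in, fuel command for the next cycle out) can be sketched in a few lines. Everything below is a made-up stand-in of my own: a fake engine model and a simple rule in place of the trained SNN, just to show the closed-loop structure.

```python
import random

# Toy closed loop in the spirit of the paper: the controller sees the
# previous cycle's combustion quality and picks the next cycle's fuel.
# The "engine" is a made-up stand-in, and a simple rule replaces the
# trained SNN; only the control structure is the point here.

random.seed(0)

def engine_cycle(fuel):
    """Fake engine: leaner fuelling raises the chance of a misfire."""
    misfire_prob = max(0.0, 0.5 - fuel)
    quality = 0.0 if random.random() < misfire_prob else fuel
    return quality  # heat released this cycle (0.0 = misfire)

def controller(last_quality, base_fuel=0.45, boost=0.15):
    """Stand-in for the SNN: add extra fuel after a weak cycle."""
    return base_fuel + (boost if last_quality < 0.1 else 0.0)

quality, total_fuel, misfires = 0.5, 0.0, 0
for cycle in range(1000):
    fuel = controller(quality)
    total_fuel += fuel
    quality = engine_cycle(fuel)
    misfires += (quality == 0.0)

print(f"misfires: {misfires}, avg fuel: {total_fuel / 1000:.3f}")
```

The trade-off the paper optimises is visible even in this toy: boosting after every weak cycle suppresses misfires but raises average fuel, so the controller's job is to spend as little extra fuel as possible while keeping CCV down.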
 

Tothemoon24

Top 20
I’ve always just assumed neuromorphic computing was mainly going to benefit EVs, but here’s a research paper using SNNs to improve the efficiency of ICE vehicles cheaply and quickly.


INTRODUCTION
Neuromorphic computers provide an intriguing platform for low power artificial intelligence (AI) at the edge [1]. However, there are several key hurdles to developing neuromorphic computing solutions to real-world applications, including finding sufficiently low power neuromorphic hardware, implementing the appropriate algorithms used to train a spiking neural network (SNN) to deploy onto the hardware platform, and ensuring that the trained SNN fits within the constraints of the pre-determined neuromorphic hardware system.
In this work, we present a complete pipeline for training and deploying a low size, weight, and power (SWaP) neuromorphic hardware solution for a real-world application. The real-world application of interest is an engine control unit (ECU) of a spark-ignition internal combustion engine. To improve engine efficiency and reduce greenhouse gas emissions, transportation researchers have developed advanced combustion strategies that minimize the amount of fuel needed to run the engine. However, as the engine efficiency is pushed to the practical limit, instabilities in the combustion process cause sporadic misfires and partial burns. Such events increase the cycle-to-cycle variability (CCV) of the combustion process which causes undesired levels of noise, vibration, and harshness (NVH). In order to mitigate combustion instabilities, the ECU should command a higher amount of fuel during combustion cycles where misfires or partial burns would otherwise occur. In this case, we would like to utilize a neuromorphic hardware system that will take as input information about the engine combustion’s state at each cycle and provide as output the amount of fuel to inject during the next cycle. The goal is to keep the engine running with few or no engine misfires and partial burns in order to minimize combustion CCV and keep the engine running smoothly, but also to minimize the amount of extra fuel needed to stabilize the combustion events.



FUTURE WORK AND CONCLUSIONS
In this work, we have demonstrated a complete neuromorphic workflow, from application to hardware, for training and deploying low size, weight, and power neuromorphic solutions for real-world applications. We demonstrated the results of this workflow on a control task to improve fuel efficiency in spark-ignition internal combustion engines. By utilizing low power AI hardware such as neuromorphic systems, we can potentially enable more efficient engines and reduce greenhouse gas emissions. We show that it is feasible to utilize an SNN approach as deployed on neuromorphic hardware to control an engine in simulation. We show that even this preliminary SNN approach can outperform current, open-loop control strategies. Moreover, we demonstrate the resulting SNNs are very small and sparse and can be deployed onto an inexpensive and low size, weight, and power commercial FPGA, providing the opportunity for rapid deployment.
There are several avenues that we intend to pursue for future work. First, the engine simulators we use in this work are based on a single set of engine operating conditions. We intend to apply this same workflow to train SNNs for other engine characteristic settings.

Hi TopCat.
Have a look at this list of trending tech ,

Can see the mighty chip ticking a few boxes 🚀

Top 19 New Technology Trends for 2022 - 2023​

By Archer Charles, 24-Jan-2023



https://www.koenig-solutions.com/blog/top-new-technology-trends#


We live in an era called the information age. New technology is emerging every day to make life simpler, more advanced and better for everyone. The rate at which technology is evolving is almost exponential today. For business organisations, new technology helps to reduce costs, enhance customer experiences and increase profits.
Nearly 50 billion devices will have internet connectivity by 2030. With the COVID-19 pandemic boosting businesses’ digital transformation journeys, this goal is closer than anticipated. Familiarity with the latest IT technology holds the key to advancing your career and exploring new opportunities.

Here are some of the Latest Top Technology Trends for 2022 - 2023:​

  1. Artificial Intelligence (AI) and Machine Learning (ML)
  2. Robotic Process Automation (RPA)
  3. Edge Computing
  4. Quantum Computing
  5. Virtual Reality (VR) and Augmented Reality (AR)
  6. Blockchain
  7. Internet of Things (IoT)
  8. 5G
  9. Cybersecurity
  10. Full Stack Development
  11. Computing Power
  12. Datafication
  13. Digital Trust
  14. Internet of Behaviours
  15. Predictive analytics
  16. DevOps
  17. 3D Printing
  18. AI-as-a-Service
  19. Genomics
Now, let's discuss the top technology trends.

1. Artificial Intelligence (AI) and Machine Learning (ML)​

Artificial Intelligence, also known as AI, started gaining popularity a decade ago. It has still not slowed down and continues to be one of the leading technologies in 2022-2023. AI is constantly evolving, and newer applications for this emerging technology continue to spring upon the scene. Today’s most popular AI applications are image and speech recognition, navigation programs, voice assistants like Siri and Alexa, and much more.
Organisations are looking to use AI to analyse customer and business interactions to derive insights and identify triggers. It will help them predict the demand for services such as hospitals or tourism and aid in the improvement of resource allocation for various projects.
Machine Learning (ML) is a part of AI and uses supervised learning to learn new functions. It has seen a massive surge in demand for skilled professionals, making it an attractive trend to watch. According to Forrester, AI and Machine Learning will be responsible for 9% of all new jobs in the US by 2025.

2. Robotic Process Automation (RPA)​

Robotic Process Automation (RPA) uses multiple software and applications to automate business processes such as data collection and analysis, customer service and other repetitive tasks managed previously through manual processes.
Like AI and Machine Learning, RPA is a rapidly advancing technology that automates many jobs across different industries. McKinsey has analysed that fewer than 5% of jobs today can be entirely automated, but nearly 60% can be automated at least partially.
RPA offers several new career options and trajectories such as a programmer, project manager, business analyst or consultant. It also opens doors to high-paying jobs with a moderate learning curve in leading organisations. Choosing this emerging technology as a career move can profit you immensely.

3. Edge Computing​

Today, millions of data points are collecting user information from various sources such as social media, websites, emails, and web searches. As the amount of data collected increases exponentially, other technologies like cloud computing fall short in several situations.
Until a decade ago, Cloud Computing was one of the fastest-growing technologies. However, it has become fairly mainstream, with the market dominated by major players such as Amazon Web Services (AWS), Microsoft Azure and Google Cloud Platform.
As more organisations adopted Cloud Computing, they found several limitations in the technology. Edge Computing helps bypass the latency involved in moving data to a data centre for processing: it sits ‘on the edge’, closer to where the data is generated and where processing ultimately needs to happen. Edge Computing is used to process time-sensitive data in far-off locations with limited or no connectivity.
Edge Computing applications will keep growing as IoT devices increase. Its market share is set to reach $6.72 billion by 2022.


4. Quantum Computing​

Quantum Computing is a type of computing that focuses on developing computer technology based on the principles of quantum theory. This theory explains the behaviour of energy and materials on atomic and subatomic levels. In other words, it performs calculations based on the probability of an object’s state before measurement instead of just 0’s and 1’s.
Quantum Computing can easily query, analyse and take action based on given data, regardless of the source. It played a major role in preventing COVID-19 and developing new vaccines. These computers are exponentially faster than normal computers. The revenue for the Quantum Computing market is projected to cross $2.5 billion by 2029.
You need experience with quantum mechanics, linear algebra, machine learning, and information theory to enter this field.

5. Virtual Reality (VR) and Augmented Reality (AR)​

VR and AR have been popular for almost a decade now. Virtual Reality immerses the user into a new environment, while Augmented Reality enhances the user’s existing environment. While their applications so far have been majorly linked with gaming and filters on social media, simulation software such as Virtual Ship is also used to train the US Navy, Army and Coast Guard ship captains.
A whopping 14 million AR and VR devices were sold in 2019. The global market for this trending technology is predicted to reach $209.2 billion by 2022, which means more job opportunities for professionals in this field.
By 2022, AR and VR are expected to integrate into our everyday lives much more deeply than today. They have huge potential and possible applications in training, entertainment, education, marketing and therapy or post-injury rehabilitation. It is also widely used by advertisers and brands to create new immersive experiences for their customers.
Starting a career in VR or AR doesn’t require too much specialisation. Basic programming skills and a forward-thinking mindset coupled with optics as a skill set can help you easily secure a job in this field.

6. Blockchain​

Blockchain was popularised in the context of cryptocurrency and Bitcoin and the security it provides. However, it offers security that can be useful in several other ways as well. Blockchain can be defined as data that you can only add to, not take away or alter. It results in many sections of data which form a ‘chain’, hence the name Blockchain.
The fact that existing data cannot be altered or removed makes Blockchain a highly secure technology. Blockchains are consensus-driven, which means no single person or organisation can take control of the data. There is no need for a third party to oversee transactions.
As more industries adopt and implement blockchains, the demand for skilled blockchain developers has also increased. It requires hands-on experience with programming languages, basic knowledge of OOPs, flat and relational databases, data structures, networking and web application development.
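The "only add, never alter" property described above is easy to demonstrate: each block stores the hash of the previous block, so editing old data invalidates everything after it. This is a toy in-memory sketch, not a real blockchain protocol.

```python
import hashlib
import json

# Toy hash chain: each block commits to the previous block's hash,
# so tampering with any earlier block breaks validation of the chain.

def make_block(data, prev_hash):
    block = {"data": data, "prev": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def valid(chain):
    for prev, cur in zip(chain, chain[1:]):
        payload = json.dumps({"data": cur["data"], "prev": cur["prev"]},
                             sort_keys=True).encode()
        expected = hashlib.sha256(payload).hexdigest()
        if cur["prev"] != prev["hash"] or cur["hash"] != expected:
            return False
    return True

chain = [make_block("genesis", "0")]
chain.append(make_block("tx: A->B 5", chain[-1]["hash"]))
chain.append(make_block("tx: B->C 2", chain[-1]["hash"]))
print(valid(chain))                # True
chain[1]["data"] = "tx: A->B 500"  # tamper with history
print(valid(chain))                # False
```

Real blockchains add consensus (so no single party can rewrite the chain) on top of this basic linking idea, which is what the paragraph above means by "consensus-driven".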

7. Internet of Things (IoT)​

It is one of the most promising technologies of the decade. Multiple devices or ‘things’ today are wifi-enabled, which means they can be connected to the internet. The Internet of Things is a network of diverse connected devices. Devices within the network can communicate with each other, collect data and transfer it across the network without human intervention.
There are hundreds of real-life Internet of Things (IoT) applications - from tracking activity using smart devices that connect to your phone, to remotely monitoring home doors or switching appliances on and off. Businesses also use IoT for many things, like monitoring activity in remote locations from a central hub and predicting when a device will malfunction so that corrective measures can be taken before it’s too late.
It is predicted that by 2030, over 50 billion devices will be connected via the Internet of Things. Global spending on this latest technology will reach an estimated $1.1 trillion in the next two years. IoT is currently in its initial stages and will advance rapidly in the near future. It requires knowledge of AI and Machine Learning fundamentals, as well as information security and data analytics.

8. 5G Technology​

5G technology has the potential to change the way we see the online world. 3G and 4G technology transformed how we interacted with mobile devices, enabling faster internet browsing, using data-driven services, and increasing bandwidth for live streaming.
5G aims to revolutionise our virtual interactions by integrating AR and VR technology and better cloud-based gaming experiences. It will also be used in factories and enterprises for monitoring and streamlining operations. 5G also has applications in road safety and rule implementation, smart grid control and smart retail experiences, in the form of live high-definition cameras.
Telecom companies around the world are working on creating 5G-ready services and devices. The technology was announced and rolled out in select places in 2020, with a worldwide launch expected in 2022. The launch of 5G has been delayed for a while but is set to quickly reach the world and become a part of every person’s life.

9. Cybersecurity​

Since the dawn of computers, cybersecurity has played a major role in ensuring safer user experiences. It is not a new trend, but given that technology is evolving rapidly, cybersecurity measures need to be constantly upgraded and improved. Threats and hacking attempts are growing in number and intensity, which calls for improving security protocols and strengthening systems against malicious attacks.
Data is the most valuable asset today, and hackers are constantly trying to steal data or information. This is why cybersecurity will always be a trending technology and need to constantly evolve to stay ahead of hackers. Demand for cybersecurity professionals is growing three times faster than any other tech jobs today. More and more businesses realise its importance, resulting in businesses spending about $6 trillion on cybersecurity by 2022.
Cybersecurity job roles transition from the ethical hacker to security engineer to Chief Security Officer. The pay is significantly more than in other technology job roles due to its significance in ensuring a secure user experience.

10. Full Stack Development​

This is one of the newest industry trends within the software domain to gain momentum. It continues to rise as IoT becomes a more mainstream technology with diverse applications. Full stack development covers both front-end and back-end development of applications and websites.
Organisations are working on developing more user-friendly and comprehensive apps for their target audience. For this, a full stack developer must have a deep understanding of server-side programming and web development. If you develop the skills needed to create a website, your services will always have a place in the industry. More and more businesses are moving to digital every day, increasing the demand for web developers and designers.

11. Computing Power​

The digital era has computerised every mobile device and application, firmly establishing computing power in this generation. Data scientists have predicted that the infrastructure used to harness this computing power is only going to evolve in the coming years. Computing power is giving us advanced technology to make our lives better and is also creating more jobs in the tech industry. Fields like data science, data analytics, IT management, robotics, etc have the potential to create the largest percentage of employment in the country. Many international brands and companies hire from India because of the specialised training widely available in the country. The more computing our devices need, the more specialised professionals will be required and the economy will flourish as a result.

12. Datafication​

When we convert parts of our lives into software and devices through data, we are going through datafication. In this process, data-driven technology takes over human chores and tasks. Smartphones, office applications, industrial machines and even AI devices, all use data to interact with us and improve our lifestyles. Data has been a part of our lives for longer than you can even imagine. Storing this data requires security, which has led to an increase in the demand for data security specialisations in our country. IT professionals, data professionals, engineers, managers, technicians, all work in the same field. Careers in the data field require more skill than high-level qualifications. These skills can be acquired by doing some courses that teach you how the world of data works.

13. Digital Trust​

As technology and mobile devices become woven into everyday life, people are developing a high level of trust in these technologies. That trust, in turn, is paving the way for further innovation. With various data-security measures in place, people believe technology can help us build a reliable, secure and safe digital world, which also lets companies invent and innovate without constantly worrying about data security. Cyber security and ethical hacking are a few specialisations that can be used to enter this field, and there is an array of jobs available both nationally and internationally. Professional certifications and regular courses are available for all of them and can lead to a high-paying role.

14. Internet of Behaviours​

This is one of the fastest-growing industry trends and has found applications in nearly every industry. IoB is a technology that uses data collected from user devices connected to the internet. The large volumes of data collected are then analysed and tracked to understand human behaviour.
With more and more devices connecting to the internet and more users adopting digital technology, this new technology is going to play a significant role in the Big Data, analytics, development and predictive-analytics domains. You may have heard of IoT, a network of physical devices connected to the internet in which each device can communicate with the others. In the future, more gadgets, such as automobiles and home appliances, are going to become 'smart' and join the network.

15. Predictive analytics​

Predictive analytics is the software domain that helps businesses forecast future customer behaviours and trends. Some of its most common applications are in risk management, marketing and operations. It is one of the most promising areas in the data field, alongside data science. If you are looking to enter the field of data, you can choose data science, big data or predictive analytics; that way, you will have future-proof skills that command some of the highest salaries in the industry today.
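At its simplest, a predictive model is just a function fitted to historical data and extrapolated forward. The toy sketch below illustrates the idea with made-up monthly sales figures and a plain least-squares trend line; no real analytics library or business data is involved.

```python
# Minimal illustration of predictive analytics: fit a least-squares
# trend line to past monthly sales and extrapolate the next month.
# The sales figures are invented example data, not from any real business.

def fit_trend(ys):
    """Return (slope, intercept) of the least-squares line through
    the points (0, ys[0]), (1, ys[1]), ..."""
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

def predict_next(ys):
    """Extrapolate the fitted line one step past the observed data."""
    slope, intercept = fit_trend(ys)
    return slope * len(ys) + intercept

sales = [100, 110, 121, 128, 142]       # five months of sales
print(round(predict_next(sales), 1))    # prints 150.8, the month-six estimate
```

Real predictive-analytics work layers far more on top (feature engineering, validation, seasonality), but the core loop of "fit on the past, score the future" is the same.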

16. DevOps​

DevOps combines the development and operations departments within an organisation to help enhance and automate the process of software development. DevOps has two leading applications within an organisation:
  • Shortening software delivery cycles
  • Improving product standards overall
By promoting collaboration between operations and development teams, enterprises can deliver software upgrades and new features to customers faster. It also means enterprises see fewer errors and enhanced product quality.
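Those two ideas, shorter delivery cycles through automation and catching errors before they reach customers, can be illustrated with a toy pipeline runner. The stage names and pass/fail lambdas below are stand-ins, not any real CI/CD tool:

```python
# Toy illustration of a DevOps-style automated delivery pipeline:
# stages run in order and the run stops at the first failure,
# so a broken build or failing test suite never reaches deploy.

def run_pipeline(stages):
    """Run (name, func) stages in order; return the names that completed."""
    completed = []
    for name, stage in stages:
        if not stage():          # a stage returning False fails the run
            print(f"{name} failed, stopping pipeline")
            break
        completed.append(name)
    return completed

stages = [
    ("build", lambda: True),
    ("test", lambda: False),     # simulate a failing test suite
    ("deploy", lambda: True),
]
print(run_pipeline(stages))      # prints ['build']
```

Real pipelines add parallelism, artifacts and rollbacks, but the fail-fast ordering shown here is the essential behaviour.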

17. 3D Printing​

3D printing has become a key technology for creating prototypes, especially in the industrial and biomedical fields. 3D printers let you produce a real, physical object straight from a digital model. Needless to say, this cost-effective innovation is here to stay. Brands and companies in the healthcare, data and industrial sectors need 3D printing for their products, which has boosted demand for people specialising in machine learning, 3D printing, modelling and AI.

18. AI-as-a-Service​

Several as-a-service offerings exist today, such as PaaS, IaaS and SaaS. AI-as-a-service is a cloud-based solution that offers AI capabilities on demand. Providers offer this service across many fields and industries, giving organisations access to advanced AI features without having to invest in costly hardware or its maintenance.
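As a hedged sketch of what consuming such a service can look like: the client class, method name and response shape below are entirely hypothetical (every real provider defines its own API), and the network call is stubbed out so the example runs offline.

```python
# Hypothetical AI-as-a-service client. FakeAIService stands in for a
# cloud endpoint such as an image-labelling API; a real client would
# POST the image to the provider and parse a JSON reply. All names and
# the response shape here are invented for illustration.

class FakeAIService:
    """Stub for a cloud AI endpoint; returns a canned classification."""
    def classify(self, image_bytes):
        return {"label": "person", "confidence": 0.97}

def detect_person(service, image_bytes, threshold=0.9):
    """Return True if the service labels the image 'person' confidently."""
    result = service.classify(image_bytes)
    return result["label"] == "person" and result["confidence"] >= threshold

print(detect_person(FakeAIService(), b"fake-image"))  # prints True
```

The point of the pattern is that the application only needs the thin `detect_person` wrapper; all the expensive model hardware lives with the provider.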

19. Genomics​

Can technology study your DNA and help you fight diseases and improve your health? Yes! The field of genomics uses technology to study and map the make-up of your DNA and genes. It also helps quantify genes, making it easier for doctors to spot potential health issues early. There are various technical and non-technical roles in this field: tech jobs include analysis, design and diagnostics, while non-tech jobs include theoretical analysis and research.
The world of technology evolves rapidly and trends change quickly. As an IT professional, you need to stay on top of these trends to build a successful career. What’s more, if you are looking for one area of expertise to direct your career toward, understanding how each of these technologies works, together and by itself, will give you an edge over the other applicants applying for the same job as you.
The biggest pro and con of living in a fast-changing world is that you must keep yourself updated and learn for life to stay relevant. Upskilling and growing have become exponentially more rewarding as the demand and remuneration for such professionals are skyrocketing. Boost your career in tech by learning from the best at Koenig.

What Next?

You now know which technologies are going to make the greatest impact in the years to come. Keeping in mind the potential and promising career options they offer and your personal goals, start training in a technology you like. Get the early bird advantage and sharpen your skills so that you’re among the first to embrace these technologies.
 

VictorG

Member
Just further to AIZip.

Does look like the MAX78000.

Maxim, Aizip Partnering to Develop IoT Person Detection Device - News

April 22, 2021
Perry Cohen | Maxim Integrated | Aizip
Maxim Integrated and Aizip announced that the MAX78000 neural-network microcontroller detects people in an image using Aizip's Visual Wake Words (VWW) model at just 0.7 millijoules (mJ) of energy per inference.
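For a rough sense of scale of that 0.7 mJ figure (my own back-of-envelope assumptions, not numbers from Maxim or Aizip): a common CR2032 coin cell of roughly 225 mAh at 3 V could in principle power millions of inferences.

```python
# Back-of-envelope: how many 0.7 mJ inferences could a coin cell supply?
# Battery figures are approximate assumptions, not vendor data.

cell_capacity_ah = 0.225          # typical CR2032 capacity, ~225 mAh
cell_voltage = 3.0                # nominal cell voltage, volts
energy_joules = cell_capacity_ah * cell_voltage * 3600   # Wh -> J

inference_j = 0.7e-3              # 0.7 millijoules per inference
inferences = energy_joules / inference_j

print(f"{inferences / 1e6:.1f} million inferences")  # prints 3.5 million inferences
```

Ignoring conversion losses and standby draw, of course, but it shows why that energy-per-inference number matters for battery-powered edge devices.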
Nice work FMF, I think we can put the question of the current shipment to rest.

I'd be very interested to see who their new partners are, I have a feeling BRN will be one of them.


buena suerte :-)

BOB Bank of Brainchip
Sorry if I missed the connection along the way, but can someone fill me in on when/how Aizip are doing an Akida demo?
According to RT, BRN will be located at booth 2-238 at the Embedded World 2023.
According to Embedded World's website, a company called AIZIP is in booth 2-238

AIZIP doesn't list BRN as partners but they do say they have new partnerships to be announced shortly.
Interestingly, though, AIZIP's link to their partnership with Renesas describes everything Akida does. (posted earlier by @VictorG)
 

GazDix

Regular
According to RT, BRN will be located at booth 2-238 at the Embedded World 2023.
According to Embedded World's website, a company called AIZIP is in booth 2-238

AIZIP doesn't list BRN as partners but they do say they have new partnerships to be announced shortly.
Interestingly, though, AIZIP's link to their partnership with Renesas describes everything Akida does. (posted by @VictorG earlier)
Love it BienSuerte. Full Moon Fever went down this road Thursday, I dug in midday and now you are here!
Aizip are partners with Renesas.
Imagimob are partners with Renesas and ARM.
Arduino are partners with Intel, but hard to say who the others are.
Nota AI are partners with Arm and Intel.
Syntiant Corp are partners with all the others on the list apart from Nota.

Brainchip are not listed as partners on any of the above websites.
I guess these companies could be the prime suspects for our next announced partners, seeing they all deal with different aspects of IoT.
Here are the folks who will join us.
 