I’ve been reading some more from the end-of-2019 company progress update and I can’t work out what ADE stands for. Anyone know?
Talk about intellectual property licensing a bit more. There were a lot of questions about it, and I think in part that's because we've voiced a strong opinion, coming in advance of actual device sales. There's no manufacturing process involved. There's no inventory. There's no loan package qualification by the customer. We released that in 2019. We have received strong response from prospective customers. The ADE, in the hands of one major South Korean company, is being exercised almost as much as we exercise it. They really have dug in, validated some of the benchmark results that we've provided, and now they're moving onto some of their own proprietary networks to do validation.
Thank you @Steve10, and I appreciate all the research you’ve provided lately. ADE stands for Akida™ Development Environment.
BrainChip’s Akida Development Environment Now Freely Available for Use
Develop and Deploy Deeply Learned Neural Networks on Akida in a Standard TensorFlow/Keras Environment
SAN FRANCISCO–(BUSINESS WIRE)–
BrainChip Holdings Ltd. (ASX: BRN), a leading provider of ultra-low power, high-performance edge AI technology, today announced that access to its Akida™ Development Environment (ADE) no longer requires pre-approval, now allowing designers to freely develop systems for edge and enterprise products on the company’s Akida Neural Processing technology.
ADE is a complete, industry-standard machine learning framework for creating, training and testing deeply learned neural networks. The platform leverages TensorFlow and Keras for neural network development, optimization and training. Once the network model is fully trained, the ADE includes a simple-to-use compiler to map the network to the Akida fabric and run hardware-accurate simulations on the Akida Execution Engine. The framework uses the Python scripting language and its associated tools and libraries, including Jupyter notebooks, NumPy and Matplotlib. With just a few lines, developers can run the Akida simulator on industry-standard datasets and benchmarks in the Akida model zoo, such as Imagenet1000, Google Speech Commands and MobileNet, among others. Users can easily create, modify, train and test their own models within a simple-to-use development environment.
ADE comprises three main Python packages:
- the Akida Execution Engine, including the Akida Simulator, which is the interface to the BrainChip Akida neural processing hardware. To allow the development, optimization and testing of Akida models, it includes a software backend that simulates the Akida NSoC. The Akida Execution Engine also generates all the files necessary to run the network on the Akida neural processor hardware itself.
- the CNN development tool, which utilizes TensorFlow/Keras to develop, optimize and train deeply learned neural networks such as CNNs.
- the Akida model zoo, which contains pre-created neural network models built with the Akida sequential API and the CNN development tool using quantized Keras models.
“The enormous success of our early-adopters program allowed us to make ADE available to developers looking to use an Akida-based environment for their deep machine learning needs,” said Louis DiNardo, CEO of BrainChip. “This is an important milestone for BrainChip as we continue to deliver our technology to a marketplace in search of a solution to overcome the power- and training-intense needs that deep learning networks currently require. With the ADE, designers can access the tools and resources needed to develop and deploy Edge application neural networks on the Akida neural processing technology.”
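The model zoo described above is built from quantized Keras models. As a rough illustration of what low-bit weight quantization involves, here is a short pure-Python sketch of a generic uniform 4-bit scheme; this is a textbook example only, and is not BrainChip's actual (unpublished) quantization method:

```python
def quantize_weights(weights, bits=4):
    """Uniformly quantize a list of float weights to signed `bits`-bit
    integer levels plus one float scale (generic scheme, not BrainChip's)."""
    qmax = 2 ** (bits - 1) - 1                # e.g. 7 levels each side for 4-bit
    scale = max(abs(w) for w in weights) / qmax or 1.0
    # Round each weight to the nearest level, clamping to the representable range
    q = [max(-qmax, min(qmax, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from integer levels and scale."""
    return [v * scale for v in q]

w = [0.12, -0.7, 0.33, 0.05]
q, s = quantize_weights(w, bits=4)
w_hat = dequantize(q, s)
```

Storing a handful of integer levels instead of 32-bit floats is what lets edge hardware trade a small accuracy loss for large savings in memory and power, which is the motivation for the quantized models in the zoo.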
Akida is available as a licensable IP technology that can be integrated into ASIC devices, and will be available as an integrated SoC, both suitable for applications such as surveillance, advanced driver assistance systems (ADAS), autonomous vehicles (AV), vision-guided robotics, drones, augmented and virtual reality (AR/VR), acoustic analysis, and Industrial Internet-of-Things (IoT). Akida is a complete neural processing engine for edge applications, which eliminates CPU and memory overhead while delivering unprecedented efficiency and faster results at minimal cost. Functions like training, learning and inferencing are orders of magnitude more efficient with Akida.
Access to ADE is currently available online at https://doc.brainchipinc.com/. Among the resources are installation information, a user guide, an API reference, Akida examples, and support and license documentation. ADE requires TensorFlow 2.0.0; any previously used virtual environment will need to be updated as per the installation steps.
How embedded vision and AI will revolutionise industries in the coming future
15 March 2023
As we stand on the cusp of the Fourth Industrial Revolution, the integration of embedded vision systems and artificial intelligence (AI) is poised to unleash a wave of disruption that will revolutionise industries as diverse as healthcare, manufacturing, transportation, and retail.
With the ability to process massive amounts of data in real time and make complex decisions with astonishing speed and accuracy, these technologies have the potential to transform the way businesses operate, optimise supply chains, enhance product quality, and deliver unparalleled customer experiences. As we look to the future, it is clear that those companies that are able to harness the power of embedded vision systems and AI will be the ones that thrive in an increasingly competitive and dynamic marketplace.
The application of embedded vision systems and AI has extended beyond their traditional use cases, spurring new and innovative solutions across industries. For instance, the use of AI-powered chatbots has significantly improved customer service, providing 24/7 support and reducing response times. Additionally, AI algorithms have been used to predict and prevent equipment failure in manufacturing, reducing downtime and improving overall efficiency. In the healthcare industry, embedded vision systems and AI have enabled the development of precision medicine, allowing for accurate diagnoses and targeted treatment plans.
Furthermore, the convergence of these technologies has also facilitated the development of new forms of human-machine interaction, such as gesture recognition and voice-controlled interfaces. These innovations have led to the creation of more intuitive and seamless user experiences in areas such as gaming, virtual reality, and augmented reality. The widespread adoption of embedded vision systems and AI has also resulted in a growing demand for skilled professionals capable of designing, developing, and implementing these solutions. As such, academic institutions and training programmes are increasingly offering courses and degrees focused on these technologies, providing the necessary skills for individuals seeking careers in these areas.
Together, embedded vision systems and AI have the potential to revolutionise industries in the years ahead. Some of the industries set to benefit from this technological convergence are discussed below.
Autonomous vehicles
The automotive industry is a hotbed of innovation, and the integration of embedded vision systems and AI represents a significant leap forward in this industry's technological capabilities. One of the most compelling examples of this integration is in the development of autonomous vehicles, which are poised to revolutionise transportation as we know it. Embedded vision systems, comprising cameras, lidar, and other sensors, capture visual data in real-time, which is then transmitted to AI algorithms that analyse the information and make decisions based on predefined parameters.
AI-powered embedded vision systems comprising cameras, lidar, and other sensors have enabled the development of autonomous vehicles (Image: Shutterstock/Scharfsinn)
In addition to revolutionising consumer transportation, autonomous vehicles will undoubtedly also be a game-changer for the logistics industry, where they are already being deployed to pick and transport packages in warehouses.
Healthcare
The healthcare industry also stands to benefit greatly from the integration of embedded vision systems and AI, as it promises to transform the field of medical imaging and diagnosis. The ability of embedded vision systems to capture high-resolution images of internal organs and tissues has already shown promise in detecting and diagnosing health issues. However, the true potential of this technology lies in AI's ability to analyse these images and identify patterns that may not be visible to the human eye. By leveraging machine learning algorithms, medical professionals can achieve faster and more accurate diagnoses, leading to better patient outcomes. This technology could also lead to reduced costs by minimising the need for additional tests and consultations.
With the integration of embedded vision systems and AI, the healthcare industry has the potential to revolutionise the way medical diagnoses are made, paving the way for more efficient and effective healthcare solutions.
Sport broadcasting
Sports broadcasting is another industry poised for significant transformation with the integration of embedded vision systems and AI. With the help of AI, sports broadcasters can enhance the viewing experience of their audiences by providing them with real-time statistics, player tracking, and other vital information during live matches. Embedded vision systems can capture and process high-definition images of sports events, and AI algorithms can analyse this data to provide viewers with unique insights into the game, such as player heat maps, ball speed and trajectory, and other performance metrics.
This data can be used by coaches and analysts to make critical decisions, while sports broadcasters can use it to create more engaging and informative content for their viewers. With the integration of embedded vision systems and AI, the sports broadcasting industry is set to undergo a significant transformation, providing viewers with a more immersive and insightful experience.
Retail
The retail industry holds immense potential for significant advancement through the integration of embedded vision systems and AI, which could revolutionise the shopping experience for customers and improve business outcomes for retailers.
One of the most revolutionary and innovative applications of these technologies is the concept of autonomous shopping, which entails a comprehensive network of high-tech cameras, state-of-the-art sensors, and AI-powered algorithms automating the entire shopping experience. This cutting-edge approach provides customers with the ability to simply walk into a store, select the items they desire, and exit without the need for interaction with a cashier or checkout system. Essential to this process, embedded vision systems enable instantaneous object detection and recognition, facilitating the AI algorithms' identification of products while simultaneously tracking their placement into the customer's cart. Furthermore, data gathered through these systems is instrumental in supporting retailers to optimise their inventory management, streamline store layout, and enhance overall customer experience.
In addition, AI-powered recommendation engines, which analyse customer data to provide personalised product recommendations, represent a major advancement in the field of customer engagement. The amalgamation of embedded vision systems and AI therefore has great potential to revolutionise the retail industry, dramatically transforming the way we approach the shopping experience as a whole.
Conclusion
The convergence of embedded vision systems and AI has tremendous implications for a wide range of industries, with there being great promise for the transformative potential of these technologies. The progress shared here that has been made thus far in the automotive, healthcare, sports broadcasting and retail industries underscores the potential of this technology to improve efficiency, reduce costs, and enhance productivity. Nevertheless, it is important to address challenges such as privacy concerns and the ethical development of AI to ensure that these technologies are used responsibly and sustainably.
As industry experts continue to explore the possibilities of embedded vision systems and AI, it is clear that this technology will continue to shape and transform industries in the years to come.
Does this not indicate potential funding for BRN? Why concerning?
Definitely cacking the pants, as I bought in at 4.9 cents. To see the SP go from there to $2+, and still holding!
Hi @SERA2g. An observation: I haven't seen much of you the last couple of months; your always educational and entertaining posts are very much appreciated. In the interest of potentially answering some initial questions others might have about Spiral Blue...
From @Diogenese (with permission) earlier this month. I ran Spiral Blue's space edge computer past him for his opinion.
"Hi SeRA2,
No published patent docs, but they are not published until 18 months after filing.
They use Nvidia, so they are probably software.
https://spiralblue.space/space-edge-computing
Space Edge Computers use the NVIDIA Jetson series, maximising processing power while keeping power draw manageable. They carry polymer shielding for single event effects, as well as additional software and hardware mitigations. We provide onboard infrastructure software to manage resources and ensure security, as well as onboard apps such as preprocessing, GPU based compression, cloud detection, and cropping. We can also provide AI based apps for object detection and segmentation, such as Vessel Detect and Canopy Mapper.
Note our friend "segmentation" is along for the ride."
How many amputees are there from land mine explosions.
Space will not be a volume buyer for decades but robots are in factories around the world already.
Underground mines are a robotic target market.
If AKIDA proves itself in space on global television carrying out on the fly repairs just consider the marketing value.
My opinion only DYOR
Yes, I agree. But in terms of this news, some people are expecting an ASX announcement. That won't happen, IMO, because it is not a revenue-producing contract; or if it is, then it should already have been an ASX announcement before we heard wind of it through the other party. It's not material news, is what I was referring to.
Same here, I just topped up my SMSF with another 10,000. These low prices are like buying from Coles on half-price specials. Couldn't help myself; topped up with more shares yesterday. I am all in.
Love reading through all the post here thanks for the input all.
All the best, Sirod! For tomorrow I've ordered another 5,000 at 0.28 (Europe); if that doesn't fill, I'll have to sell a sofa as well. I'm also grateful if the order doesn't fill. The share price is really awful! Of course we are all long here. I found the Teksun thing very exciting; word has gotten around the world. I'll be operated on on Friday, so I'll be out of here again for a few days. I hope I can read everything here again afterwards. Nice to be in this forum!
F/F, pleased to see u've hardened up and are back contributing. As ltshs, we've all been smashed, both by naysayers and the s/p. It's reminiscent of the early days, when I remember being down over 50% on my holdings and kept buying, due to research. That business spiderweb for BRN keeps expanding. Very encouraging to see. Ur 1 of many whose input is highly valued. I say to u, and all: trolls are a waste of space. Don't be, and don't get, disheartened. That puts the WANCAS in the winners' circle. No-one needs to pump this co. Its achievements will do that for us. Looking forward to the time when late adopters are in awe and shorters get smoked. Best of health to u, and keep on keeping on. I, like many, luv ur informative posts. Keep them coming! Just for interest, this is the link to Spiral Blue's Edge Compute products, powered by Nvidia, with detailed spec sheets:
Spiral Blue | Space Edge Computing
Spiral Blue is developing the Space Edge Computer, an onboard computing system that will give Earth observation satellites the ability to process images captured on the satellite itself. This has the potential to increase the capacity of Earth observation satellites carrying Space Edge... www.spiralblue.space
How can AKIDA, running at micro- to milliwatts, with on-chip learning and a price tag of tens of US dollars, compete with their current Nvidia offerings? (Please note this is rhetorical humour.)
What did Edge Impulse say again? "Science fiction": AKIDA at 300 megahertz can outperform a GPU running at 900 megahertz.
My opinion only DYOR
FF
AKIDA BALLISTA