Fact Finder
Sorry, did I forget to mention someone? **Insert dad joke**
Geez... her name might be hard to forget.
Good morning all,
Inspired by a teardown article yesterday, I had a closer look at the specs of the GoPro HERO10 camera.
GoPro HERO 10 Teardown
A teardown of the GoPro HERO10, codenamed Kong, including what changed in hardware from the HERO9. The ARMv8 64-bit GP2 processor, also known as a Socionext M20V, is the only major change. The whole codebase was upgraded to 64-bit, so be wary of buying this right out of the gate. (gethypoxic.com)
One interesting thing for me is that they removed the "DSP Group DBMD4" that was in the GoPro HERO9 camera.
The DBMD4 is an "Ultra Low Power Always-On Voice Activation for Any Device" audio DSP pre-processor.
A customer comment on the GoPro forums suggested this was due to the camera often accidentally powering on, low user interest, and the chip shortage.
GoPro Support (community.gopro.com)
Which leads me to the DSP Group YouTube channel, and I would like to hear from someone with more expertise about the benchmarks between AKIDA and DSP Group on keyword spotting.
By the way, DSP Group has been acquired by Synaptics, which was discussed here as a competitor earlier this year.
"In my opinion brainchip AKIDA IP isn't in this version but will be in the new MB OS system in 2024."
Which will really be in 2023.
"Though we know the ASX is not very proactive and they are barely covering off on their basic duties as it is. A lot of companies are getting away with a lot of different things unfortunately."
I am not sure if this was for me or Cardpro, but either way, trade secret or not, it makes clear that the ASX can decide that it should be disclosed.
When a company makes an announcement, it takes the risk that the ASX will intervene and require the disclosure of information that it wants, needs or has agreed to keep confidential. See 3.1A.2 of Guidance Note 8.
Of course, in the absence of an announcement, if the ASX becomes aware from any source that the company may be sitting on material information, it can intervene and force disclosure.
My opinion only DYOR
FF
AKIDA BALLISTA
"View attachment 15381
Is that a new whitepaper released, 'Learning how to learn neuromorphic AI at the edge', on the BRN website, or did I miss its release in the past?"
This is a great read. Thanks for posting @Dozzaman1977.
In my opinion brainchip AKIDA IP isn't in this version but will be in the new MB OS system in 2024.
"Though we know the ASX is not very proactive and they are barely covering off on their basic duties as it is. A lot of companies are getting away with a lot of different things unfortunately."
And it stinks. I have seen this type of enforcement across so many areas. When enforcement organisations are failing, they try to disguise their failure by selecting high-visibility targets that get them publicity.
View attachment 15381
Is that a new whitepaper released, "Learning how to learn neuromorphic AI at the edge", on the BRN website, or did I miss its release in the past?
Hi Dozz,
Yes, it's new. Great find! I checked the document properties to find the creation date, and it was created on 24 Aug 2022.
See below
View attachment 15388
"This is a great read. Thanks for posting @Dozzaman1977."
Tell me this isn't Valeo's next-generation LiDAR: "Enabling advanced LiDAR with AKIDA Neuromorphic AI inference at the edge".
Q: BrainChip customers are already deploying smarter endpoints with independent learning and inference capabilities, faster response times, and a lower power budget. Can you give us some real-world examples of how AKIDA is revolutionizing AI inference at the edge?
Peter: Automotive is a good place to start. Using AKIDA-powered sensors and accelerators, automotive companies can now design lighter, faster, and more energy efficient in-cabin systems that enable advanced driver verification and customization, sophisticated voice control technology, and next-level gaze estimation and emotion classification capabilities.
Revolutionizing AI inference at the edge
In addition to redefining the automotive in-cabin experience, AKIDA is helping enable new computer vision and LiDAR systems to detect vehicles, pedestrians, bicyclists, street signs, and objects with incredibly high levels of precision. We’re looking forward to seeing how these fast and energy efficient ADAS systems help automotive companies accelerate the rollout of increasingly advanced assisted driving capabilities. In the future, we’d also like to power self-driving cars and trucks. But we don’t want to program these vehicles. To achieve true autonomy, cars and trucks must independently learn how to drive in different environmental and topographical conditions such as icy mountain roads with limited visibility, crowded streets, and fast-moving highways.
Enabling advanced LiDAR with AKIDA Neuromorphic AI inference at the edge
With AKIDA, these driving skills can be easily copied and adapted by millions of self-driving vehicles. This is a particularly important point. AKIDA driving updates will be based on real-world knowledge and skills other cars have learned on the road. We want to avoid boilerplate firmware updates pushed out by engineering teams sitting in cubes. A programmed car lacking advanced learning and inferential capabilities can’t anticipate, understand, or react to new driving scenarios. That’s why BrainChip’s AKIDA focuses on efficiently learning, inferring, and adapting new skills.
Q: Aside from automotive, what are some other multimodal edge use cases AKIDA enables?
Peter: Smart homes, automated factories and warehouses, vibration monitoring and analysis of industrial equipment, as well as advanced speech and facial recognition applications. AKIDA is also accelerating the design of robots using sophisticated sensors to see, hear, smell, touch, and even taste.
"Hi Dozz,
Yes it's new, great find, I checked out the document properties to find the creation date & it was created on the 24th Aug 2022."
Yes, brand new, and I have reported a fault with the security access via my iPhone which prevents me from advancing to the document. The thing I find funny is that Ken the Robot with AKIDA can pass all these select-a-boat-or-bus-or-plane tests they throw up to eliminate robots from accessing. I have also mentioned the humour involved here to the company as well. LOL
"Tell me this isn't Valeo's next generation LiDAR: Enabling advanced LiDAR with AKIDA Neuromorphic AI inference at the edge"
You're 100% on the money.
"Tell me this isn't Valeo's next generation LiDAR: Enabling advanced LiDAR with AKIDA Neuromorphic AI inference at the edge"
Hi @Slade,
Exactly. Nintendo has only the Switch console in its lineup for now. An update is coming in 2023; its technical specs were leaked earlier this year, and there is no Akida inside. According to the usually well-informed games scene, a new console will not appear before 2026. When a new console with new in-game functionalities arrives, it first needs new games that can handle the new features. Games have a very long development cycle, and there is no way to speed this up.
So anything about controller updates is much closer to reality than speculation about a brand-new console coming soon containing Akida IP.
I don't think anyone could say 100% that there won't be Akida inside the 2023 update. I understand what you're saying about the fact that new games may have to be developed to utilize Akida to its fullest capacity, and that the development time-frame for creating new games is particularly long.
But there's no way of knowing that Akida won't be in the upgrade to assist with other things like AI, self-learning features, noise reduction, etc., whilst new games are being developed. Just a thought?
Cheers TechGirl
Chapter 3 is exciting. It's all been stated before, but hopefully the world is moving in the Akida direction!!!!
Q: With AKIDA’s neuromorphic architecture, BrainChip is enabling the semiconductor industry to untether edge AI from cloud data centers. This is quite timely, because conventional AI silicon and cloud-centric inference models aren’t performing efficiently at the edge, even as the number of edge-enabled IoT devices is expected to hit 7.8 billion by 2030. Can you elaborate on the notion of untethering?

Peter: Increasing internet congestion is increasing latency as more edge devices upload their data. The power consumption and heat production of massively parallel Von Neumann-type processors is also increasing linearly with the computing power required by AI applications. That’s why untethering edge AI from the cloud with AKIDA is a critical step to designing faster and more environmentally sustainable endpoints.
Chapter 3:
Untethering edge AI from cloud data centers
Differentiating intelligent endpoint requirements

Data Center / Server Edge:
- Power intensive
- High latency
- Huge memory requirement
- Big data inference
- High bandwidth
- Privacy concerns

Endpoint (The Edge):
- Power efficient
- Ultra-low latency
- Small memory requirement
- Small data inference, one-shot learning
- Low bandwidth
- On-chip, in-device
- Privacy enabling
Data centers hosting cloud-based workloads emitted an estimated 600 megatons of greenhouse gases in 2020 alone, more than the emissions of the entire United Kingdom. Unless something radically changes, data centers will consume over 20% of the world’s energy by 2050! With its on-chip learning and low-power, high-throughput inference capabilities, we believe AKIDA can help reduce data center carbon emissions by 98% by decentralizing AI processing. Intelligently analyzing data on-chip will help put an end to the yottabytes of raw, unprocessed, and mostly irrelevant data sent to cloud data centers by millions of endpoints, solving the impending internet congestion problem.
Using image recognition as an example, we can quantify the power savings enabled by AKIDA’s on-chip capabilities compared to a GPU in today’s data center. Specifically, AKIDA can efficiently analyze and categorize the 1.2 million images of the ImageNet dataset with a minimal power budget of 300 milliwatts. A GPU performing this task consumes up to 300 watts! This huge difference illustrates why simply scaling down conventional AI hardware to meet the unique requirements of edge endpoints is insufficient.
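The figures quoted above work out to a three-orders-of-magnitude gap. A quick sanity check of the arithmetic, using only the numbers stated in the passage:

```python
# Power figures quoted above: AKIDA at ~300 mW vs. a data-center GPU at up
# to ~300 W for classifying the 1.2 million ImageNet images.
akida_watts = 0.300   # 300 milliwatts
gpu_watts = 300.0     # up to 300 watts

ratio = gpu_watts / akida_watts
print(f"The GPU draws {ratio:.0f}x the power of AKIDA")  # → 1000x
```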
Hi FF,
Consider the communications possibilities with ARM, MegaChips and LON systems:
The Smart City Blog
Smart Building Network Based on HD-PLC
HD-PLC is the new standard for high-speed wireline networks. Based on the IEEE 1901 and ITU-T G.9905 international standards, HD-PLC offers megabit speeds over up to several kilometers of wiring (AC/DC powerlines, twisted-pair, coax, phone lines, etc.). The combination of high-speed, long-distance communication, and physical media flexibility makes it well suited for a wide variety of building automation systems (BAS).
In this article, we review an example smart building network built using HD-PLC. This network can integrate multiple systems found in modern smart buildings, including IP cameras, environmental sensors, security systems, building comfort systems, lighting, and more. Thanks to HD-PLC’s multi-hop technology, any node can act as a repeater to extend network range and robustness. This network can scale to support up to 1024 nodes, distributed across the building, using any wiring available.
Support High-Bandwidth Applications
Originally developed for multimedia networks in residential applications, HD-PLC supports blazing-fast megabit data rates (up to 240Mbps, PHY) over powerlines and other physical media. It has since been enhanced with industrial-grade robustness mechanisms to ensure reliable operation in smart building systems. These characteristics make it well-suited for IP camera networks.
The left side of the network diagram shows a standard CCTV surveillance application using HD-PLC CCTV-IP cameras. HD-PLC uses IP-based communications (IPv4 or IPv6 multicasting) to encapsulate packets as UDP messages. This means that IP cameras are able to draw power from the main supply and also use it to transfer high-definition video to the central surveillance station. This simplifies the deployment of IP-based CCTV systems, since you don’t need to install new wiring to support the bandwidth required by these cameras.
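As a rough illustration of the IP encapsulation described above, the sketch below sends one payload to an IPv4 multicast group over UDP. The group address, port, and payload are invented for the example; an actual HD-PLC camera network handles this transport at the modem/firmware level.

```python
import socket

# Illustrative only: an IPv4 UDP multicast send, the style of transport the
# passage says HD-PLC uses to carry IP camera traffic. Group address, port,
# and payload are made up for this sketch.
MCAST_GRP = "239.1.2.3"   # administratively scoped multicast group
MCAST_PORT = 5004

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
# Keep the TTL small so packets stay within the local network.
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 2)
# Route via loopback so this self-contained demo works without a LAN.
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_IF,
                socket.inet_aton("127.0.0.1"))
sent = sock.sendto(b"frame-chunk-0001", (MCAST_GRP, MCAST_PORT))
sock.close()
```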
Develop Faster, Smarter LON Systems
In 2017, LonMark adopted HD-PLC as the new standard for high-speed wireline networks. Compared to the previous LonWorks solution that was limited to 78kbps over twisted-pair and just 5.4kbps over power line, HD-PLC brings a significant speed advantage to wired control networks. Now system designers can get the cost savings of using existing wiring infrastructure along with the bandwidth they need to build smarter systems.
The right side of the network diagram illustrates a typical LON building automation scenario. The backbone for this network is provided by the main power line, to which a variety of standard LON systems are attached. HD-PLC is fully compatible with LON, and all of the devices are able to behave exactly as they would in any other LON system.
At the top of the diagram is the LON network server (LNS) with SCADA and BMS clients. LonMaker for Windows is another client that one might use in this network to manage installation and maintenance software. These clients are easily connected to the powerline backbone using an HD-PLC/Ethernet adapter.
Connected below the LNS database are HD-PLC LON nodes. These can be connected directly to the powerline, as in Section A. Alternately, they can be connected to an Easylon® HD-PLC router from Gesytec (Sections B and C), which enables seamless translation from HD-PLC LON to other LonWorks channels like the familiar TP/FT-10 physical layer.
In this smart building network example, HD-PLC replaces the existing LON/IP backbones, which require costly Ethernet cabling. By using the same wiring infrastructure used for power distribution, HD-PLC greatly reduces installation cost. It also improves system reliability, since the backbone is now independent of other IT equipment like Ethernet switches.
Simplify Network Deployment and Management
System integrators appreciate the simplicity of HD-PLC. Its built-in multi-hop technology takes the guesswork out of network planning and design by providing the plug-and-play simplicity of a mesh network. With this technology, the nodes in the network dynamically calculate route cost and select the best route based on link quality. This eliminates bottlenecks and improves robustness, since the network will automatically reroute traffic if any given node fails.
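The cost-based route selection described above can be sketched with a textbook shortest-path search: each link gets a cost derived from link quality, and nodes pick the cheapest route through the mesh. This is a toy model with an invented topology and link costs, not MegaChips' actual routing implementation:

```python
import heapq

# Toy mesh: link costs model link quality (lower = better). The node names
# and costs are invented purely for illustration.
links = {
    "meter":    {"hallway": 1, "basement": 4},
    "hallway":  {"basement": 1, "rooftop": 5},
    "basement": {"rooftop": 1},
    "rooftop":  {},
}

def best_route(start, goal):
    """Dijkstra's algorithm: lowest total cost path from start to goal."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, c in links[node].items():
            if nxt not in seen:
                heapq.heappush(queue, (cost + c, nxt, path + [nxt]))
    return None  # unreachable

print(best_route("meter", "rooftop"))
# → (3, ['meter', 'hallway', 'basement', 'rooftop'])
```

If the "basement" node failed, removing it from the graph would make the search fall back to the next-cheapest route, which is the rerouting behavior the passage describes.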
MegaChips HD-PLC solution comes with Network Manager software that makes it easy to configure, monitor, and manage complex networks. The bottom of the network diagram shows how the installation laptop can be connected using an HD-PLC/USB adapter to easily manage the network. Simply plug the HD-PLC/USB adapter into any power outlet, and you’re instantly able to connect to the building automation network.
Getting Started with HD-PLC
It’s easy to upgrade to megabit data rates; build larger, more robust IIoT networks; and meet new cybersecurity demands with MegaChips’ HD-PLC SoC with Multi-hop. Combining a state-of-the-art analog front-end (AFE) with baseband, physical (PHY), and media access control (MAC) layers into a single compact package, the MegaChips solution enables you to reliably deliver high-speed, bidirectional, IP-based communications over any wiring.
MegaChips provides all the design resources and tools you need to quickly develop and deploy products with HD-PLC. Evaluation kits give you the hardware, software, and documentation to get started. An SDK is also provided with sample firmware and command programs, as well as tools for power control, channel monitoring, and more. And when you’re ready to deploy, MegaChips’ Network Manager makes it easy with an intuitive interface for configuring, monitoring, and managing networks.
Need to get to market even faster? MegaChips has partnered with Gesytec, the leading supplier of LON interfaces, to give you a market-ready HD-PLC module complete with Gesytec’s Easylon Protocol Stack (EPOS).
At the heart of the HD-PLC module is a standard MegaChips EVK running the HD-PLC and LON stacks. To this, Gesytec has connected an ARM Cortex M0-based CPU board to provide additional I/O and run the application-specific code. This simple partitioning enables you to run your application without interfering with device communications. It also gives you the flexibility to choose the right CPU for your application.
The module includes a license for Gesytec’s Easylon IP, a full-featured LON protocol stack that makes integration into LonWorks environments a breeze. Thanks to the proven EPOS LON stack, the HD-PLC LON Platform acts as a standard LON device that can be managed by the LON tools you already know and love, including LonMaker for Windows and other third-party software.
Additional infrastructure devices, such as phase couplers and range extenders, are available as off-the-shelf products from Gesytec and other vendors to facilitate the design of advanced HD-PLC smart building networks.
Ready to Upgrade to HD-PLC?
Order your evaluation kit today—and discover how easy it is to build smarter, faster, more robust smart building networks with MegaChips.
Michael Navid
VP, Marketing and Business Development, MegaChips
Michael is an accomplished business executive who has spent the last 15 years working to advance the communications technologies needed to build a smarter planet. The original founder of the G3-PLC Alliance, he was a key contributor in the evolution of G3-PLC as the premier communications technology for smart grids. Today, Michael is applying his experience and energy to bringing the benefits of HD-PLC to smart cities and smart buildings. When he’s not driving technology transformation, you’ll likely find him in one of his vintage cars heading down Pacific Coast Highway in Southern California.
Connect with him on LinkedIn.
"Hi bravo,"
Hi @TasTroy77, I'd be interested to know a little more about why you're ruling it out completely.
In the information I posted yesterday, it's clear that Mercedes are taking incremental steps with the introduction of the new operating system. It was reported that "From 2022 to 2023, Mercedes-Benz will be equipped with a lightweight version of the MB.OS operating system on the next generation of new E-class models, before the full version of the MB.OS operating system is launched in 2024". Just because they refer to it as "lightweight" does not necessarily mean it won't include Akida IMO.
On the contrary, I think the signs are all pointing to the inclusion of Akida, particularly when there are references made about advanced voice control.
If we just look at the voice control feature in isolation, another example can be found in the EQS SUV where there are five different equipment packages available. One of the 5 packages is called the Interior Assistant and it includes "the latest iteration of voice control capable of recognizing gestures and it even knows which way the driver is looking thanks to an array of cameras."
The way I understand it, "the latest iteration of voice control" is the Vision EQXX's voice control system, which is five to ten times more efficient than conventional voice control, thanks to BrainChip's AKIDA.
Would love to hear your thoughts.
EQE with "advanced voice control"
View attachment 15384
EQS with "the latest iteration of voice control"
View attachment 15387
Mercedes EQE SUV: This is the first look into the cabin of the off-road electric 'E'
After the electric Mercedes-Benz EQS SUV, Stuttgart is preparing the premiere of its smaller sibling […] (www.nautiv.com)

Mercedes-Benz EQS SUV launches, starting at €110,658
First electric SUV from Mercedes-Benz is finally available for order in Europe, with deliveries to begin this December. (www.arenaev.com)