BRN Discussion Ongoing

chapman89

Founding Member
Sean at the end of the podcast regarding transformers-

“Stay tuned to us, we got some very interesting things coming out here and interesting comments in that space in the near future I promise” 🔥
 
  • Like
  • Fire
  • Love
Reactions: 90 users

ndefries

Regular
  • Like
  • Sad
Reactions: 6 users

Deleted member 118

Guest
People give this a thumbs up but it goes and creates work and the company isn't going to confirm this and it potentially creates issues with Teksun. Sometimes just let things be.
 
  • Haha
  • Like
Reactions: 16 users
People give this a thumbs up but it goes and creates work and the company isn't going to confirm this and it potentially creates issues with Teksun. Sometimes just let things be.
I get where you are coming from but this is now publicly listed information. Mistake by Teksun or not, the public are allowed to be privy to this and are not at fault for this information coming out.

Major information leaks from companies all the time. It’s just the way of the digital world. Brainchip have managed to keep a lot of things under wraps but occasionally there are going to be slip-ups.

I doubt there will be a need to officially announce anything as it states ‘partnered’ with said companies.
 
  • Like
  • Love
  • Fire
Reactions: 29 users

DK6161

Regular
People give this a thumbs up but it goes and creates work and the company isn't going to confirm this and it potentially creates issues with Teksun. Sometimes just let things be.
I see your point, let's see if Teksun will amend the info on their website.
 
  • Like
Reactions: 6 users

Deleted member 118

Guest
Sean at the end of the podcast regarding transformers-

“Stay tuned to us, we got some very interesting things coming out here and interesting comments in that space in the near future I promise” 🔥
Can’t wait as this was my favorite kids show growing up

 
  • Like
  • Haha
  • Sad
Reactions: 7 users

ndefries

Regular
  • Like
  • Haha
  • Fire
Reactions: 13 users
I get where you are coming from but this is now publicly listed information. Mistake by Teksun or not, the public are allowed to be privy to this and are not at fault for this information coming out.

Major information leaks from companies all the time. It’s just the way of the digital world. Brainchip have managed to keep a lot of things under wraps but occasionally there are going to be slip-ups.

I doubt there will be a need to officially announce anything as it states ‘partnered’ with said companies.
Leaving aside the other two companies, Teksun must, given the technology space it occupies, have a working relationship with Cisco somewhere in the 5G network space.

And for newer shareholders:


My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 32 users

JB49

Regular
What's everyone's thoughts on the Edge Impulse post? Is the post saying that we are only in the Object Detection product, or are we also in the Machine Health Monitoring device and NOWATCH?
 
  • Like
  • Thinking
Reactions: 2 users
Leaving aside the other two companies, Teksun must, given the technology space it occupies, have a working relationship with Cisco somewhere in the 5G network space.

And for newer shareholders:


My opinion only DYOR
FF

AKIDA BALLISTA
Fact Finder did you like the podcast??
 
  • Like
Reactions: 1 user

Steve10

Regular
Lucky went fishing yesterday & missed the bloodbath.

Heaps of great BRN links & news posted. Takes a while to catch up here. LOL

Looking at markets, US CPI came in at 6% as forecast, down from 6.4% YoY.

The US 2 year bond yield has dropped from 5% to 4.25%. This indicates 0.75% of rate cuts by the US Fed; however, the bond market could be wrong. The FedWatch Tool indicates a 20% probability of a rate pause next week & an 80% probability of a 0.25% rate rise.

The AU 2 year bond yield has dropped from 3.7% to 3.2%, indicating 0.5% of rate cuts. The RBA Futures chart indicates no more RBA rate hikes, and rate cuts in H1 2024.
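For anyone newer to bond maths, a rough sketch of the arithmetic behind that read (my simplification only; it assumes the 2 year yield tracks the average policy rate the market expects, and ignores term premia):

```python
# Back-of-envelope only: treat a drop in the 2-year yield as the amount of
# policy-rate cutting the market has priced in. Not a trading model.

def implied_policy_move(yield_before_pct: float, yield_after_pct: float) -> float:
    """Policy move (percentage points) implied by a shift in the 2y yield."""
    return yield_after_pct - yield_before_pct

us = implied_policy_move(5.00, 4.25)   # -0.75 => ~0.75% of Fed cuts priced in
au = implied_policy_move(3.70, 3.20)   # -0.50 => ~0.50% of RBA cuts priced in
print(f"US: {us:+.2f}%  AU: {au:+.2f}%")
```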



 
  • Like
  • Fire
  • Love
Reactions: 32 users

Deleted member 118

Guest
Lucky went fishing yesterday & missed the bloodbath.

Heaps of great BRN links & news posted. Takes a while to catch up here. LOL

Looking at markets, US CPI came in at 6% as forecast, down from 6.4% YoY.

The US 2 year bond yield has dropped from 5% to 4.25%. This indicates 0.75% of rate cuts by the US Fed; however, the bond market could be wrong. The FedWatch Tool indicates a 20% probability of a rate pause next week & an 80% probability of a 0.25% rate rise.

The AU 2 year bond yield has dropped from 3.7% to 3.2%, indicating 0.5% of rate cuts. The RBA Futures chart indicates no more RBA rate hikes, and rate cuts in H1 2024.



View attachment 32158
Good news as my wife fixed in a rate of 1.99% on her mortgage just over 2 years ago that ends at the end of 2024
 
  • Like
Reactions: 14 users

Steve10

Regular


Sean has some good mates. A 20-year ex-Nvidia veteran endorsing BRN's ecosystem approach.

Wonder when he will load up on some shares or will Nvidia load up soon?

Could the transformers news be Nvidia related?
 
  • Like
  • Love
  • Fire
Reactions: 36 users

Taproot

Regular
Awesome T, thanks for sharing.

Below is a link to the blog from the tweet; we are first on the list (y)


BLOG POST

Come Find Edge Impulse at Embedded World 2023​

EMBEDDED DEVICES
Mike Senese
8 March 2023


Each year, all the big names in embedded computing gather at Embedded World in Nuremberg, Germany to show off their latest innovations and developments, to meet with partners and customers, and to learn about new advancements in their fields. This year, Embedded World is happening from March 14–16, and Edge Impulse is excited to once again be participating with a range of activities.


First held in 2003, Embedded World is known as possibly the largest show in the world for the embedded industry. The exhibition focuses on products and services related to embedded systems, including hardware, software, and tools for developing and testing. The conference portion of the event features presentations and workshops from industry experts on a variety of topics, such as security, connectivity, and real-time operating systems. There’s a lot there for everyone.
With our machine learning toolkit that is ideally optimized for embedded applications, Edge Impulse and Embedded World are a perfect match. Here are some of the different places you will be able to find us and what we’ll be getting up to in each spot.

Edge Impulse Booth
Hall 2, Booth 2-238
This year we will be hosting our own space in the TinyML Pavilion. Our booth will have a demo from BrainChip, showing off our FOMO visual object-detection algorithm running on the BrainChip Akida AKD1000, featuring their neuromorphic IP.
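For readers new to FOMO: instead of bounding boxes it predicts a coarse per-cell probability grid and reports object centroids. A minimal sketch of that idea, where the grid size, threshold, and output format are my illustrative assumptions rather than Edge Impulse's actual API:

```python
import numpy as np

# Illustrative only: FOMO-style models emit a coarse per-class probability
# grid (e.g. 12x12 cells for a 96x96 input) rather than boxes. Grid shape
# and threshold here are assumptions, not Edge Impulse's real output.

def fomo_centroids(prob_grid: np.ndarray, input_size: int = 96,
                   threshold: float = 0.5):
    """Map grid cells above threshold to object centroids in pixels."""
    cells_y, cells_x = prob_grid.shape
    cell_h, cell_w = input_size / cells_y, input_size / cells_x
    ys, xs = np.where(prob_grid > threshold)
    return [((x + 0.5) * cell_w, (y + 0.5) * cell_h, float(prob_grid[y, x]))
            for y, x in zip(ys, xs)]

grid = np.zeros((12, 12))
grid[3, 7] = 0.9                      # one synthetic detection
print(fomo_centroids(grid))           # [(60.0, 28.0, 0.9)]
```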
Also at the booth: Meet BrickML, the first product based on the Edge Impulse “Industrial Monitoring” reference design, focused on providing machine learning processing for industrial/machine monitoring applications. Built in collaboration with Reloc and Zalmotek, BrickML can be used to track numerous aspects of industrial machinery performance via its multitude of embedded sensors. We’ll be showing it in a motor-monitoring demonstration. BrickML is fully integrated into the Edge Impulse platform, which makes everything from data logging from the device to ML inference model deployment onto the device a real snap. (Our Industrial Monitoring reference design includes hardware and software source code to rapidly design your own product, available for Edge Impulse enterprise customers.)

We’ll additionally be showing off devices from companies we work with, including Oura, the health-monitoring wearable that is discreetly embedded in a ring you wear on your finger, and NOWATCH, a wrist-based wearable that tracks your stress levels and mental well-being.

Texas Instruments
Hall 3A, Booth 3A-215
In the TI booth you’ll find our Edge Impulse/Texas Instruments demo. This will show TI’s YOLOX-nano-lite model. The model was trained on a Kaggle dataset to detect weeds and crops. The dataset was loaded to Edge Impulse and the YOLOX model was trained via the “Bring Your Own Model” extensions to Edge Impulse Studio. The trained model was then deployed to run on the TI Deep Learning framework.
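As a hypothetical sketch of what that "Bring Your Own Model" flow can look like from the Edge Impulse Python SDK side (the package, call names, and device string below are my assumptions from memory, not anything stated in the post; check the SDK docs before relying on them):

```python
# Hypothetical sketch of the BYOM flow described above, using the Edge
# Impulse Python SDK. Package/call names and the device string are
# assumptions, and the ONNX file is a stand-in for the weeds-vs-crops
# YOLOX-nano-lite model.
import edgeimpulse as ei

ei.API_KEY = "ei_xxxxxxxx"   # project API key (placeholder)

# Profile the externally trained model against a candidate edge target to
# estimate RAM/flash/latency before committing to hardware.
profile = ei.model.profile(model="yolox_nano_weeds.onnx",
                           device="cortex-m7-216mhz")  # assumed device id
print(profile.summary())

# Deployment to the TI Deep Learning (TIDL) runtime happens via the TI
# tooling after export; ei.model.deploy(...) packages the model, but its
# exact arguments vary by target, so they are omitted here.
```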

Advantech
Hall 3, Booth 3-339
Scailable will be displaying their Edge Impulse FOMO-driven object detection implementation at the Advantech booth. It uses the Advantech ICAM camera to distinguish small washers, screws, and other items on several different trays. They’ll be demonstrating different trays and different models for the demo, and showing how to train new models at the booth.

AVSystem
Demo at the Zephyr booth: Hall 4, Booth 4-170
AVSystem’s Coiote is a LwM2M-based IoT device-management platform, providing support for constrained IoT devices at scale. It integrates with a tinyML-based vibration sensor and can detect and report anomalies in vibrations. This demo is based on the Nordic Thingy:91, which runs the Zephyr OS, and uses the Edge Impulse platform.
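As a purely generic illustration of the kind of check such a tinyML vibration sensor performs (this is not AVSystem's or Edge Impulse's actual algorithm; the window size, baseline, and threshold are invented):

```python
import numpy as np

def anomaly_score(window: np.ndarray, baseline_rms: float) -> float:
    """Ratio of this window's RMS vibration energy to a learned baseline."""
    rms = float(np.sqrt(np.mean(window ** 2)))
    return rms / baseline_rms

# Fake 256-sample accelerometer window; on the Thingy:91 this would come
# from the on-board IMU via Zephyr sensor drivers.
window = np.random.normal(0.0, 0.02, size=256)
score = anomaly_score(window, baseline_rms=0.02)
if score > 1.5:   # arbitrary alert threshold
    print(f"report anomaly to Coiote, score={score:.2f}")
else:
    print(f"normal, score={score:.2f}")
```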

Arduino
Hall 2, Booth 2-238
Check out the “vineyard pest monitoring” vision demo, running on the Arduino Nicla Vision and MKR WAN 1310, built by Zalmotek and using Edge Impulse for machine learning.


Avnet
Hall 1, Booth 1-510
Come meet the new RASynBoard from Avnet, built in collaboration with Renesas, Syntiant, TDK, and Edge Impulse. “RASynBoard targets industrial and consumer applications needing battery powered, always-on, smart sensing capabilities, such as sound recognition, command recognition, vibration sensing and environmental sensing. Its small (30mm x 25mm) size makes it ideal as an add-on sensor module for existing products, or its design can be customized and integrated in a larger system for enhanced sensing needs.”
You'll also be able to find RASynBoard at the Abacus booth (3A-125) and the Silica booth (3A-111); the Silica booth will additionally show the Avnet RZBoard V2L, another excellent piece of hardware for which Edge Impulse is proud to offer support.

Alif
Hall 4, Booth 4-544
Alif will also be bringing an Edge Impulse-powered demo to the show. It is viewable in their private conference room by appointment; contact kirtana@edgeimpulse.com to set up a meeting.

Synaptics panel, featuring Edge Impulse
Tuesday, 3/14 @ 3PM (local time)
Hall 1, Booth 500
Edge Impulse co-founder/CEO Zach Shelby will be a participant in the “Rapid Development of AI Applications on the Katana SoC” panel, brought to you by one of our partner companies, Synaptics, and moderated by Rich Nass from Embedded Computing Design.
Come find us!
In addition to these locations and scheduled events, we’ll have numerous staff members from Edge Impulse on site and ready to answer any questions you may have about our tools and use cases. Be sure to stop by to say hi.
(And if you can’t make it in person, you can always drop us a note: hello@edgeimpulse.com)
Wow, all happening this morning. @TechGirl, from your post:
Advantech
Hall 3, Booth 3-339
Scailable will be displaying their Edge Impulse FOMO-driven object detection implementation at the Advantech booth. It uses the Advantech ICAM camera to distinguish small washers, screws, and other items on several different trays. They’ll be demonstrating different trays and different models for the demo, and showing how to train new models at the booth.


Edge AI in industry​

Artificial Intelligence in its essence allows computers to learn advanced tasks based on examples; provide a computer with sufficient examples of a product and, over time, the computer can learn to tell the good from the bad. This used to be an engineering challenge only available in highly constrained and highly costly proofs-of-concept. However, in recent years AI hardware, AI model training, and AI deployment have advanced to a level where robust AI solutions can be created effortlessly and rapidly. And, AI can be deployed securely to edge devices: by moving an AI model close to the data, latency is reduced, network costs are mitigated, and data is as secure as it can be.

During Embedded World we will demonstrate how Edge Impulse, Advantech, and Scailable come together to allow for the effortless creation of Edge AI solutions in industry. Use cases include:

  • Classify (varying) SKUs passing by on a conveyor belt; use classification for accurate product counting and measurement of change-over times.
  • Segment complex images to check product quality; we have supported cases ranging from bark detection on trees for the paper industry to detecting anomalies in the caps of vaccine bottles.
  • Asset and parcel tracking; using advanced QR and bar-code recognition combined with AI to inspect package quality.
Creating the applications above is easier than ever before. However, most applications demand iteration: off-the-shelf applications often need to be (re-)trained to satisfy the use-case at hand. The power of the combination of Edge Impulse (effortless AI model training and labeling), Scailable (full AI pipeline deployment and management to edge devices), and Advantech (sturdy industrial hardware optimized for AI in industry) is to enable an effortless AI lifecycle to create highly robust and accurate AI solutions for your use-case.

The Edge AI lifecycle​

Creating AI used to be hard, almost impossible. Hundreds of thousands of examples were necessary to train a “supercomputer” to come close to the desired accuracy. However, over the last decades technology has developed rapidly. All throughout the Edge AI lifecycle the process has now become feasible:

  1. Annotation of training data: The Edge Impulse platform makes the handling of training data as easy as it can be, with simple visual annotation of training images supported by highly advanced active-learning methods to make labelling accurate and fast.
  2. Model training and testing: The Edge Impulse platform allows for extremely simple model training and evaluation. Using the Edge Impulse FOMO model architecture allows you to create (through transfer learning) highly accurate segmentation models while only requiring a small number of example images. And you can immediately assess how well your model works on your test data.
  3. Deployment target selection: Running your AI model—or more accurately, the AI pipeline from input image to output signal—requires robust and specialized hardware. Advantech offers AI ready hardware such as the Advantech ICAM-500; a single integrated industrial camera with sufficient power to run advanced segmentation models.
  4. Deployment and management: Preparing your model pipeline and transferring your trained model to your target device used to be a challenging embedded engineering task. Scailable provides highly optimized, remotely configurable pipelines with modular AI deployment capability. Simply purchase the ICAM with the Scailable AI manager pre-installed and there is no need for any on-device engineering.
  5. Iterate and improve: The faster you can iterate, the faster you can learn, and the more value you will bring. The Scailable AI manager allows you to collect new training images whenever models are uncertain, and, through the integration of your Edge Impulse account with the Scailable AI manager, you can easily retrain your model and re-deploy (a toy sketch of this collect-when-uncertain loop follows below).
We will provide a live demonstration of the ease by which one can train and re-train AI models, deploy complete pipelines without writing code, and create robust AI solutions in industry.
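And, as flagged in step 5 above, a toy sketch of the collect-when-uncertain idea (the confidence band and frame handling are invented for illustration, not Scailable's actual mechanism):

```python
# Sketch of the "iterate and improve" step: keep frames the model is
# unsure about so they can be labelled and fed back into training.
UNCERTAIN_LOW, UNCERTAIN_HIGH = 0.35, 0.65   # assumed "model unsure" band

def maybe_keep_for_retraining(frame, confidence: float, outbox: list) -> None:
    """Queue frames with borderline confidence for later labelling/upload."""
    if UNCERTAIN_LOW < confidence < UNCERTAIN_HIGH:
        outbox.append(frame)

outbox = []
for frame, conf in [("img_001", 0.92), ("img_002", 0.48), ("img_003", 0.12)]:
    maybe_keep_for_retraining(frame, conf, outbox)
print(outbox)   # ['img_002'] -> upload for labelling, retrain, redeploy
```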

 
  • Like
  • Fire
Reactions: 31 users

dippY22

Regular
Some random bloke going around embedded world today, visiting various stalls


I watched this and ......

Well, .... for Brainchip stockholders such as myself, the Synaptics interview was unsettling. Starting at the 3 hour 52 minute mark, it is remarkably like what I imagine a Brainchip interview might sound like. Certainly I must be missing something in my understanding of the 3-year advantage Brainchip says they have over the competition, which I imagine Synaptics would be considered to be. Synaptics sure looks like a formidable competitor.

I would love to see an interview with Anil Mankar commenting on a review of the Synaptics interview minute by minute.

Anyone else care to comment on the Synaptics segment? Regards, dippY
 
  • Like
  • Thinking
Reactions: 9 users

Deleted member 118

Guest
I watched this and ......

Well, .... for Brainchip stockholders such as myself, the Synaptics interview was unsettling. Starting at the 3 hour 52 minute mark, it is remarkably like what I imagine a Brainchip interview might sound like. Certainly I must be missing something in my understanding of the 3-year advantage Brainchip says they have over the competition, which I imagine Synaptics would be considered to be. Synaptics sure looks like a formidable competitor.

I would love to see an interview with Anil Mankar commenting on a review of the Synaptics interview minute by minute.

Anyone else care to comment on the Synaptics segment? Regards, dippY
Hopefully the bloke will have more videos coming out, maybe someone could ask him to visit the ones of importance to us.
 
  • Like
Reactions: 1 user

Deleted member 118

Guest
Which means neither Peter van der Made nor Citi Nominees can be lending to shorts, because their lent shares are sold to open the short position so they would drop off the shareholder register.

My opinion only DYOR
FF

AKIDA BALLISTA
Had me thinking: I wonder what the true percentage of shorts is? If we counted only the shares the shorters can actually get hold of, it must be a very high %.
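A toy illustration of the point (every number below is invented, DYOR): the short percentage looks very different measured against the borrowable float than against all shares on issue.

```python
# Invented numbers only: shorts as a share of total issue vs the float
# that lenders actually make available for borrowing.
shares_on_issue = 1_800_000_000
locked_up       = 1_100_000_000   # founders/long-term holders not lending
shorted         =    90_000_000

borrowable_float = shares_on_issue - locked_up
print(f"vs issue: {shorted / shares_on_issue:6.1%}")    # ~ 5.0%
print(f"vs float: {shorted / borrowable_float:6.1%}")   # ~12.9%
```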
 
  • Like
Reactions: 1 user
Fact Finder did you like the podcast??
Yes, but I have to take my wife to the dentist so will explain why a bit later.
Regards
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Haha
Reactions: 13 users

db1969oz

Regular
  • Haha
  • Like
  • Love
Reactions: 19 users

Steve10

Regular
I watched this and ......

Well, .... for Brainchip stockholders such as myself, the Synaptics interview was unsettling. Starting at the 3 hour 52 minute mark, it is remarkably like what I imagine a Brainchip interview might sound like. Certainly I must be missing something in my understanding of the 3-year advantage Brainchip says they have over the competition, which I imagine Synaptics would be considered to be. Synaptics sure looks like a formidable competitor.

I would love to see an interview with Anil Mankar commenting on a review of the Synaptics interview minute by minute.

Anyone else care to comment on the Synaptics segment? Regards, dippY

Synaptics is using DSP Group’s nNet Lite NN processor.

DBM10L

DSP Group’s DBM10L is an ultra-low-power, small-form-factor, cost-effective artificial intelligence (AI) and machine learning (ML) SoC based on a digital signal processor (DSP) and neural network (NN) engine, both optimized for voice and sensor processing. It is suitable for battery-operated devices such as smartphones, tablets, wearables, and hearables, including true wireless stereo (TWS) headsets, as well as smart home devices such as remote controls. The DBM10L can enable AI/ML, voice, and sensor fusion functions that include voice trigger (VT), voice authentication (VA), voice command (VC), noise reduction (NR), acoustic echo cancellation (AEC), sound event detection (SED), proximity and gesture detection, sensor data processing, and equalization.

The DBM10L’s NN engine comprises DSP Group’s nNet Lite NN processor, a standalone hardware engine that is designed to accelerate the execution of NN inferences. nNet Lite provides the DBM10L its ML capability and is optimized for maximum efficiency to ensure ultra-low power consumption for small- to medium-size NNs.

The DBM10L is supported by embedded memory, as well as serial and audio interfaces for communication with other devices in the system, such as an application processor (AP), codecs, microphones, and sensors.



DSP Group Unveils DBM10 Low-Power Edge AI/ML SoC with Dedicated Neural Network Inference Processor​

January 7 2021, 13:00
DSP Group announced the DBM10, a new low-power, cost-effective artificial intelligence (AI) and machine learning (ML) system-on-chip (SoC). This new open platform, with a cost- and power-optimized architecture, enables rapid development of AI and ML applications for mobile, wearables, hearables, and connected devices in general. It provides a complete platform in terms of voice and audio processing, without compromising the battery life of new designs, and allowing developers to implement their own differentiating algorithms.

DSP Group is a global provider of wireless and voice-processing chipset solutions with extensive experience in voice implementation and an increasing focus in advanced audio processing for personal audio with hearables (headphones, headsets, earbuds) and wearables (on-body electronics). Its new DBM10 SoC comprises a digital signal processor (DSP) and the company’s nNetLite neural network (NN) processor, both optimized for low-power voice and sensor processing in battery operated devices.

This dual-core architecture offers developers full flexibility in partitioning innovative algorithms between the DSP and NN processor, and enables fast time to market for integration of voice and sensing algorithms such as noise reduction, AEC, wake-word detection, voice activity detection and other ML models.

The DBM10 features an open platform approach with a comprehensive software framework. This allows developers to quickly get next-generation designs to market with their own algorithms, or with DSP Group’s comprehensive and proven suite of optimized algorithms for voice, sound event detection (SED), and sensor fusion, as required by applications ranging from true wireless stereo (TWS) headsets to smartphones, tablets, wearables, and connected devices.

"Edge applications for AI are many and diverse, but almost all require the ultimate in terms of low power, small form factor, cost effectiveness, and fast time-to-market, so we are very excited about what the DBM10 brings to current and new customers and partners," says Ofer Elyakim, CEO of DSP Group. "Our team has worked to make the absolute best use of available processing power and memory for low-power AI and ML at the edge — including developing our own patent-pending weight compression scheme —while also emphasizing ease of deployment. We look forward to seeing how creatively developers apply the DBM10 platform."

The DBM10 adds to DSP Group’s SmartVoice line of SoCs and algorithms that are deployed globally in devices ranging from smartphones and laptops/PCs, to set-top boxes, tablets, remote controls, and smart IoT devices for the home. In 2020, SmartVoice shipments reached the 100 millionth milestone, and the new low-power DBM10 is already supported by an established ecosystem of third-party algorithm providers. Some of these have already begun running their NN algorithms on the nNetLite NN processor at the heart of the DBM10 to achieve maximum performance at the lowest power consumption.



Working alongside a programmable low-power DSP, the nNetLite processor supports all standard deep NN (DNN) and ML frameworks and employs a comprehensive cross-platform toolchain for model migration and optimization. The SoC device is supplied in a highly compact form factor (~4 mm2), specified to support ultra-low-power inference at ~500 μW (typical) for voice NN algorithms, and is able to run the Hello Edge 30-word detection model @ 1 MHz (125 MHz available) as a reference. The DBM10 allows porting of large models (10s of megabytes) without significant accuracy loss using model optimization and compression.
www.dspg.com



Hisense Selects Synaptics’ DBM10L Processor For First AI-Enabled Always-On Voice Remote Control​

AMSTERDAM, The Netherlands, Sept. 09, 2022 – Synaptics® Incorporated (Nasdaq: SYNA) today announced that Hisense, a global leader in consumer electronics and home appliances, selected the DBM10L with its dedicated neural processing unit (NPU) to implement the first artificial intelligence (AI)-enabled always-on voice (AOV) remote control unit (RCU), the EFR3B86H. Hisense paired the DBM10L-equipped RCU with its state-of-the-art 65A9H 4K OLED TV, where Synaptics' high-performance edge-AI processing and low power are vital to ensure the ultimate AOV end-user experience.

“Hisense consistently stays ahead of the curve when it comes to enabling innovative and intuitive features,” said Venkat Kodavati, SVP and Chief Product Officer at Synaptics. “With end users’ increasing reliance upon voice and voice assistants such as Alexa, we are very excited to have worked with them to bring that same experience to TV remote controls. Our collaboration on a high-performance AOV implementation creates the opportunity for remotes to now become a more integral and critical user-engagement platform for the smart home.”

While a reliable and responsive AOV experience for remote controls is increasingly desirable, it is challenging to execute in battery-driven applications. “This is particularly true in noisy environments as more noise translates to more power consumption to prevent performance degradation,” said Shay Kamin Braun, Director of Product Marketing at Synaptics.

The DBM10L enables a superior AOV user experience that combines high performance with ultra-low power consumption, allowing devices to operate for extended periods using a single pair of AAA batteries. “Along with upcoming innovations such as biometrics for voice authentication for online purchases, AOV remote controls for TVs and other consumer devices can now provide greater convenience for users and higher attachment rates for equipment and service providers,” said Kamin Braun.

The DBM10L AOV solution
To solve the power consumption challenge while delivering the best performance for AOV applications, Synaptics built an ultra-low-power voice engine around its DBM10L system-on-chip (SoC), which combines the dedicated NPU with a low-power DSP. The solution comprises the DBM10L and proven algorithms for filtering, noise suppression, beamforming, wake word detection, and voice activity detection. Optimizations allow deep neural network (DNN)-based wake-word detection and other edge AI algorithms to run on the DBM10L's NPU, targeting high performance at ultra-low power with low latency, while different voice and audio processing algorithms run optimally on the integrated low-power DSP.
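A quick back-of-envelope, using my own numbers rather than Synaptics', on why inference in the ~500 µW class (the typical figure quoted for nNetLite earlier in this post) makes a single pair of AAA batteries plausible for always-on voice:

```python
# All numbers assumed for illustration: generic alkaline AAA capacity and
# the ~500 uW typical inference figure quoted above for the NN engine.
AAA_CAPACITY_MAH = 1000     # assumed capacity of one alkaline AAA cell
CELL_VOLTAGE_V   = 1.5      # nominal voltage per cell
N_CELLS          = 2        # "a single pair of AAA batteries"
DRAW_MW          = 0.5      # ~500 uW always-on inference (engine only)

energy_mwh = AAA_CAPACITY_MAH * CELL_VOLTAGE_V * N_CELLS   # 3000 mWh
hours = energy_mwh / DRAW_MW
print(f"~{hours:,.0f} h (~{hours / 24 / 30:.0f} months), ignoring the mic, "
      f"radio and system overhead")
```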

Availability
The EFR3B86H AOV remote control is shipping now with the Hisense TV model 65A9H. For more information on the DBM10L, visit the DBM10L webpage or contact your local Synaptics sales representative.

For more about the potential of AOV RCUs and how they are changing how we interact with home devices, see “Always-On Voice Makes Content Control Seamless and Intuitive”.

About Hisense
Founded in 1969, Hisense is one of the largest consumer electronics and home appliances companies in the world. Hisense offers a broad range of technology-driven products that are manufactured and distributed across the world, including smart TVs, smart phones, refrigerators, freezers, and air conditioners, among other products. Hisense has a workforce of over 70,000 worldwide, and its flat-panel TV market share in China has been No. 1 for 13 consecutive years. Currently, Hisense boasts several subsidiaries, with sales revenue reaching CNY 100.3 billion in 2016. For more, visit www.hisenseme.com.

 
  • Like
  • Fire
  • Thinking
Reactions: 17 users