BRN Discussion Ongoing

M_C

Founding Member
Last edited:
  • Haha
  • Like
  • Love
Reactions: 23 users

Diogenese

Top 20
Can someone please do a health check on @Diogenese??

He just reacted with fire to my post (attached). Please be careful when approaching him; he may be behaving wildly irrationally.

very concerned.

tyia
There was sposed to be a stake ...
 
  • Haha
  • Like
Reactions: 8 users

M_C

Founding Member
  • Haha
Reactions: 3 users

Deadpool

hyper-efficient Ai
Hello All, this is for dummies to answer, so I expect no responses :ROFLMAO:

Sunday's question: how many companies can currently state that they have a chip with on-chip learning, one-shot, few-shot and continuous learning, and learning on top of learning; with power consumption ranging from thousandths of a watt down to as low as millionths of a watt; that can run for approximately 6 months on 2 AAA batteries; doesn't require an internet connection, something called "a cloud", a VPN or 5G/6G; protects your security and privacy like no other; is fully commercialized now; has a brother in production now who is smarter, smaller and hoping to go to Mars one day; is available as IP blocks or as an NSoC for as little as $50 USD; can see, smell, taste, hear and feel; can outsmart a human through pure speed; was first developed in Belmont, Perth, Western Australia; and is nicknamed "The Brain"?

Have a great night.

Tech x(y)
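Before guessing, a rough back-of-envelope on the battery claim, assuming ordinary alkaline AAA cells of roughly 1,000 mAh at a nominal 1.5 V and a constant average draw (generic figures, not anything official):

```python
# Rough battery-life check for the "6 months on 2 AAA batteries" claim.
# Assumptions (not official figures): ~1000 mAh per alkaline AAA cell at a
# nominal 1.5 V, perfect conversion, constant average power draw.

AAA_CAPACITY_MAH = 1000          # typical alkaline AAA, assumed
CELLS = 2
NOMINAL_VOLTAGE = 1.5            # volts per cell, assumed

energy_wh = CELLS * AAA_CAPACITY_MAH / 1000 * NOMINAL_VOLTAGE  # watt-hours

for avg_power_w in (1e-3, 1e-4, 1e-6):   # a milliwatt down to a microwatt
    hours = energy_wh / avg_power_w
    months = hours / (24 * 30)
    print(f"{avg_power_w:.0e} W average draw -> ~{months:,.0f} months")
```

At a constant milliwatt that works out to a bit over four months, so the six-month figure implies an average draw somewhere below a milliwatt.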
Loihi, Am I right?



Season 6 Knowledge GIF by Friends
 
  • Haha
  • Like
Reactions: 10 users

charles2

Regular
Do I have this right... a $50 chip equivalent or superior to Nvidia's $30,000 GPU?

If so, BrainChip has plenty of room to raise the price, or NVDA should be pursuing a buyout.

Or am I delusional? Ill-informed?
 
Last edited:
  • Like
Reactions: 8 users

Bravo

If ARM was an arm, BRN would be its biceps💪!

No, I don't think you're delusional at all, Charlie! Have you listened to the latest podcast where PVDM said that AKIDA 10, which is due in 2030, will be the "Holy Grail" of general artificial intelligence? Buy a copy of Peter's book and listen to older podcasts, interviews and info sessions and you'll find he's always been spot on with his forecasting of the timelines for the advancement of this amazing technology. I doubt the top 20 would ever entertain any thoughts of a buyout, let alone all of us long-haulers here who are absolutely soldered on to our convictions of the future success of BrainChip. This is just my opinion, but it's one that I'm prepared to stand by. Hope this helps. 👍
 
  • Like
  • Love
  • Fire
Reactions: 34 users

stockduck

Regular
...got my interest today, sorry if mentioned before. What kind of IP is in it?


"Stretchy computing device feels like skin—but analyzes health data with brain-mimicking artificial intelligence"​

 
  • Like
  • Fire
  • Wow
Reactions: 7 users

Tothemoon24

Top 20
Morning chippers.
Could Akida be any part of this technology, now or in the future? I have no idea.





iPronics delivers first reconfigurable photonic microchips
10 February 2023


Photonic chips use up to 10 times less power and can be 20 times faster than electronic chips, while processing far more information
Valencia-based start-up iPronics, the developer of a plug-and-play, programmable photonic microchip, has announced delivery of its first shipments to several companies in distinct sectors.

The chips have been dispatched to customers in the US and Europe, including a multinational telecommunications and electronics company, a European-based optical networking firm and a large US technology company.
The announcement comes seven months after the firm revealed it had raised €3.7m in funding last year to bring its photonic processors to market. Its goal is to make computational photonics commercially affordable and encourage its adoption across all layers of industry.

What are photonic chips?

Emerging technology trends in autonomous vehicles and lidar, 5G signal processing, deep learning and AI, cyber security, DNA sequencing, and drug discovery require much faster, more flexible, power-efficient computation. While advanced electronic chips (e.g. GPUs, TPUs or FPGAs) are continually advancing in capability, they may struggle to keep up with increasing performance requirements, leading to today’s hardware forming a bottleneck.
While in an electronic chip, signals are delivered via electron flux passing through electrical components such as resistors, inductors, transistors and capacitors, in a photonic chip signals are delivered via photons passing – at the speed of light – through optical components such as waveguides, polarisers, and phase shifters. Photonic chips have the potential to deliver lower latency, lower power consumption (photons/light consume less energy than electrons), higher bandwidth and higher density than their electronic counterparts.

The SmartLight Processor

In iPronics' case, the firm has developed a general-purpose photonic processor capable of programming high-speed light signals on-chip with unprecedented flexibility. Dubbed the 'SmartLight Processor', the chip uses up to 10 times less power and can be 20 times faster than electronic chips while processing far more information.


The new processor enables the reconfiguration of a common photonic hardware platform through user-friendly software. According to the firm, this is the first-in-class fully programmable photonic chip, as previous photonic integrated circuits have been fixed-function or application-specific in operation.
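To make "programming light in software" concrete, below is a minimal numerical sketch, not iPronics' design, of the standard building block behind programmable photonics: a Mach-Zehnder interferometer made from two 50:50 couplers and two phase shifters, where the phase settings decide how optical power is routed between the two outputs.

```python
import numpy as np

# Minimal sketch of a programmable photonic building block (not iPronics'
# actual architecture): a Mach-Zehnder interferometer (MZI) built from two
# 50:50 couplers and two phase shifters. The phase settings (theta, phi) are
# the "software" that reconfigures how light is routed.

COUPLER = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)   # ideal 50:50 coupler

def phase_shifter(angle):
    """Phase shift applied to the top waveguide only."""
    return np.diag([np.exp(1j * angle), 1.0])

def mzi(theta, phi):
    """Signal path: input phase phi, coupler, internal phase theta, coupler."""
    return COUPLER @ phase_shifter(theta) @ COUPLER @ phase_shifter(phi)

for theta in (0.0, np.pi / 2, np.pi):
    T = mzi(theta, phi=0.0)
    assert np.allclose(T.conj().T @ T, np.eye(2))      # lossless => unitary
    power_out = np.abs(T @ np.array([1.0, 0.0])) ** 2  # light fed into port 0
    print(f"theta={theta:.2f}: output powers = {power_out.round(3)}")
```

Meshes of such 2x2 stages are what let a single fixed chip realise many different optical functions purely by changing phase settings.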

iPronics has now shipped its SmartLight Processor to firms in three different industry sectors in the EU and US, including telecoms
It represents a cost-effective solution that enables the same hardware to be applied to a wide range of applications throughout emerging markets and technologies that have a voracious appetite for computational power. Examples include 5G and 6G communication, data centres, machine learning, artificial intelligence, autonomous driving, quantum computing, and the internet of things (IoT).
The programmable nature of this technology unlocks novel commercial applications as it allows the generation of optical functionalities in software, which critically reduces time to market and total costs for system design, prototyping, and production.

Rapid time to market

In addition to its programmable advantages, the SmartLight Processor offers significant time-to-market and cost benefits, according to iPronics.
“Compared to custom photonic ICs, the development time can be cut from 18 months down to a couple of weeks,” the firm said in its shipment announcement. “This lowers the total cost and mitigates risk for our clients while delivering on the promise of photonic processing: lower power consumption, lower latency, and faster computation. The SmartLight Processor enables innovative tech companies to continue their cutting-edge silicon photonics work on several fronts such as high-speed optical communications, RF photonics, and neuromorphic computing.”
According to the firm, completing its first three product shipments in three different sectors in the EU and US makes it the first photonic start-up to move beyond the R&D stage and develop a product for mass usage.
Mark Halfman, iPronics CEO, added: “For a company that was founded just prior to the pandemic, it is almost unprecedented to move so swiftly from development to shipping our first commercial orders supporting a variety of applications. Today's announcement is a testament to the vision of the company’s founders and the dedication of the entire team. This is both a watershed moment for the photonics industry and an exciting time for the company.”






MEDTECH

Apple's long-desired glucose tracking is reportedly at proof-of-concept stage: Bloomberg​

By Andrea Park | Feb 23, 2023 03:15pm
A wrist with an Apple Watch
Apple’s approach to glucose monitoring is said to combine silicon photonics and optical absorption spectroscopy: It beams specific wavelengths of light into the interstitial fluid below the skin, and all light not absorbed by glucose bounces back to the sensor. From there, an algorithm calculates the wearer's blood sugar levels. (Anna Hoychuk/Shutterstock)
For much of the last decade, rumors swirling around Silicon Valley have suggested that Apple is aiming to one day bring completely noninvasive glucose tracking to its eponymous smartwatch.

And that day may be closer than ever, according to the latest of those accounts. Bloomberg reported Wednesday that the tech giant has reached the proof-of-concept stage for sensor technology that could ultimately allow Apple Watch wearers with diabetes to monitor their blood sugar levels around the clock without requiring any skin-pricking for calibration or analysis.

Bloomberg’s report cited a handful of unnamed sources familiar with the highly secretive project—known as E5 within the company. After more than a decade’s worth of work, the sources said E5 has recently hit “major milestones” that have made Apple optimistic about the technology’s commercial feasibility.
Apple declined to comment on Bloomberg's report, and the company did not immediately respond to Fierce Medtech's request for comment.

The E5 initiative reportedly dates back to 2010, when then-CEO Steve Jobs led the quiet acquisition of RareLight, a startup developing a new method for noninvasive glucose monitoring. In the years since, Apple became one of two primary customers of Rockley Photonics, the maker of biosensor technology that includes noninvasive blood sugar tracking—a development that further fed the rumors of Apple’s eventual entry into the diabetes management market.

The Rockley partnership has since ended, per Bloomberg—and Rockley has since filed for bankruptcy—with Apple shifting its chip-making needs to Taiwan Semiconductor Manufacturing Co.
Though it reportedly previously operated under the guise of a health tech startup that was seemingly completely separate from Apple and dubbed Avolonte Health, the E5 project is now housed within Apple’s Exploratory Design Group, or XDG, which serves as an incubator for a handful of so-called moonshot initiatives.
Apple’s approach is said to combine silicon photonics and optical absorption spectroscopy: It beams specific wavelengths of light into the interstitial fluid below the skin, and all light not absorbed by glucose bounces back to the sensor. From there, an algorithm calculates the concentration of glucose in the blood.
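That description maps onto textbook absorption spectroscopy: the Beer-Lambert law, A = ε·c·l, relates the attenuation measured at a glucose-sensitive wavelength to the glucose concentration. The sketch below only illustrates that arithmetic; the absorptivity and path-length values are placeholders, not Apple's calibration.

```python
import math

# Illustrative Beer-Lambert calculation, NOT Apple's algorithm: absorbance
# A = epsilon * c * l relates attenuation of light at a glucose-sensitive
# wavelength to glucose concentration c. The optical constants below are
# placeholder values chosen only to show the arithmetic.

EPSILON = 0.10   # assumed effective absorptivity, L / (mmol * cm)
PATH_CM = 0.05   # assumed effective optical path through interstitial fluid, cm

def glucose_mmol_per_l(intensity_in, intensity_back):
    """Estimate concentration from emitted vs. returned light intensity."""
    absorbance = math.log10(intensity_in / intensity_back)   # A = log10(I0 / I)
    return absorbance / (EPSILON * PATH_CM)                  # c = A / (eps * l)

# Example: about 6.7% of the emitted light at this wavelength is absorbed.
print(f"{glucose_mmol_per_l(1.0, 0.933):.1f} mmol/L")   # ~6.0 mmol/L
```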
E5 has already cost the company hundreds of millions of dollars and is being worked on by hundreds of engineers, the sources told Bloomberg. If the noninvasive technology is ultimately cleared by the FDA for addition to the Apple Watch, it would be the third such nod for the wearable, joining its built-in ECG and the “AFib History” feature—a sharp reversal of CEO Tim Cook’s once-professed belief that the Apple Watch would never become an FDA-regulated device.

For now, though the approach has reportedly cleared preliminary trials comparing its performance to glucose trackers that require blood samples, the hardware is currently too big to be embedded into an Apple Watch. Once sufficiently shrunk down, however—a process that will likely take several more years—it could potentially be used not only to monitor diagnosed cases of diabetes but also to help detect early signs of the condition.

 
Last edited:
  • Like
  • Thinking
  • Fire
Reactions: 14 users

Ian

Founding Member
 
  • Like
  • Wow
Reactions: 11 users

M_C

Founding Member

“Not only does Sony’s Altair ALT1350 chipset enable hybrid connectivity options, but its advanced architecture sets a new bar in ultra-low power consumption,” said Dima Feldman, VP of Product Management and Marketing, Sony Semiconductor Israel. “Its optimized standby mode (eDRX) reduces power consumption by 80% when compared to the current generation, with battery-operated devices benefitting from 4 times longer battery life for typical use cases. Integrated into Sierra Wireless’ latest 5G HL7900 module,” continued Mr. Feldman, “it answers the evolving and rapidly expanding needs of the cellular LPWA IoT market, enabling faster development, additional functionality, new use cases and reduced costs for customers. We’re thrilled to be continuing our partnership with Sierra Wireless to enable the next-generation of innovative LPWA solutions.”
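As a rough sanity check of how an 80% cut in standby (eDRX) power can translate into several times the battery life for a device that idles most of the time, here is a simple duty-cycle calculation; the power and activity figures are illustrative assumptions, not Sony's measurements.

```python
# Illustrative duty-cycle arithmetic, not Sony's measurements: if a cellular
# IoT device spends almost all of its time in standby, cutting standby power
# by 80% multiplies battery life by a factor approaching 5; the occasional
# active transmissions pull that back toward the ~4x quoted for typical use.

ACTIVE_POWER_MW = 100.0    # assumed power while transmitting
STANDBY_POWER_MW = 0.5     # assumed previous-generation standby (eDRX) power
ACTIVE_FRACTION = 0.0002   # assumed: active ~17 seconds per day

def avg_power(standby_mw):
    return ACTIVE_FRACTION * ACTIVE_POWER_MW + (1 - ACTIVE_FRACTION) * standby_mw

old = avg_power(STANDBY_POWER_MW)
new = avg_power(STANDBY_POWER_MW * 0.2)        # 80% lower standby power
print(f"battery-life multiplier ~ {old / new:.1f}x")   # ~4.3x with these numbers
```

A heavier active-mode share pulls the multiplier down, which is consistent with the "4 times longer battery life for typical use cases" wording.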

 
  • Like
  • Fire
Reactions: 12 users

Makeme 2020

Regular
MOBILE PHONE EXPO BARCELONA STARTED TODAY


MWC Mobile World Congress 2023 Barcelona​

MWC Mobile World Congress 2023 is the largest and most significant trade fair for the telecom sector, bringing together the whole mobile technology industry on one global platform. MWC Barcelona 2023 provides telecom professionals an excellent opportunity to network with industry experts and showcase cutting-edge technology and innovations globally. The MWC exhibition will take place from 27 February to 2 March 2023 in Barcelona.

Date: 27 February – 2 March 2023
Location: Barcelona, Spain
Event type: Trade Show, Consumer Show or Expo
Event category: Business & Professional

 
  • Like
  • Fire
Reactions: 11 users
Apple has a secret team working to bring noninvasive glucose monitoring to its smartwatch, but that’s not all it’s pursuing. Also: The company’s upcoming headset likely won’t need an iPhone and follow-up models are already in the works, plus schedule changes are in store for Apple retail employees.

Last week in Power On: Apple’s upcoming mixed-reality headset and WWDC are a perfect match.



The Starters​


The Apple Park campus.
Photographer: Sam Hall/Bloomberg
Apple Inc. is famous for keeping its future products under wraps, but even by those standards the company’s Exploratory Design Group is secretive.

As I first revealed last week, this covert team is the brains behind future no-prick glucose tracking technology for the Apple Watch. And that’s not all it’s working on. The group is akin to X, Alphabet Inc.’s “moonshot factory,” which helped develop Waymo self-driving car technology, Google Glass and Loon internet balloons.

Though the Apple team — better known inside the company as XDG — is primarily focused on the glucose work, there are several other projects underway and it’s made key contributions to existing Apple devices.

The team originated several years ago and was long led by Bill Athas, one of the few people to have the title of engineering fellow at Apple, until he passed away unexpectedly at the end of last year. Athas was seen by the late co-founder Steve Jobs and current Chief Executive Officer Tim Cook as one of the brightest engineering minds at the company.

The XDG team sits within Apple’s Hardware Technologies group, led by Senior Vice President Johny Srouji, and works at a building known as Tantau 9 right outside of the Apple Park spaceship-shaped ring.

The team is now run on a day-to-day basis by a number of Athas lieutenants, including top Apple engineers and scientists Jeff Koller, Dave Simon, Heather Sullens, Bryan Raines and Jared Zerbe. Koller, Simon and Raines are involved in the glucose project, while Sullens and Zerbe manage other groups within the larger team.


Apple’s Johny Srouji, who oversees the XDG team.
Source: Apple
The Exploratory Design Group operates as a startup within Apple and is made up of only a few hundred people, mostly engineers and academic types. That’s a far cry from the many hundreds of people in the Special Projects Group, which is focused on Apple’s self-driving car, or the more than a thousand engineers in Apple’s Technology Development Group, the team building the mixed-reality headset.

Beyond the glucose work, XDG is working on next-generation display technology, artificial intelligence and features for AR/VR headsets that help people with eye diseases. The team originally came together under Athas to work on low-power processor technologies and next-generation batteries for smartphones, efforts that continue.

Like Alphabet’s moonshot team — and those at other Silicon Valley companies — the XDG staff is given vast financial resources and headroom to explore countless ideas. The members have a different remit than the engineering teams churning out new iPhones, iPads and Apple Watches annually. Instead, they’re instructed to work on projects until they can determine whether or not an idea is feasible.

The unit is even more secretive than Alphabet’s X but it’s not a pie-in-the-sky operation. It has already had breakthroughs that made their way into Apple products. Many of the chip and battery technologies developed by XDG have been shipping for years in iPhones, iPads and Macs.

While the team operates as a startup, it is still compartmentalized like any other Apple division: People working on one project within XDG aren’t allowed to communicate about their work with other members of XDG that are assigned to different projects.

But the team’s members are organized by skill sets rather than individual projects. That means that one engineer could be working on several initiatives that fit their skills, rather than on one specific product.

The Bench​


An HTC headset at the Apple WWDC conference in 2017.
Photographer: David Paul Morris/Bloomberg
The Apple headset probably won’t require an iPhone, and other new models are in development. The company’s first mixed-reality headset, unlike the original Apple Watch, probably won’t require an iPhone for setup or use. I’m told that the latest test versions of the device and its onboard xrOS operating system can be set up without an iPhone and can download a user’s content and iCloud data directly from the cloud.

You will, however, be able to transfer your data from an iPhone or iPad, just as you can today when setting up a new device. As I’ve written previously, the headset doesn’t have a remote control but instead is operated by a user’s eyes and hands.

A key feature for text input — in-air typing — is available on the latest internal prototypes, I’m told. But it’s been finicky in testing. So if you get the first headset, you still may want to pair an iPhone to use its touch-screen keyboard. The hope within Apple is to make rapid improvements after the device is released. The company expects its headset to follow the same path as the original Apple Watch in that respect.

Apple is currently planning to unveil its first headset, which may be dubbed the Reality Pro, at WWDC in June. The product would then ship toward the end of 2023 at the earliest. But there are already follow-ups in the works, too.

As I wrote in January, Apple is planning to launch a cheaper headset with lower-end display and processor components at the end of 2024 or in 2025. That will help address users who don’t want to pay around $3,000 for the high-end model. Based on trademark filings, the cheaper version may be dubbed the Reality One. And, unsurprisingly, there’s already a second-generation version of the Reality Pro underway.

I’m told the focus of the second Pro headset is performance. While the first model will have an M2 chip — plus a secondary chip for AR and VR processing — it’s not powerful enough to output graphics at a level Apple would ideally like. For instance, FaceTime will only support realistic VR representations of two people at a time, not everyone in a conference call.

Apple’s first headset was initially planned to be even more powerful, featuring a separate hub with additional processing power that could be beamed to the device across a home wirelessly. But former Apple design chief Jony Ive nixed that idea. Now the company is working to add a more powerful processor (perhaps a variant of the M3 or M4) for the second model, helping bridge that gap.


An Apple store employee helps a customer.
Photographer: Alejandro Cegarra/Bloomberg
Apple rolls out scheduling changes to all US and Canada retail employees. Over the past year, some Apple retail employees have voiced concerns about pay, benefits and working hours. To address some of those issues, Apple has revamped its scheduling process, including the number of hours it requires for both full-time and part-time workers. I first wrote about those changes last June when they were rolled out to some stores as a pilot.

Beginning on April 29, Apple will bring the new policies to all of its roughly 300 stores in the US and Canada, according to a recent memo. The main changes:

  • A maximum of five consecutive workdays, down from the prior limit of six.
  • More weekend time off for part-time employees.
  • A consistent weekend workday or day off for full-time employees.
The caveat is that these new rules could go out the window during peak shopping periods (a new iPhone launch, for instance) or for all-hands meetings. Another change is that time off must be requested at least four weeks in advance, a slight adjustment from a prior requirement of about three weeks in advance.

Still, some part-time Apple retail employees are concerned about another change that they say is being introduced by managers at some stores: a requirement to work on weekends. They fear that they’ll be terminated from the company if they don’t agree to work on those days, despite not having to do so when they joined Apple.

Other part-time employees also have said that their managers are asking them to work at least a few extra hours per week than they were previously required to.

The Schedule​


The Steve Jobs Theater, where Apple held its shareholder meetings before the pandemic.
Photographer: Nic Coury/Bloomberg
March 10: Apple’s annual shareholder meeting. Cook and his lieutenants, such as General Counsel Kate Adams, will take the virtual stage to field carefully selected questions from shareholders and give some company updates. Major news rarely breaks at these conferences, but there will be shareholder votes on Apple’s board, executive pay, labor and other matters.

Post-Game Q&A​

Q: Do you think the Apple Card will expand to Europe anytime soon?
Q: How risky is the Apple glucose initiative to current medical device makers?
Q: Should we expect new AirPods Pro earbuds this year?


 
  • Like
  • Thinking
  • Wow
Reactions: 16 users
Anything Apple can do TATA can too:

EchoWrite-SNN: Acoustic Based Air-Written Shape Recognition Using Spiking Neural Networks

AM George, A Gigie, AA Kumar, S Dey… - … on Circuits and …, 2022 - ieeexplore.ieee.org
… on different available neuromorphic platforms such as Brainchip Akida [22], Intel Loihi [21] etc. … to enhance intuitive surgical robot teleoperation,” arXiv preprint arXiv:2102.10585, 2021. …
Related articles

EchoWrite-SNN: Acoustic Based Air-Written Shape Recognition Using Spiking Neural Networks​

Arun M George, Andrew Gigie, A Anil Kumar, Sounak Dey, Arpan Pal, K Aditi
2022 IEEE International Symposium on Circuits and Systems (ISCAS), 1067-1071, 2022
In this paper, we propose EchoWrite-SNN, a robust edge-compatible air-writing recognition system (used in applications such as AR/VR, HRI etc.) based on principles of SONAR and neuromorphic computing. The bare finger movements in air are captured by a commonly available speaker-microphone pair. A new tracking algorithm based on windowed difference cross-correlation and ESPRIT is employed, which shows better tracking accuracy compared to state-of-the-art methods, with a median tracking error of only 3.31 mm. To classify these air-written shapes, a 5-layer CNN is trained and then converted to a Spiking Neural Network (SNN) using an ANN-to-SNN conversion technique to reap the benefits of low-power neuromorphic computing on the edge. Experimental results show that the converted SNN achieves 92% accuracy (a mere 3% less than the CNN) while showing a 4.4× reduction in the number of operations compared to the CNN, resulting in further energy benefit when run on actual neuromorphic computation platforms.
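For anyone wondering what ANN-to-SNN conversion looks like in practice, here is a minimal rate-coding sketch of the idea (my own illustration, not the paper's code): a trained ReLU activation is approximated by the firing rate of a simple integrate-and-fire neuron, so the converted network only does work when spikes occur.

```python
# Minimal illustration of rate-based ANN-to-SNN conversion (not the paper's
# code): a ReLU unit's activation is approximated by the firing rate of a
# simple integrate-and-fire neuron driven by the same constant input.

def if_neuron_rate(drive, timesteps=1000, threshold=1.0):
    """Firing rate of an integrate-and-fire neuron with constant input drive."""
    membrane, spikes = 0.0, 0
    for _ in range(timesteps):
        membrane += drive
        if membrane >= threshold:
            membrane -= threshold      # reset by subtraction
            spikes += 1
    return spikes / timesteps

for activation in (0.0, 0.25, 0.5, 0.9):      # normalised ReLU outputs
    rate = if_neuron_rate(activation)
    print(f"ReLU output {activation:.2f} -> spike rate {rate:.2f}")
```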

My opinion only DYOR
FF


AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 55 users

Lex555

Regular
Great find FF, looks like science fiction to me
 
  • Like
  • Love
Reactions: 13 users

Lex555

Regular
Good morning! We have a media release:

BrainChip Partners with emotion3D to Improve Driver Safety and User Experience​


2/26/2023
LAGUNA HILLS, CA / ACCESSWIRE / February 26, 2023 / BrainChip Holdings Ltd (ASX:BRN) (OTCQX:BRCHF) (ADR:BCHPY), the world's first commercial producer of ultra-low power, fully digital, event-based, neuromorphic AI IP, today announced that it has entered into a partnership with emotion3D to demonstrate in-cabin analysis that makes driving safer and enables next level user experience.

emotion3D offers state-of-the-art computer vision and machine learning software for image-based analysis of in-cabin environments. This analysis enables a comprehensive understanding of humans and objects inside a vehicle. The partnership will allow emotion3D to leverage BrainChip's technology to achieve an ultra-low-power working environment with on-chip learning while processing everything locally on device within the vehicle to ensure data privacy.

"We are committed to setting the standard in driving safety and user experience through the development of camera-based, in-cabin understanding," says Florian Seitner, CEO at emotion3D. "In combining our in-cabin analysis software with BrainChip's on-chip compute, we are able to elevate that standard in a faster, safer and smarter way. This partnership will provide a cascading number of benefits that will continue to disrupt the mobility industry."

Among some of the situations covered by this optimized driver monitoring functionality are warnings for driver distractions and drowsiness, device personalization, gesture recognition, passenger detection, and more.

"Processing in-cabin data requires significant compute and associated power," said Sean Hehir, BrainChip CEO. "By leveraging BrainChip's Akida processor IP, emotion3D is able to improve intelligent safety and user experience functions by analyzing the data in real time and forward inference data to the automobile's central processor. Together, we improve the next generation of intelligent vehicles and give drivers a safer, enhanced user experience."
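Read as an architecture, that quote describes heavy perception running locally on the in-cabin device, with only compact inference results crossing to the vehicle's central processor. The sketch below is purely hypothetical; none of the names or APIs come from BrainChip or emotion3D.

```python
from dataclasses import dataclass

# Purely hypothetical sketch of the "analyse locally, forward only results"
# split described in the release; none of these names or APIs come from
# BrainChip or emotion3D.

@dataclass
class CabinEvent:
    kind: str          # e.g. "drowsiness", "distraction", "gesture"
    confidence: float

def analyse_frame_locally(frame) -> list[CabinEvent]:
    """Stand-in for on-device inference on one in-cabin camera frame."""
    # A real system would run the frame through the in-cabin model here;
    # the raw image never needs to leave the device.
    return [CabinEvent("drowsiness", 0.12), CabinEvent("gesture", 0.91)]

def forward_to_central_processor(events: list[CabinEvent]) -> None:
    """Only compact inference results cross the vehicle bus, not raw video."""
    for e in events:
        if e.confidence > 0.8:
            print(f"vehicle ECU notified: {e.kind} ({e.confidence:.2f})")

forward_to_central_processor(analyse_frame_locally(frame=None))
```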
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 121 users

Foxdog

Regular
on different available neuromorphic platforms such as Brainchip Akida [22], Intel Loihi [21]
Thanks FF. Just wondering why it is that numerous articles mention AKIDA in the same space as Loihi. As far as I understand, Loihi is still being developed and is far from commercially available (if ever). AKIDA is proven and ready to implement. Is Loihi a direct competitor now, or are we still three-plus years ahead? Or is it that Intel gets a mention because they are well known and BRN is not (yet)? Surely AKIDA would outperform Loihi and would be the preferred choice after comparison.
 
  • Like
Reactions: 6 users

emotion3D - Garmin connection




 
  • Like
  • Fire
  • Love
Reactions: 68 users
  • Like
  • Love
  • Fire
Reactions: 65 users

Boab

I wish I could paint like Vincent
emotion3D - Garmin connection



Head office is in Austria.
 
  • Like
  • Fire
Reactions: 22 users