Magik-Eye + HoloLens

uiux

Regular

On the Two-View Geometry of Unsynchronized Cameras

Abstract
We present new methods for simultaneously estimating camera geometry and time shift from video sequences from multiple unsynchronized cameras. Algorithms for simultaneous computation of a fundamental matrix or a homography with unknown time shift between images are developed. Our methods use minimal correspondence sets (eight for the fundamental matrix and four and a half for the homography) and are therefore suitable for robust estimation using RANSAC. Furthermore, we present an iterative algorithm that extends the applicability to sequences that are significantly unsynchronized, finding the correct time shift of up to several seconds. We evaluated the methods on synthetic data and a wide range of real-world datasets, and the results show broad applicability to the problem of camera synchronization.
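The paper's contribution is a set of minimal solvers that are not reproduced here. As a rough, non-authoritative sketch of the underlying search problem only, the Python snippet below grid-searches an integer frame shift between two tracked video sequences and, for each candidate shift, runs a plain RANSAC loop with the standard 8-point algorithm, keeping the shift and fundamental matrix with the most Sampson-distance inliers. The function names, the shift range, the inlier threshold, and the use of the 8-point solver (rather than the paper's joint minimal solvers with a continuous sub-frame shift) are all assumptions made for illustration.

```python
import numpy as np

def eight_point(x1, x2):
    """Plain 8-point estimate of F from Nx2 pixel correspondences
    (Hartley normalization omitted to keep the sketch short)."""
    p1 = np.hstack([x1, np.ones((len(x1), 1))])
    p2 = np.hstack([x2, np.ones((len(x2), 1))])
    A = np.stack([np.outer(b, a).ravel() for a, b in zip(p1, p2)])
    F = np.linalg.svd(A)[2][-1].reshape(3, 3)
    U, S, Vt = np.linalg.svd(F)
    return U @ np.diag([S[0], S[1], 0.0]) @ Vt   # enforce rank 2

def sampson_sq(F, x1, x2):
    """Squared Sampson distance of each correspondence to the epipolar geometry."""
    p1 = np.hstack([x1, np.ones((len(x1), 1))])
    p2 = np.hstack([x2, np.ones((len(x2), 1))])
    Fx1, Ftx2 = p1 @ F.T, p2 @ F
    num = np.sum(p2 * Fx1, axis=1) ** 2
    return num / (Fx1[:, 0]**2 + Fx1[:, 1]**2 + Ftx2[:, 0]**2 + Ftx2[:, 1]**2)

def estimate_shift_and_F(tracks_a, tracks_b, shifts=range(-30, 31),
                         iters=200, thresh_px=1.0, seed=0):
    """tracks_a, tracks_b: (num_frames, num_points, 2) trajectories of the same
    points seen by two unsynchronized cameras. Returns the integer frame shift
    and fundamental matrix with the most RANSAC inliers."""
    rng = np.random.default_rng(seed)
    best = (-1, None, None)                            # (inliers, shift, F)
    for d in shifts:
        lo, hi = max(0, d), min(len(tracks_a), len(tracks_b) + d)
        if hi - lo < 1:
            continue
        xa = tracks_a[lo:hi].reshape(-1, 2)            # camera A points at frames t
        xb = tracks_b[lo - d:hi - d].reshape(-1, 2)    # camera B points at frames t - d
        for _ in range(iters):
            sample = rng.choice(len(xa), size=8, replace=False)
            F = eight_point(xa[sample], xb[sample])
            inliers = int(np.sum(sampson_sq(F, xa, xb) < thresh_px ** 2))
            if inliers > best[0]:
                best = (inliers, d, F)
    return best[1], best[2]
```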





Authors and associations to note:

Jan Heller

CTO of Magik-Eye
Researcher in artificial intelligence, with expertise in computer vision, robotics, and camera calibration.

Tomas Pajdla
Scientific Advisor at Magik-Eye
Assistant Professor, Czech Technical University, Department of Cybernetics
Distinguished Researcher at the Center for Machine Perception
Co-Founder, Neovision

Andrew Fitzgibbon
HoloLens, Microsoft
Cambridge

---


Breakthrough of Optics and Mathematics

Invertible Light™ stands in stark contrast to current 3D sensing methods. For example, Structured Light requires the projection of a specific or random pattern to measure distortions; the result is significant power consumption, multiple components, and production complexity. Compared with the Time of Flight (ToF) method, which is better suited to longer distances, Invertible Light™ performs better at shorter distances while offering similar advantages in size, speed, and power efficiency.

patternbenefits.png




Depth map with structured and flood light
Current Assignee: Microsoft Technology Licensing LLC

Abstract
A method including receiving an image of a scene illuminated by both a predetermined structured light pattern and a flood fill illumination, generating an active brightness image of the scene based on the received image of the scene including detecting a plurality of dots of the predetermined structured light pattern, and removing the plurality of dots of the predetermined structured light pattern from the active brightness image, and generating a depth map of the scene based on the received image and the active brightness image.
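For orientation only, here is a minimal Python sketch of the processing order the abstract describes: detect the dots of the structured-light pattern, remove them to obtain an active-brightness image, and then derive depth from the received image and the active-brightness image. The threshold-based dot detection, the median-filter inpainting, and the placeholder depth step are assumptions for illustration; the patent does not disclose its method at this level of detail.

```python
import numpy as np
from scipy import ndimage

def active_brightness_and_depth(img, dot_threshold=0.8, win=3):
    """Sketch of the pipeline described in the abstract (not the patented method).
    img: 2-D grayscale frame lit by both the dot pattern and the flood fill."""
    img = np.asarray(img, dtype=np.float32)

    # 1. Detect the structured-light dots as bright local maxima.
    is_peak = img == ndimage.maximum_filter(img, size=win)
    dots = is_peak & (img > dot_threshold * img.max())

    # 2. Remove the dots: replace dot pixels with a local median so only the
    #    flood-fill ("active brightness") component remains.
    active_brightness = img.copy()
    local_median = ndimage.median_filter(img, size=2 * win + 1)
    active_brightness[dots] = local_median[dots]

    # 3. Derive a depth map from the received image and the active-brightness
    #    image. Here the dot-only residual stands in for depth; a real system
    #    would triangulate each detected dot against the projector calibration.
    dot_signal = img - active_brightness
    depth = np.where(dots, dot_signal, 0.0)
    return active_brightness, depth
```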

---


M’soft Gives Deeper Look into HoloLens
Sensor/processor integration needed

The HoloLens “will be the next generation of personal computing devices with more context” about its users and their environments than today’s PCs and smartphones, said Marc Pollefeys, an algorithm expert who runs a computer vision lab at ETH Zurich and joined the HoloLens project in July as director of science.

It could take a decade to get to the form factor of Google Glass, Pollefeys said. “There are new display and sensing technologies coming … and a lot of trade-offs in how much processing we ship off [the headset to the cloud, and] we are going from many components to SoCs.” One of the team’s biggest silicon challenges is to render “the full region and resolution that the eye can see in a lightweight platform.”

“Most of the energy is spent moving bits around … so it would seem natural that … the first layers of processing should happen in the sensor,” Pollefeys told EE Times in a brief interview after his talk. “I’m following the neuromorphic work that promises very power-efficient systems with layers of processing in the sensor — that’s a direction where we need a lot of innovation — it’s the only way to get a device that’s not heavier than glasses and can do tracking all day.”


---





---


Mercedes-Benz and Microsoft HoloLens 2 show off augmented reality's impact in the automotive space

Mercedes-Benz is infusing its dealerships with AR technology to speed up the diagnosis and repair of tricky and complex issues with its Virtual Remote Support, powered by Microsoft's HoloLens 2 and Dynamics 365 Remote Assist.





Toyota makes mixed reality magic with Unity and Microsoft HoloLens 2

Toyota has used Unity to create and deploy mixed reality applications to Microsoft’s revolutionary device across its automotive production process. Naturally, its team was eager to expand their mixed reality capabilities with HoloLens 2, the next generation of Microsoft’s wearable holographic computer.




---


Army moves Microsoft HoloLens-based headset from prototyping to production phase

The IVAS headset, based on HoloLens and augmented by Microsoft Azure cloud services, delivers a platform that will keep Soldiers safer and make them more effective. The program delivers enhanced situational awareness, enabling information sharing and decision-making in a variety of scenarios. Microsoft has worked closely with the U.S. Army over the past two years, and together we pioneered Soldier Centered Design to enable rapid prototyping for a product to provide Soldiers with the tools and capabilities necessary to achieve their mission.



Samsung Working With US Military On 5G AR Testing

This week Samsung, both the Networks division and its Electronics group, along with defense contractor GBL Systems, announced the initial deployment of a new 5G-powered testbed for augmented reality (AR) applications. Part of a larger $600 million 5G testing-focused initiative that the Department of Defense announced in October of 2020, this testbed is one of the US military’s numerous efforts to leverage advanced connectivity technologies to improve its training efforts and operational effectiveness.



Microsoft’s multi-billion dollar deal with US defense for HoloLens is still on

Since 2018 Microsoft has been developing a version of the HoloLens 2 specifically for the US military. Called Integrated Visual Augmentation System (IVAS), the device replaced the Army’s own Heads-Up Display 3.0 effort to develop a sophisticated situational awareness tool soldiers can use to view key tactical information before their eyes. Early this year, Microsoft won a contract to deliver 120,000 military-adapted HoloLens augmented reality headsets worth as much as $21.88 billion over 10 years.



New US Army Military-Grade HoloLens 2 Imagery Gives Us Star Wars Stormtrooper Vibes

Looking at some of these new images, the modded HoloLens 2 gives soldiers a bit of a Star Wars Stormtrooper look, which will be exciting for some, assuming you're not on the wrong end of the AR lens during battle. It's one thing to see a single soldier wearing the device, but being presented with an entire team of AR helmet-equipped flying soldiers begins to look a lot like science fiction.





Why Microsoft Won The $22 Billion Army HoloLens 2 AR Deal

Since the first HoloLens launched in 2016 with the Development Edition, Microsoft has been hard at work building out its Mixed Reality platform. That same year, Microsoft also announced its Mixed Reality VR headsets for Windows. As some may remember, 2016 was a peak hype year for VR. The momentum behind the technology allowed Microsoft to establish a comprehensive understanding of the needs of AR and VR, and develop APIs for developers accordingly. Last week, Microsoft announced the closure of a $22 billion deal with the Army for AR headsets, software and services. Let’s take a look at the deal and what it means for Microsoft and the industry at large.



US Army is deploying Microsoft HoloLens-based headsets in a $21.88 billion deal

Microsoft and the US Army have announced that augmented reality headsets based on the HoloLens 2 will enter production, finalizing a prototype that has been in development since 2018. The new contract is significantly larger than the 2018 deal, providing for 120,000 headsets according to a CNBC report. The contract could be as large as $21.88 billion over 10 years.




---


Unrelated but interesting research:



The Next SoC Design: Neuromorphic Smart Glasses

As Moore’s Law quickly approaches its physical limit, system-on-chip (SoC) architects are beginning to explore new design paradigms, such as the integration of various heterogeneous hardware accelerators that improve the energy efficiency of new SoCs. This is especially true for low-power spiking neural network (SNN) accelerators, which are beginning to see their debut in commercial neuromorphic SoCs. This paper overviews the progression of neuromorphic SoC hardware and wearable computing over the past 20 years. We argue that SNN accelerators will be an important aspect of emerging smartglasses hardware due to their energy efficiency, and we propose a novel smartglasses SoC design.
 


Esq.111

Fascinatingly Intuitive.


Evening Uiux,

Big thankyou once again for the deluge of very interesting info.

BLOODY LEGEND.

Regards,
Esq.
 

SilentioAsx

Emerged


uiux … all I can say is thank god I am retired now, because you have given me enough study material to keep me occupied for some time. It is really wonderful research material across so many threads, and it is very, very much appreciated.

I am sure I am not alone when I say the more I read, the more excited I am becoming.
 
  • Like
Reactions: 17 users
I haven't noticed anyone post this today. In the latest news on Magik Eye, they are partnering with Shikino for development / commercialisation of Magik Eye cameras. There's a lot of great info in here IMO, such as:
  1. They are speeding up commercialisation. The question is why they would, unless they were strongly convinced people wanted it and/or had strong indications from customers that they want the technology ASAP.
  2. The target markets of both MagikEye and Shikino.
  3. The addition of an ultra-compact laser projector to achieve up to 600 FPS (see the quick sanity check right after this list; Akida is probably the only technology capable of processing at this speed with ultra-low power consumption at the edge, which is essentially their target market).
  4. MagikEye saying their initial kit sold out and there was great interest.
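As a quick, hedged sanity check on the frame-rate figure, here is the per-frame time budget implied by the 600 fps number, which lines up with the "within 2 ms at the shortest" response time quoted in the release below; the snippet is purely illustrative arithmetic, not anything from the companies themselves.

```python
fps = 600
frame_period_ms = 1000 / fps       # time available to produce each frame
print(f"{frame_period_ms:.2f} ms per frame at {fps} fps")   # 1.67 ms, i.e. under the quoted 2 ms
```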

Pure speculation, DYOR



Shikino High Tech Co., Ltd.
June 09, 2022 11:00

Announcement of capital and business alliance with Magik Eye Inc., which owns advanced technology for 3D sensing

Accelerating the development of camera modules targeting the growing 3D sensing market

Shikino High Tech Co., Ltd. (hereinafter, Shikino High Tech) announced on June 7, 2022 that it has acquired shares of MagikEye Inc. (headquarters: Connecticut, USA; hereinafter, Magik Eye), which possesses advanced technology for 3D sensing, and has signed an MOU for the joint development of camera modules that utilize this technology. This alliance is the first step toward future collaboration between the two companies on products and technologies, especially the commercialization of cameras for 3D sensing, which is growing rapidly within the sensing camera market, where demand has been increasing in recent years. After discussions between the two companies, they plan to conclude a joint development agreement for commercialization.

Takeo Miyazawa, CEO of MagikEye Inc., said: "We have developed a unique 3D measurement algorithm, ILT (Invertible Light Technology, registered trademark). Simply adding an ultra-compact laser projector designed with this technology to an existing camera module achieves up to 600 fps, enabling higher speed and lower-delay response, within 2 ms at the shortest. It does not require major changes or additional resources on the CPU or SoC. By advancing this collaboration, we are confident that we will be able to bring the 3D sensor products required by various businesses to market at the fastest possible speed."

Shikino High Tech aims to introduce Magik Eye's technology to commercialize advanced, high-performance 3D cameras.

Aim
For the business promoted by Magik Eye, the companies will consider collaborative development such as providing 3D camera modules (with Shikino High Tech acting as the camera module vendor) and the joint development of cameras suited to "3D face recognition to prevent spoofing" as Shikino High Tech's original product.

About Shikino High Tech
Shikino High-Tech Co., Ltd. develops and produces inspection equipment (burn-in equipment) that applies loads such as temperature to in-vehicle semiconductors, industrial measurement equipment, and camera modules for electrical equipment, and also provides contract design of analog/digital LSI circuits. With the tailwind of IoT trends such as advanced smartphone camera functions, internet-connected cars, and automated driving, it delivers a high-quality "One Stop Solution" covering everything from development to in-house manufacturing, based on a wide range of ideas and strong technological capabilities. Established in 1975, Shikino High Tech is centered on its head office and factory in Uozu City, Toyama Prefecture, with its Kyushu office at the forefront of customer service, and engineers assigned to design centers in Tokyo, Osaka, and Fukuoka to develop and sell products.
For more information, please visit https://www.shikino.co.jp/ .

About MagikEye
Magik Eye Inc. was established in Connecticut, USA in 2015 with a focus on 3D ranging technology applicable to smartphones, robots, and surveillance, patented as Invertible Light Technology (ILT). It currently has development bases in Prague (Czech Republic), Bangalore (India), and Tokyo.
ILT has enabled the development of high-speed, highly responsive 3D sensors; the entry model was sold as a development kit in Japan in August 2021, and the initial lot sold out to a great response.
For more information, please visit https://www.magik-eye.com/ .
 

Slade

Top 20
Thanks for posting ID. This is the first bit of news on Magik Eye for a long time. It could well be a very good development for Akida.
 

butcherano

Regular
Everything I read in those articles spells serious competition to Akida. I was hoping for Akida to be licenced IP in their chip. No sign of that. I think we need @Diogenese to take a look if he hasn’t already.
The key for us is the power consumption.

If they’re using a 9v battery with 0.5Ah capacity for their AiOnIc camera, then at 1W this battery will last 4.5hrs (or “several hours” as per their quote).

Akida can run person detection at 8mW (using MobileNet v1 0.25 at 96x96x3 4 bit act/wt) which would last 562hrs on the same battery. And facial identification at 22.6mW using Akida would last 199hrs.

So if they’re marketing this as a hand held device which is reliant on batteries then they probably should be using Akida IP imo.
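For reference, here is the arithmetic behind those figures as a small Python sketch; the 9 V / 0.5 Ah battery and the quoted milliwatt numbers are taken from the post above, not measured values.

```python
battery_wh = 9.0 * 0.5     # 9 V x 0.5 Ah battery = 4.5 Wh of stored energy

loads = [("camera at 1 W", 1.0),
         ("Akida person detection at 8 mW", 0.008),
         ("Akida face identification at 22.6 mW", 0.0226)]

for label, watts in loads:
    # runtime = stored energy / average power draw
    print(f"{label}: {battery_wh / watts:.1f} h")   # 4.5 h, 562.5 h, 199.1 h
```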
 
Good to have some discussion; it piqued my interest with some of the phrases that were used. Appreciate the investigative efforts by holders here.
 
Hi @butcherano
I am not absolutely sure, because there is a building company with a similar name here in Australia, but I am fairly sure this company has been dismissed in the past.

My quick read of their website, which I think is new, tells me they are vision-only and preprogrammed.

Add in your power calculations, which I agree with, and they are not playing first grade against AKIDA and are unlikely to be promoted.

My opinion only DYOR
FF

AKIDA BALLISTA
 
Sent a short query off to that architekai coy. Dropped a few Akida bombs. I'll have to go back over what piqued my interest but I'm sure they are connected to Magik Eye somehow.
Life's hard when eyesight is crapping out.😐
 
A recent video by Magik Eye confirms it can achieve 600 frames per second.

 
Does that mean they are not using Akida for this?
" MagikEye have achieved 600 FPS using a single, standard CMOS image sensor without any hardware accelerator"
 
Hi Rise,

The ILT001 is a sensor. Akida processes sensor data; it doesn't do the sensing itself. Akida isn't embedded in the ILT001, but being an IP solution, it could be in the future, or companies could design it into solutions themselves.
If someone wanted to process this 600 FPS data at the edge in real time and at low power, Akida is likely the only processor that can do this right now.
 