BRN Discussion Ongoing

SERA2g

Founding Member
Perth WA Chippers.
As per one of my last posts the Perth Xmas drinks.
I have booked for 40 people at Samuals Bar (base of the Hilton Hotel, Mill Street, Perth). It runs from 4-7pm on Friday evening, 25 November. This is an opportunity to catch up, put a face to a TSE name, have a laugh, and realize that we are all mums and dads (mostly) who kid ourselves we are Warren Buffetts, but hell, it's better than talking about our normal 9-5 grind. So come along. The BRN office staff are aware and may or may not attend for a quick hello, depending on what breakthrough they have had or whether the market was red that day. If you could just PM me if you are attending, so I can add extra numbers if I have to, but also so I can add new members to a local list for any future events.
Stay safe, stay strong and hold long.
Put me down for 2 mate.

Bout time I introduced the fiancé to the cult.

BYO paddles for the hazing?
 
  • Haha
  • Like
  • Love
Reactions: 21 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Hot off the press.



[Attached screenshot: Screen Shot 2022-11-17 at 11.46.40 am.png]
Qualcomm's new Snapdragon platform is built for slim augmented reality glasses

Jon Fingas
Reporter
Thu, 17 November 2022 at 10:00 am · 2-min read





If companies are going to make augmented reality glasses you'd actually want to wear, they'll need chips that are powerful but won't require a large battery on your head. Qualcomm thinks it can help. The company has unveiled a Snapdragon AR2 Gen 1 platform that's built with slim AR glasses in mind. The multi-chip design reportedly delivers 2.5 times the AI performance of the company's XR2-based reference design while using half the power. You could have eyewear that intelligently detects objects in the room while remaining slim and light enough to use for hours at a time.
Part of the trick is to spread the computing load across the glasses' frame, Qualcomm says. The primary, 4nm-based AR processor includes a CPU, Tensor AI processing, graphics and engines for features like visual analytics. It can support up to nine simultaneous cameras for tracking both your body and the world around you. A co-processor elsewhere in the glasses includes an AI accelerator for tasks like eye tracking and computer vision, while a third chip handles connectivity to networks and phones. This not only balances the weight better, but also leads to smaller circuit boards and fewer wires than you'd see with a single do-it-all chip.

That networking is also important, Qualcomm says. Like Snapdragon 8 Gen 2 in phones, AR2 Gen 1 is one of the first platforms to support WiFi 7. That's crucial not just to provide the gobs of bandwidth for connecting to a handset (up to 5.8Gbps), but to reduce latency (under 2ms to your phone, according to Qualcomm). Combined with lag reduction in the processor and co-processor, you should have a more natural-feeling and responsive experience.

Hardware built on AR2 Gen 1 is in "various stages" of progress at multiple well-known companies, including Lenovo, LG, Nreal, Oppo and Xiaomi. Importantly, Microsoft had a hand in the platform requirements. Don't be surprised if you're one day using AR2 for virtual collaboration in Mesh, not to mention other Microsoft apps and services.

Qualcomm has also introduced meaningful updates to its audio technology. New S3 Gen 2 Sound and S5 Gen 2 Sound platforms promise to make the latest listening tech more commonplace, including spatial audio with head tracking, lower latency for games and the latest take on adaptive active noise cancellation (think of the transparency modes found on some earbuds). You won't see real-world products until the second half of 2023, but these chips could democratize features that were previously reserved for pricier buds and headphones.
https://au.finance.yahoo.com/news/cyber-attack-cost-medibank-35m-223335107.html

 
Last edited:
  • Like
  • Thinking
  • Fire
Reactions: 30 users

rgupta

Regular
Just going back to the last question, where he said they are going from 32 bits to 4 bits without a loss in accuracy. Is there scope for BrainChip IP here?
On the other hand, he said it is a research project, and he did not explain how the solutions reach that level.
So it may be the case that Snapdragon is using our IP. DYOR.
There is another dot joining here.
Mercedes is going to use Snapdragon for its latest vehicles, and we all know Mercedes is going to use BrainChip IP for its concept car, the EQXX.
DYOR
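For anyone wondering what "32 bits to 4 bits without a loss in accuracy" means in practice, here is a minimal sketch of symmetric 4-bit weight quantization. This is illustrative only, assuming a simple per-tensor scheme; it is not BrainChip's or Qualcomm's actual method, and the function names are my own:

```python
import numpy as np

def quantize_int4(w):
    """Symmetric per-tensor quantization of float32 weights to
    4-bit signed codes in the range -8..7, plus a scale factor."""
    scale = np.max(np.abs(w)) / 7.0
    q = np.clip(np.round(w / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Map the 4-bit codes back to approximate float32 weights."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal(1000).astype(np.float32)
q, s = quantize_int4(w)
w_hat = dequantize(q, s)
# Rounding error per weight is bounded by half a quantization step.
print(np.max(np.abs(w - w_hat)) <= s / 2 + 1e-6)
```

Real low-bit pipelines keep accuracy with extra tricks (per-channel scales, quantization-aware training), which is presumably the research part he was referring to.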
 
  • Like
  • Fire
Reactions: 9 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
This might seem like a silly question, but is there a chance there's an NDA in place between BrainChip and Nuvia?

This article says the Arm-compatible CPU cores that were designed in house by Qualcomm's acquired Nuvia team will be marketed under a new brand called Oryon. The components are now due to ship in products by the end of 2023 or early 2024.

It also states "Now Qualcomm executives have gone on stage, in a livestreamed keynote, and bragged how its Nuvia team has produced for the Snapdragon line "world-class" CPU cores that will take on the competition."

 
  • Thinking
  • Like
Reactions: 8 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Like
  • Love
  • Fire
Reactions: 21 users
Posted elsewhere to expand on my thinking.

I would suggest, either way it's cut, that there is a requirement to work with those companies' respective IP and ours, whether directly or through someone like MosChip maybe.

If you have a look at the definition below, you can see where verification IP (Siemens, Cadence, Synopsys IP solutions) is potentially required.

Bottom line is the role needs to provide:
"functional verification of the IP solution of Siemens/Synopsys/Cadence."

Now why?

Has any or all of those companies integrated Akida (a test/prototype perhaps), and do we need to provide the tech support (Akida side) to verify their overall solution works as intended?

They sure as hell wouldn't need us to verify their own stand alone IP solutions.




Verification IP (VIP)
A pre-packaged set of code used for verification.

DESCRIPTION

Verification IP (VIP) is a pre-packaged set of code used for verification. It may be a set of assertions for verifying a bus protocol, or it could be a module intended to be used within a defined verification methodology, such as UVM. This would often contain stimulus sequences, bus functional models, a set of checkers, coverage model and other things associated with a particular block in the design, such as a USB interface.

The main engine inside verification IP is the transactor model — sometimes called masters and slaves, sometimes just called VIPs, and in some flows called agents — the elements that are directed by UVM, or whatever approach is being used, to drive the test vectors.

VIP emerged as a form of reusable IP, which can be used to create the tests needed to shorten SoC verification time and increase coverage. While it is often used to verify standard bus protocols, it also can be used for system performance analysis and is increasingly being used with emulation, simulation, and virtual prototyping.

For example, VIP can be used with emulation for simulation acceleration and APIs. The simulation acceleration side uses emulation to run faster with a UVM test environment, very similar to the verification IP used with UVM, but not targeted to run on an emulator.

The API side of the VIP that is sitting in the testbench communicates between the two and uses transactions rather than low level signals. However, if a design is put onto the emulator and it is running in a UVM testbench mode, the speed of the emulator is limited because everything is moving back and forth at the signal level. Part of that signal level is still running in simulation so it’s going to throttle back the speed possible with emulation.
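To make the transactor/checker/coverage idea above concrete: real VIP is typically written in SystemVerilog under a methodology like UVM, but here is a rough Python analogue of the same structure (all class and field names are my own, purely for illustration):

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    """One bus operation — the unit a transactor drives or monitors."""
    kind: str       # "write" or "read"
    addr: int
    data: int = 0

class BusModel:
    """Toy device under test: a register file behind a bus."""
    def __init__(self):
        self.regs = {}

    def execute(self, txn):
        if txn.kind == "write":
            self.regs[txn.addr] = txn.data
            return None
        return self.regs.get(txn.addr, 0)

class Scoreboard:
    """Checker: predicts read data from observed writes and compares,
    while also collecting a crude coverage model (addresses hit)."""
    def __init__(self):
        self.expected = {}
        self.errors = 0
        self.coverage = set()

    def observe(self, txn, result):
        self.coverage.add(txn.addr)
        if txn.kind == "write":
            self.expected[txn.addr] = txn.data
        elif result != self.expected.get(txn.addr, 0):
            self.errors += 1

# Stimulus sequence, as a VIP "master" agent might generate it.
dut, sb = BusModel(), Scoreboard()
for txn in [Transaction("write", 0x10, 0xAB),
            Transaction("read", 0x10),
            Transaction("read", 0x20)]:
    sb.observe(txn, dut.execute(txn))
print(sb.errors, sorted(sb.coverage))   # no mismatches, two addresses covered
```

The point of the sketch is the division of labour: stimulus generation, a reference model plus checker, and coverage collection are packaged together, which is exactly what makes VIP reusable across designs that share a bus protocol.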
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 25 users

Getupthere

Regular
 
  • Like
  • Fire
Reactions: 5 users

alwaysgreen

Top 20

Bolded the bits that may be related to Akida. PSVR2 with Akida? Will take a look into it.


Sony has just submitted its latest patent, referencing the development of an interactive 3D avatar that players would be able to use to convey emotions and more. The tech giant has its fair share of mid-production technologies and patent-level hypotheses, and some of them sound like they could truly provide users with a new kind of media utility, whether through video games or through some other manner of interaction.

While they don't necessarily mean much on their own, companies' patents can sometimes afford fans a close look at some developing and/or upcoming technologies that may or may not see the light of day, sometimes years down the line. In Sony's case, specifically, the company seems to have a bunch of ongoing projects, and one of them concerns the translation of gamers' emotions in real-time, using none other than their virtual avatars.
According to the latest patent listing provided by Sony, the company is attempting to develop a video game avatar that would allow for animated modification on the fly using information sourced from the user's own facial expressions. The provided documentation suggests that users' faces would be scanned for various expressions (happiness, sadness, etc.), which would then be converted to their in-game avatar's own face. This new submission may or may not be connected to Sony's earlier AI-based facial animation patent, it's worth pointing out.



Furthermore, the listing mentions that users would also have the option to convert their own faces into 3D models to be used in video games. This could be where Sony's picture-in-picture patent would potentially tie into the system, as it would allow for expressive pop-in screens that could appear in response to various in-game actions, such as victories and defeats. Other use cases include speech bubbles, animated gestures, gesticulations, and similar instances.

Keeping the above in mind, it's likely that Sony would need a comprehensive way of capturing data about the users' respective faces before using them in video games and 3D interfaces. The specifics of this aren't detailed in this patent, but it's possible that the system would be used in tandem with particular kinds of hardware that might be able to supply such data. The surprisingly pricy PS VR2 headset immediately comes to mind, for example.
 
  • Like
  • Fire
Reactions: 8 users

alwaysgreen

Top 20
  • Like
  • Fire
Reactions: 11 users
Job description has vanished?
 

Attachments

  • Screenshot_20221117-134810.png
    265.2 KB · Views: 178
  • Haha
  • Wow
  • Like
Reactions: 39 users
D

Deleted member 118

Guest
  • Like
Reactions: 5 users
D

Deleted member 118

Guest
  • Haha
  • Like
  • Wow
Reactions: 16 users
Hold On Waiting GIF by Fleischer Studios
Season 3 Waiting GIF by SpongeBob SquarePants
 
  • Haha
Reactions: 5 users
Sprocket GIF by Chris
 
  • Haha
  • Like
  • Love
Reactions: 12 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Like
  • Haha
  • Fire
Reactions: 36 users

Mccabe84

Regular
  • Like
Reactions: 10 users

Diogenese

Top 20
  • Like
  • Thinking
Reactions: 8 users

Shadow59

Regular
But has the horse bolted?
I can imagine a lot of serious discussions going on behind the scenes now! Fire extinguishers everywhere!
There could be a human resource position opening very soon!
 
  • Haha
  • Like
Reactions: 17 users