Sally Ward-Foxton has a new article about using NNs in the cloud.
NeuReality Boosts AI Accelerator Utilization With NAPU - EE Times
The NeuReality patents look pretty clunky:
US11915041B1 Method and system for sequencing artificial intelligence (AI) jobs for execution at AI accelerators 20190912
[0021] The sequencer 100 is configured to send the AI jobs to one or more AI accelerators 120. An accelerator is a dedicated processor configured to perform a specific function, thus offloading the processing of an AI job from the application host CPU or the AI-server host CPU. An AI accelerator 120 may include one or more neural network core processors, a GPU, an FPGA, a DSP (or multiple DSP cores), one or more video codec core processors, one or more CPU processor cores, a deep neural network (DNN) accelerator, and the like. It should be noted that the accelerator 120 can support acceleration of tasks that are not AI tasks.
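The claimed arrangement is essentially a dispatcher: queue jobs, then route each one to whatever accelerator type can handle it, keeping the host CPU out of the per-job path. A minimal sketch of that idea (all names here — `Job`, `Accelerator`, `Sequencer` — are illustrative, not from NeuReality's actual design):

```python
# Hypothetical sketch of the sequencer/accelerator split from claim [0021]:
# a sequencer queues heterogeneous AI jobs and dispatches each to the first
# accelerator that advertises support for that job kind.
from dataclasses import dataclass, field
from collections import deque

@dataclass
class Job:
    job_id: int
    kind: str          # e.g. "dnn", "video_codec", "dsp"

@dataclass
class Accelerator:
    name: str
    kinds: set                                  # job kinds this unit can run
    done: list = field(default_factory=list)    # completed job ids

    def run(self, job: Job) -> None:
        # Stand-in for actual offloaded execution.
        self.done.append(job.job_id)

class Sequencer:
    """Queues jobs and dispatches each to the first capable accelerator."""
    def __init__(self, accelerators):
        self.accelerators = accelerators
        self.queue = deque()

    def submit(self, job: Job) -> None:
        self.queue.append(job)

    def drain(self) -> None:
        while self.queue:
            job = self.queue.popleft()
            target = next(a for a in self.accelerators if job.kind in a.kinds)
            target.run(job)

# Usage: two accelerator types, three jobs routed by kind.
dnn = Accelerator("dnn-core", {"dnn"})
codec = Accelerator("video-codec", {"video_codec"})
seq = Sequencer([dnn, codec])
for j in (Job(1, "dnn"), Job(2, "video_codec"), Job(3, "dnn")):
    seq.submit(j)
seq.drain()
print(dnn.done, codec.done)   # [1, 3] [2]
```

The patent's point is just this separation of concerns; the actual NAPU presumably does the routing in hardware rather than a Python loop.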