Gentlemen (and women), start your inference engines. One of the world’s largest buyers of systems is entering evaluation mode for deep learning accelerators to speed services based on trained models.
Inference is rapidly emerging as the next major frontier in artificial intelligence (AI). Historically, the focus of AI development and deployment has been overwhelmingly on training, with approximately ...
1. Flex Logix’s nnMAX 1K inference tile delivers INT8 Winograd acceleration that improves accuracy while reducing the computation required. The InferX X1 chip includes multiple nnMAX clusters. It ...
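For readers unfamiliar with why Winograd acceleration matters, the idea is to transform the input data and the filter so that the same convolution outputs can be produced with fewer multiplications, the operation that dominates accelerator area and power. The sketch below is a minimal, generic illustration of the 1-D F(2,3) case in floating-point NumPy; it is not Flex Logix's nnMAX implementation, and the function names are illustrative assumptions.

```python
import numpy as np

def conv_direct(d, g):
    """Direct 1-D convolution F(2,3): 2 outputs from a 3-tap filter, 6 multiplications."""
    return np.array([
        d[0]*g[0] + d[1]*g[1] + d[2]*g[2],
        d[1]*g[0] + d[2]*g[1] + d[3]*g[2],
    ])

def conv_winograd_f23(d, g):
    """Winograd minimal filtering F(2,3): the same 2 outputs with only 4 multiplications.

    The filter-side terms can be precomputed once per trained filter,
    which is part of what makes the transform attractive for inference.
    """
    # Filter transform (precomputable offline for a trained model)
    g0 = g[0]
    g1 = (g[0] + g[1] + g[2]) / 2
    g2 = (g[0] - g[1] + g[2]) / 2
    g3 = g[2]
    # Data transform followed by 4 elementwise multiplications
    m1 = (d[0] - d[2]) * g0
    m2 = (d[1] + d[2]) * g1
    m3 = (d[2] - d[1]) * g2
    m4 = (d[1] - d[3]) * g3
    # Inverse transform back to the two convolution outputs
    return np.array([m1 + m2 + m3, m2 - m3 - m4])

d = np.array([1.0, 2.0, 3.0, 4.0])   # 4 input samples
g = np.array([0.5, -1.0, 0.25])      # 3-tap filter
assert np.allclose(conv_direct(d, g), conv_winograd_f23(d, g))
```

In two dimensions, the analogous F(2x2, 3x3) tiling replaces 36 multiplications with 16 per output tile, roughly a 2.25x arithmetic saving; running such transforms at low precision (e.g., INT8) takes extra care to avoid accuracy loss, which is the problem hardware implementations like the one described above are designed to address.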
SANTA CLARA – Today, d-Matrix, an AI-compute and inference company, announced a collaboration with Microsoft that uses Microsoft's low-code reinforcement learning (RL) platform, Project Bonsai, to enable an ...