Decentralized GPU networks emerge as a complementary layer for AI inference workloads
AI training for frontier models remains concentrated in hyperscale data centers, while decentralized GPU networks increasingly target inference and routine workloads as a lower-cost option. Executives from Theta Network, Ovia Systems, Fluence, and Salad Technologies describe how consumer and gaming GPUs can handle open-source models, data processing, and geographically distributed tasks that do not require tightly synchronized hardware. These networks are positioned as a complementary layer in the AI computing stack rather than a replacement for centralized training clusters.