Explore products built on the idyl open compute network. From AI inference to media processing — see what people are building.
Distributed AI inference across heterogeneous GPUs. Run any model on any hardware in the network with automatic hardware matching and zero configuration.
Media transcoding distributed across the network. Upload once and the work fans out automatically across available hardware.
Automated optimisation across the network. Define what you're improving, upload a genesis file, and let thousands of parallel experiments find the answer overnight.
Try adjusting your search or filters.
Built something on idyl? Get it in front of every user browsing the ecosystem.
Submit a product →