Training@home

2025 Competition

School: School of Computer and Information Sciences
Category: Data Science (Primary)

Project Overview

One Liner: A framework for trustless, distributed machine learning training that leverages consumer device compute power with cryptographic verifiability.

Abstract

This project addresses the need to reduce reliance on cloud providers and costly hardware by enabling distributed training of machine learning models on the collective computational power of consumer devices. It builds a framework for verifiable, trustless distributed training in which any participant can contribute compute resources without compromising security or integrity. Using cryptographic verification methods, the framework ensures computation correctness across untrusted networks while addressing data privacy, scalability, and security challenges. Preliminary experiments on virtual networks and smaller models show promise, with potential for broader, community-driven training efforts similar to SETI@home.
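The abstract's notion of cryptographic verifiability can be illustrated with the simplest scheme in this space: replicate a training step across several untrusted workers and accept a result only when a quorum agrees on its hash. The sketch below is illustrative only; the function names, the toy linear model, and the quorum rule are assumptions for demonstration, not the project's actual protocol.

```python
import hashlib
import json
from collections import Counter

def local_gradient(w, batch):
    """Gradient of mean squared error for the toy model y_hat = w * x."""
    g = sum(2.0 * (w * x - y) * x for x, y in batch)
    return g / len(batch)

def gradient_digest(grad, precision=8):
    """Canonicalize (round) then hash, so identical honest runs agree."""
    return hashlib.sha256(json.dumps(round(grad, precision)).encode()).hexdigest()

def accept_by_quorum(worker_digests, quorum=2):
    """Accept the gradient digest reported by at least `quorum` workers."""
    digest, votes = Counter(worker_digests).most_common(1)[0]
    return digest if votes >= quorum else None

# Demo: two honest workers and one dishonest one on the same batch.
batch = [(1.0, 2.0), (2.0, 4.0)]
honest_digest = gradient_digest(local_gradient(0.5, batch))
bad_digest = gradient_digest(local_gradient(0.5, batch) + 1.0)
accepted = accept_by_quorum([honest_digest, honest_digest, bad_digest])
```

Replication trades extra compute for trust; production systems in this area typically combine it with stake, random auditing, or succinct proofs to lower the redundancy cost.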

Video available at this link.

Team Members

Di Huynh
Lead
Erick Meyer
Nikolay Tokarenko
Anomitro Paul
Saquib Baig

Advisors

Milad Toutounchian

Stakeholders

Rittik Ghanshan