Dear all,
I’ve been reading that the Mozilla training cluster has two towers with 8 Titan X GPUs each.
Is there any reference available on the hardware setup of the cluster?
Did you buy it off the shelf, or did you configure it yourselves?
Best regards,
daniel