gdevillers 2 days ago [-]
Will we see people being paid to host small single-GPU servers in their homes? I'd guess that would require redesigning the training system, since data transfer over residential links is much slower and higher-latency. Maybe it isn't even compatible with LLM training?
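A rough sketch of the bottleneck the comment is pointing at. All numbers here are illustrative assumptions (a 7B-parameter model, fp16 gradients, a 100 Mbit/s home uplink, ~400 GB/s of intra-node NVLink-class bandwidth), not measurements:

```python
# Back-of-envelope: why naive data-parallel LLM training breaks over
# home internet links. Every figure below is an assumption for
# illustration, not a measurement.

PARAMS = 7e9                  # assume a 7B-parameter model
BYTES_PER_PARAM = 2           # fp16 gradients

HOME_UPLINK_BPS = 100e6       # assumed 100 Mbit/s residential uplink
NVLINK_BPS = 400e9 * 8        # assumed ~400 GB/s intra-node bandwidth

grad_bytes = PARAMS * BYTES_PER_PARAM   # ~14 GB per full gradient

def sync_seconds(link_bps):
    """Time to ship one full gradient copy over the link.

    Ignores all-reduce topology, compute/communication overlap,
    and gradient compression -- worst case for naive sync."""
    return grad_bytes * 8 / link_bps

home = sync_seconds(HOME_UPLINK_BPS)    # ~1120 s, i.e. ~19 min per step
dc = sync_seconds(NVLINK_BPS)           # ~0.035 s per step

print(f"home link:  {home:8.1f} s per optimizer step")
print(f"datacenter: {dc:8.4f} s per optimizer step")
```

A factor of ~30,000 per step is why proposals for training on scattered consumer GPUs lean on redesigns like infrequent synchronization or heavy gradient compression rather than standard per-step all-reduce.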
128 to 210 WEEKS for power transformers. You can fab more GPUs in 18 months, but the electrical infrastructure to run them takes roughly 2.5 to 4 years, and there are maybe a handful of manufacturers. Feels like the stories from the telecom fiber overbuild of 2000, except this time the supply chain couldn't overbuild even if it wanted to.
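The mismatch is easy to check by converting the quoted lead times (taken from the comment above; the 18-month GPU fab cycle is the comment's own figure) into years:

```python
# Convert the quoted transformer lead times into years and compare
# against the ~18-month GPU fab cycle mentioned in the comment.
LOW_WEEKS, HIGH_WEEKS = 128, 210      # quoted transformer lead times
WEEKS_PER_YEAR = 52

low_years = LOW_WEEKS / WEEKS_PER_YEAR    # ~2.5 years
high_years = HIGH_WEEKS / WEEKS_PER_YEAR  # ~4.0 years
gpu_years = 18 / 12                       # 1.5 years

print(f"transformer lead time: {low_years:.1f} to {high_years:.1f} years")
print(f"GPU fab cycle:         {gpu_years:.1f} years")
```

So even at the low end, the electrical side lags GPU supply by at least a year per build-out cycle.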