I would only use the open source models anyway, but it just seems rather silly from what I can tell.
I feel like the last few months have been an inflection point, at least for me. Qwen 2.5 and the new Command-R really make a 3090 feel "dumb, but smart" — useful enough that I pretty much always keep a 34B loaded on the desktop for its sheer utility.
It’s still in the realm of enthusiast hardware, but hopefully that’s about to be shaken up with bitnet and some stuff from AMD/Intel.
What do you think about the possibility of decentralized AI through blockchain, where you could pay tokens to rent GPUs and run your AI for as long as you want, instead of having to buy and assemble all the hardware yourself?
I cannot tell if you are being serious or just having fun with buzzwords.