You won’t get the AI stuff through. I’m working on a machine like you described and drilled a hole in it to provide power to a 3070 as the power supply wasn’t beefy enough (and server power supplies are too expensive compared to a hole).
This is the first generation where I can play around with local LLMs, from what I can tell - even used, the hardware for that is way more expensive :(