geekhack Community > Other Geeky Stuff

SSD capacity


_rubik:

--- Quote from: Axiom_ on Sat, 06 April 2024, 00:53:41 ---
--- Quote from: _rubik on Thu, 04 April 2024, 23:53:18 ---Also speculating here, but I don't know of many folks training their own models on PB scale datasets either. That sort of training is ludicrously expensive and, outside of the startups getting fun money thrown in their direction, most folks are using off-the-shelf models for most of their heavy lifting.

--- End quote ---

The trend seems to be towards the deployment of cloud-based, pre-trained foundation models which are then fine-tuned by businesses using local datasets. Due to the latter, SMB/enterprise demand for storage is likely to grow more than usual. My personal take is that individual PC users only ever account for a small share of hardware demand.

--- End quote ---

Oh, I was only talking about enterprise demand. I agree: consumers have always gotten, and always will get, the dregs of enterprise. Training an AI at home (at this point) is like running a Kubernetes cluster at home: sure, it mimics the patterns of a production cluster, but the scale is effectively a rounding error.

My point is that, even in the enterprise context, few companies are training their own models. And even if they are, they're rarely training from the ground up; more often than not, they're extending a base model. Storage development and therefore prices will continue to increase because we're spinning off more data as a society in a day than we can ever hope to capture, not because we're all training openai-scale models.

Maybe you can argue that we have an increased interest in capturing the more "mundane" data for training purposes, but so much of that data is noise that filtering the raw bits down into a useful signal will always bring us back to the real bottleneck: compute.

The only evidence I can really point to is that researchers were working on transformers for _years_ before this wave. We've only just hit a point where it's computationally feasible to scale the architecture and train the model.

I also acknowledge that, judging by your post history, we're likely on opposite sides of the bullish-bearish spectrum.

tp4tissue:
Ok, so here's 1 scenario where Tp4 thinks a bigger SSD heatsink is warranted: if someone is using an SSD slot that gets hot air from the GPU, usually the slots toward the bottom. Those slots get crazy hot when the GPU is going full out. These should be either water cooled or at least actively cooled. A makeshift air guide/heat shield to direct the hot GPU air away from the drive may be easiest.

phinix:
Looks like we will get bigger SSDs soon, phinix is now a happy hippo :)

Bring on 8TB SSD for $200! :D

https://www.techpowerup.com/321557/samsung-readies-290-layer-3d-nand-for-may-2024-debut-planning-430-layer-for-2025

tp4tissue:

--- Quote from: phinix on Wed, 17 April 2024, 13:20:36 ---Looks like we will get bigger SSDs soon, phinix is now a happy hippo :)

Bring on 8TB SSD for $200! :D

https://www.techpowerup.com/321557/samsung-readies-290-layer-3d-nand-for-may-2024-debut-planning-430-layer-for-2025

--- End quote ---

Production cost of SSDs is already lower than that of HDDs; right now they're just fleecing the customers.

TomahawkLabs:

--- Quote from: phinix on Wed, 17 April 2024, 13:20:36 ---Looks like we will get bigger SSDs soon, phinix is now a happy hippo :)

Bring on 8TB SSD for $200! :D

https://www.techpowerup.com/321557/samsung-readies-290-layer-3d-nand-for-may-2024-debut-planning-430-layer-for-2025

--- End quote ---

The biggest winners here, IMO, are the users. I've been into PCs since the late 00s. Hard drive wiring was always a pain; now every mobo has 2 NVMe slots, and there are no power or data wires to run. Much roomier and easier to manage. I can also see a future in which case manufacturers opt out of including 2.5" drive bays entirely. Just use NVMe.
