In reply to Aceticon (@Aceticon@lemmy.dbzer0.com)
Unless things have changed recently, LLMs don't really use slow, high-capacity data stores such as HDDs, at least not beyond the training stage.
The prices pushed up by AI are for GPUs and DRAM (rises which may in turn feed through to other kinds of chips made in the same fabs), whereas HDDs are magnetic storage on spinning platters, a very different technology.
I expect these drives will at most be affected in price very indirectly (for example, if memory prices go up because so many datacenters are targeting AI workloads, fewer datacenters may be built for other, more data-centric server-side applications, which would reduce demand for ultra-high-capacity HDDs).
Not that it makes much difference to us run-of-the-mill techies as consumers: even if HDDs get cheaper, with GPUs and RAM many times more expensive we can hardly put together new systems around them, so at best it might get a bit cheaper to expand one's bulk-storage NAS (the slower kind holding data that isn't accessed often, since the faster kind uses SSDs).