Which of these two external drives should I use for "cold" storage while I work towards affording a proper NAS?
tl;dr: Brand-new 2.5" external 4 TB Seagate Expansion HDD vs. a 3.5" external 4 TB Seagate Expansion Desktop Drive from 2021. Which is better for storing some stuff I don't want to lose and keeping it (mostly) unplugged? More info below.
____________
Hello,
I'd like to get a big storage solution in the future but it's not going to happen overnight (due to the cost, research etc). I've always kept my stuff on a variety of external drives which I am sure is something this community balks at. Sorry, haha, I'm hoping to change that.
My short-term goal is to put some important (not life-critical) stuff on a few of these drives until I can get a proper NAS or similar running, hopefully in a year or two. At the moment I have two available HDDs to use. I won't need access to it frequently, so I was going to just keep it on a drive which I will spin up a few times a year but otherwise keep unplugged (in a cool, dry place etc.).
The drives are both the sort of thing you just get off the shelf in a PC shop so I suspect neither would be great but I'd like to know which one would be the more reliable for saving, storing and being unplugged for a decently long period.
Drive 1: ~4-year-old, 4TB "Seagate Expansion desktop drive" and, based on the enclosure size, a 3.5-inch drive with a USB and a separate power cable. Had it since 2021 and it's done some storage but mostly backing up videos, photos and other random bits and pieces. (model no. STKP4000400)
Drive 2: Brand new, unused 4TB "Seagate Expansion drive" which is one of the smaller 2.5" HDDs with just a USB cable. (model no. STKM4000400)
I did enough reading before posting this to see that the general consensus is that 3.5" drives are better (although factors like CMR are more important). However, this case, where it's an older 3.5" vs. a brand-new 2.5" and also a situation where they won't be spinning all the time, has me unsure which is the better one to use.
Thanks for your patience in dealing with what is probably an obvious question to an expert, but please do let me know.
Thanks!
https://redd.it/1pya18l
@r_DataHoarder
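Whichever drive ends up holding the data, the occasional spin-ups mentioned above are a good moment to actually verify the files rather than just power the disk on. Below is a minimal sketch of a checksum manifest in Python; the mount point, manifest filename and chunk size are purely illustrative.

```python
# Sketch: build and verify a SHA-256 manifest for a cold-storage drive.
# Run "build" once after copying data onto the drive, then "verify" on
# each periodic spin-up. Paths are illustrative.
import hashlib
import json
import sys
from pathlib import Path

DRIVE = Path("E:/")                      # illustrative mount point
MANIFEST = DRIVE / "manifest.sha256.json"

def file_hash(path: Path) -> str:
    """Stream the file in 1 MiB chunks so big videos don't need to fit in RAM."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(1 << 20):
            h.update(chunk)
    return h.hexdigest()

def build() -> None:
    manifest = {str(p.relative_to(DRIVE)): file_hash(p)
                for p in DRIVE.rglob("*") if p.is_file() and p != MANIFEST}
    MANIFEST.write_text(json.dumps(manifest, indent=2))

def verify() -> None:
    manifest = json.loads(MANIFEST.read_text())
    bad = [rel for rel, digest in manifest.items()
           if not (DRIVE / rel).exists() or file_hash(DRIVE / rel) != digest]
    print("all files OK" if not bad else f"problems: {bad}")

if __name__ == "__main__":
    build() if sys.argv[1:] == ["build"] else verify()
```

Either drive can run this; the manifest just turns "plug it in now and then" into an actual integrity check.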
Looking for cloud storage to share videos with password protection (view-only, no download)
Looking for a cloud service where I can upload a video and share it with a password protected link.
View only access, no download option, just watch/stream. Free or paid, both are fine. Any suggestions?
https://redd.it/1pycuhn
@r_DataHoarder
Best way to take daily snapshots of various subreddits?
Basically noscript. I'd like something I could set up to run automatically that will take a snapshot of a subreddit, archive the threads and comments from the first page of that subreddit at that moment in time (sorted by "hot" or whatever the default Reddit sorting method is), then put it into some kind of browsable archive.
Any suggestions?
https://redd.it/1pyfeqh
@r_DataHoarder
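If a small script is acceptable despite the "noscript" preference, one possible starting point is Reddit's public JSON listing endpoint. A minimal sketch follows; the subreddit, output directory and User-Agent string are illustrative, and turning the saved JSON into browsable HTML would be a second step (for example, a static page generator run over the snapshot files).

```python
# Sketch: snapshot the "hot" page of a subreddit as timestamped JSON.
# Assumes the public listing endpoint (https://www.reddit.com/r/<sub>/hot.json)
# is enough for your volume; for full comment trees you would follow each
# post's permalink + ".json" in a second pass.
import json
import time
from datetime import datetime, timezone
from pathlib import Path

import requests

SUBREDDIT = "DataHoarder"                 # illustrative
OUT_DIR = Path("reddit_snapshots")        # illustrative
HEADERS = {"User-Agent": "subreddit-snapshot-script/0.1 (contact: you@example.com)"}

def snapshot(subreddit: str) -> Path:
    """Fetch the first page of /hot and write it to a timestamped file."""
    url = f"https://www.reddit.com/r/{subreddit}/hot.json"
    resp = requests.get(url, headers=HEADERS, params={"limit": 25}, timeout=30)
    resp.raise_for_status()
    listing = resp.json()

    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    out = OUT_DIR / subreddit / f"hot-{stamp}.json"
    out.parent.mkdir(parents=True, exist_ok=True)
    out.write_text(json.dumps(listing, indent=2))
    return out

if __name__ == "__main__":
    print("wrote", snapshot(SUBREDDIT))
    time.sleep(2)  # be polite if you loop over several subreddits
```

Scheduling it with cron (or a systemd timer) covers the "run automatically" part.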
Faceseek proving that deleted sites never actually die.
i just ran a faceseek myself and it was a massive wake up call for why we hoard data locally. it found photos of me from a defunct hobbyist forum that i thought was wiped from the web years ago.
it proves that even if a site goes dark, the crawlers have already indexed the biometric data and linked it to your current identity. it made me realize that unless your data is on a cold-storage drive in your desk, it’s basically public metadata for any ai to find. is anyone here working on a local facial-recognition index for their own archives?
https://redd.it/1pyhaib
@r_DataHoarder
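On the last question, a local index doesn't need much: compute face encodings for every photo once, then compare against them entirely offline. A rough sketch, assuming the open-source `face_recognition` library (dlib-based) is acceptable; the archive path, index filename and tolerance are illustrative.

```python
# Sketch: build a local face index over a photo archive, then query it offline.
# Assumes the `face_recognition` library; paths and tolerance are illustrative.
import pickle
from pathlib import Path

import face_recognition

ARCHIVE = Path("~/photo_archive").expanduser()  # illustrative
INDEX_FILE = Path("face_index.pkl")             # illustrative

def build_index() -> None:
    """Store (photo path, 128-d face encoding) pairs for every detected face."""
    index = []
    for img_path in ARCHIVE.rglob("*.jpg"):
        image = face_recognition.load_image_file(img_path)
        for encoding in face_recognition.face_encodings(image):
            index.append((str(img_path), encoding))
    INDEX_FILE.write_bytes(pickle.dumps(index))

def query(sample_photo: str, tolerance: float = 0.6) -> list[str]:
    """Return archive photos whose faces are close to the face in sample_photo."""
    index = pickle.loads(INDEX_FILE.read_bytes())
    sample = face_recognition.face_encodings(face_recognition.load_image_file(sample_photo))
    if not sample:
        return []
    target = sample[0]
    hits = [path for path, enc in index
            if face_recognition.face_distance([enc], target)[0] <= tolerance]
    return sorted(set(hits))

if __name__ == "__main__":
    build_index()
    print(query("me.jpg"))
```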
Storage strategy
Hi guys,
A few years ago, I started to build a nice homelab for my own use that I wanted quiet as hell and as low-power as possible. I invested in a JCVD 12S4 case with 12 slots that I populated over time with 8TB SATA SSDs, and I've been using them with TrueNAS Scale (passed through to a VM via Proxmox and a dedicated HBA). It has made me very happy in every respect. Everything is backed up to a 2nd NAS with mechanical HDDs.
But yesterday I ordered the 12th SSD, meaning the enclosure is now full. Data has grown quickly since I opened my Plex server to my family and friends, as I wanted to please them with the content they ask for. Videos are basically 90% of my storage use.
Since I don't see 16TB SATA SSDs being sold at large scale, and no hint that they will be in the future, I'm asking myself how to continue adding storage to my homelab while keeping my initial quiet + low-power quest in sight (budget is less of a problem).
My future data strategy could take many paths:
- Invest in a 24-slot chassis, dedicate such a box to TrueNAS and continue hoarding until I get to the same point later. Basically, pushing the problem to later.
- Start to delete useless data and recover some free space. This would be a continuous job, exhausting and not as rewarding as expected.
- Begin to do some tiering with a dedicated slow/mechanical vdev for data that I nearly never access. In other words, expect such a mechanical disk to be powered off most of the time.
- As SATA might not be future-proof, start to migrate to M.2 storage on PCIe cards (i.e. 8x 8TB NVMe on one) and fill a server with such cards. This would be a radical move with a lot of possible problems (compatibility, heat, etc.).
Which route would you take?
https://redd.it/1pyh60w
@r_DataHoarder
Zero Loss Compress: Reduce Photo Library Size Without Data Loss!
https://apps.apple.com/us/app/zero-loss-compress/id6738362427
https://redd.it/1pyldnm
@r_DataHoarder
Is the WD Elements 10 TB Desktop External HDD a good choice for long term storage?
I've been looking for an HDD that prioritizes reliability and longevity. I wanna use it for storing lots of old mp4 files and photos. Currently I've been eyeing the WD Elements 10 TB Desktop External HDD, but I still want to hear the opinions of people with more knowledge on this topic.
I plan on getting 2, one for general use and one for backup.
Are there any better choices for long-term storage? I've looked into M-DISC Blu-ray but that seemed like too much trouble for what it's worth.
https://redd.it/1pyjn93
@r_DataHoarder
What enclosure for 3-5x 3.5" drives in a 10" rack?
Hi, I'm trying to build a backup NAS in a 10" rack to host at a secondary location. I need 50 TB of usable storage, so using 2.5" drives seems like an issue. I'm thinking about something like the Icy Dock FatCage MB155SP-B.
Has anyone had any success mounting this in a 10" rack directly or with a 3d printed enclosure?
Any other recommendations?
Thanks!!
https://redd.it/1pynqbd
@r_DataHoarder
Problem with Data Corruption.
I've been messing with getting sonarr/radarr up and running for the last month. I've just had some issues with data corruption that I don't know how to fix.
Right now I just have the one PC running all the arrs, with 2 hard drives (one as a backup) in a [Vantec Dual Bay Dock](https://www.canadacomputers.com/en/hdd-docking-stations/255906/vantec-jx-usb-3-2-gen1-dual-bay-dock-for-sata-drive-clone-function-nst-d258s3-bk.html). Now, we've had brownouts a handful of times in the last month because of snow storms. Every time this happens and the power goes out, a hard drive corrupts. Luckily it hasn't knocked out both, so I can restore it. I was about to send back one of the drives since I suspected it was the hard drive, but this morning the same thing happened with a new drive.
What can I do to stop this from happening? Is it because of the enclosure I'm using? Or is it because the arrs are usually in the middle of writing something which causes the corruption?
I'm at a loss.
https://redd.it/1pyptvu
@r_DataHoarder
Anyone else have products from orico or sharge?
I see the ads all the time; they're so misleading. They never say how much the actual product costs, let alone how much the storage is.
I have seen the ads for the tiny NVMe Sharge. Looks amazing, until you realise the 2-3TB NVMe is, at least for me, super expensive.
https://redd.it/1pynikn
@r_DataHoarder
Ohara: An open archive of verifiably timestamped video hashes
Hi everyone, I'd like to share a small project of mine that I thought, given that there have been discussions about the Internet Archive, some members of this community might appreciate. The main idea is to "label" videos that have not been AI manipulated in a trust-minimized way by timestamping them before massive AI edits become too cheap, which we're not far from. It's a way to protect historical videos against rewrites and thus manipulation. The project is an open archive of such timestamp proofs, which can be verified by anyone and contains proofs for a bit more than 2M Internet Archive identifiers that had the "movies" media type. The software also allows for checking which files were timestamped from a given identifier. It would be good if the archive replicas were spread around, so if you find 1GB of free disk space, consider cloning the repository. This can be done by visiting the page below and clicking on the green button "Code" and then "Download ZIP". I believe the proofs should stay open and available to anyone, and replicas are the best way to achieve this.
The details of the project are described in the project's README.md file.
GitHub: https://github.com/phyro/ohara
Hope you had a great 2025, and may 2026 be even better than 2025.
I'm including the project's motivation section below:
## Motivation
Creating a digital copy of a real-world signal is easy; we can read the writings on a stone from an ancient civilization and publish a copy on the web. But how can a reader know the copy is authentic? The problem lies in how cheap it is to edit that copy. Text is trivial to edit; we just open a file and type. We have to find a signal that's easy to copy, but harder to edit. Editing sound is quite a bit harder. Trying to edit a sound file such that from 3:47-4:09 Joe says something different is not an easy task. But it turns out that AI has become an efficient and cheap edit function, turning what was a strict 1-1 mapping between real-world sounds and digital captures into a 0-many relationship. A single digital sound "capture" can now have zero real-world equivalents and infinitely many variants in the digital world. Consequently, we lose the ability to tell which sound copy is real, if any at all.
Video remains the last widespread signal that's still hard to edit convincingly at a massive scale. Given the fast advancement of AI, we're likely just years away from cheap, indistinguishable video forgeries flooding the internet. For the first time in history, civilization will have to question the signal we see and hear that supposedly describes real-world events. Note that the (raw) signal being a lie is different from the interpretation of the signal data being a lie. The latter lies have a long history; it's only the former that's new to us. While some fakes will be obvious, countless others won't be.
### A world of false copies
The low cost of editing will not affect only new videos, but we'll also become unable to tell what videos from the past were the "correct" ones. Why would anyone flood the world with false copies of past data? To manipulate collective thinking, create knowledge asymmetry (only the forger knows what's original e.g. for AI training), or many other reasons we haven't yet imagined. Cheap edits enable history rewrites through modified videos.
Can we do something about it? Can the civilization of today point a finger at a video from today and say "This is the real one."? Perhaps a bit counterintuitively, the answer is that we can. We want to bring back a signal we can trust, but we don't want to assume trust in any particular individual. What if we proved a video existed before the cost of editing dropped low enough to fake it? For this we need a trustworthy timeline. Bitcoin fits this criterion since creating an event in its timeline requires immense energy, but more importantly, editing an event requires the same energy because we need a new, equally hard block. This makes history rewrites too energy-intensive to see them happen in practice.
We can use Bitcoin as a timestamping server to label original video data before we enter the era of cheap fakes. Not only does this show us and future generations which past videos were untampered, but it also preserves our ability to analyze them and reach correct (i.e. untampered) conclusions. A simple example is AI analyzing the murder of a celebrity from different unmodified video sources and finding lies in reporting due to new observations that the human eye/mind missed.
https://redd.it/1pyrice
@r_DataHoarder
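For readers who want the core idea in miniature: a timestamp proof commits to a hash of the video file, so the first step is always a streaming digest like the sketch below. This only illustrates that hashing step; Ohara's actual proof format and verification procedure are described in the project's README, and the filename here is illustrative.

```python
# Sketch: compute the digest a timestamp proof would commit to.
# Only the hashing step is shown; the proof/anchoring format belongs to the project.
import hashlib
from pathlib import Path

def sha256_file(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file in 1 MiB chunks so large videos don't need to fit in RAM."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    video = Path("example_video.mp4")  # illustrative filename
    print(video.name, sha256_file(video))
```

Anchoring such a digest in Bitcoin (for example through an aggregator like OpenTimestamps) is what makes the "existed before date X" claim verifiable without trusting any single party.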
WD_BLACK 2TB SN8100 NVMe SSD @ $803.50
I saw this at the Best Buy site today. Is this for real?
https://redd.it/1pyvad8
@r_DataHoarder
Where/how to get large amounts of YouTube video transcripts?
I need a very large number (~500k) of transcripts from YouTube videos. Most existing APIs I've found so far have very low batch size limits or they charge a lot. I wouldn't mind paying a bit of money, but obviously the price quickly gets very high when you have to pay a few cents for each transcript and you're requesting so many.
The official YouTube API does not have an endpoint for transcripts, and I got IP banned very quickly when I tried to scrape the transcripts.
Are any of you guys familiar with any possible solutions? It's for an NLP-related project.
https://redd.it/1pyxgiu
@r_DataHoarder
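One common route is yt-dlp, which can fetch uploader-provided or auto-generated subtitles without downloading the media. A minimal sketch using its Python API is below; the video IDs, output template and sleep interval are illustrative, and at ~500k videos the real obstacle remains rate limiting and IP bans rather than the mechanics.

```python
# Sketch: pull subtitles for a list of video IDs with yt-dlp, skipping the media.
# Shows the mechanics only; large-scale runs still need throttling/proxies or an
# official data source to avoid the IP bans mentioned above.
from yt_dlp import YoutubeDL

VIDEO_IDS = ["dQw4w9WgXcQ"]  # illustrative

ydl_opts = {
    "skip_download": True,        # transcripts only, no video/audio
    "writesubtitles": True,       # uploader-provided subtitles if present
    "writeautomaticsub": True,    # fall back to auto-generated captions
    "subtitleslangs": ["en"],
    "outtmpl": "transcripts/%(id)s.%(ext)s",
    "sleep_interval": 5,          # crude politeness delay between requests
}

with YoutubeDL(ydl_opts) as ydl:
    ydl.download([f"https://www.youtube.com/watch?v={vid}" for vid in VIDEO_IDS])
```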
External Drives or NAS?
My use case is a Plex server. I am running out of storage. I'm currently using my old desktop as storage, connected via SMB to a mini PC that runs the Plex server. Seagate still has their external drives on pretty good sale (~$11/TB for the 22TB and 24TB models). I would plan to buy 2 and connect one to my desktop and one to the mini PC, so that I can rip from CD/DVD using my desktop, then create a simultaneous copy to the drive connected to the mini PC.
The other option would be to buy recertified/refurbished SAS drives and build a purpose-built NAS. Obviously this would be more expensive. But would it be worth the extra time and expense?
The only near-future thing I might add is NVR for exterior surveillance cameras.
https://redd.it/1pywu6b
@r_DataHoarder
From history education to a question about current times
Many years ago, when a 6.4 GB Maxtor PATA drive was king of the hill in terms of price/GB, Maxtor had a tool called PowerMax for testing their (and Conner) drives in your own machine, including a factory recertification test that would basically do a full drive write and read to test all sectors and re-map defective ones.
I'm curious: does any hard disk manufacturer (or even a third party) have tools like this available for the end user to download and run?
https://redd.it/1pz39ko
@r_DataHoarder
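For context on what such a test actually did, here is a rough, destructive Python stand-in for the "write every sector, then read it back" pass; the device path and chunk size are illustrative, and the remapping of defective sectors is handled by the drive's firmware rather than by any script. On the non-destructive side, vendor tools such as Seagate's SeaTools and smartmontools' long self-test cover similar ground today.

```python
# Sketch: a rough, DESTRUCTIVE stand-in for a full "write all sectors, read back"
# surface test. Run only on a drive with no data you care about, as root, with the
# device path triple-checked.
DEVICE = "/dev/sdX"          # illustrative; verify before running
CHUNK = 4 * 1024 * 1024      # 4 MiB per pass

def full_write_read_verify(device: str) -> int:
    """Write a repeating pattern across the whole device, then read and compare."""
    pattern = bytes(range(256)) * (CHUNK // 256)
    size = 0
    mismatches = 0

    # Write pass: fill the device until it reports end-of-space.
    with open(device, "wb", buffering=0) as f:
        while True:
            try:
                written = f.write(pattern)
            except OSError:              # end of device or I/O error
                break
            size += written
            if written < len(pattern):   # partial write near the end
                break

    # Read/verify pass: data at offset o should be pattern[o % CHUNK].
    with open(device, "rb", buffering=0) as f:
        offset = 0
        doubled = pattern * 2            # handles short reads that cross a boundary
        while offset < size:
            chunk = f.read(min(CHUNK, size - offset))
            if not chunk:
                break
            start = offset % CHUNK
            if chunk != doubled[start:start + len(chunk)]:
                mismatches += 1
            offset += len(chunk)
    return mismatches

if __name__ == "__main__":
    print("mismatched chunks:", full_write_read_verify(DEVICE))
```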
Non raid nas for dummies
I'm looking for the cheapest/simplest way to get my bunch of externals into a NAS-like thing, if I were to shuck them.
It will only be used for Plex or seeding stuff. Any data that could be lost is easily redownloadable, and a long period of downtime doesn't matter.
I looked into RAID, and I don't need any performance boost from RAID 0, and pooling all the disks doesn't provide any meaningful benefit in my use case for the extra risk.
Pretty much, if I can access each drive on its own over the network, that's all I need.
Thanks!
https://redd.it/1pz580w
@r_DataHoarder
Seagate Expansion Desktop Hard Drive
I'm worried about AI hiking prices, so should I get the large-capacity Seagate Expansion now at its current price or wait until the next bigger sale?
https://redd.it/1pz51hi
@r_DataHoarder
NVMe's in a 5.25" enclosure - which to pick?
So I am rebuilding my NAS in a 1U case and I already picked and received some parts for it; mainly an Icy Dock 5.25" cage for SATA drives (4x 2.5"), and I have another 5.25" bay free to use - and in that, I want to put NVMe drives.
Icy Dock offers one solution that mounts M.2 SSDs and offers OcuLink in the rear, and another version that goes to MiniSAS (or something like it - it's one of those numbered SFF connectors; I am relatively new to those). On my board, I have an x16 slot I can bifurcate into 4x4 just fine.
Now, that IcyDock cage costs easily 500€ (ranges from 450-550 depending if I find it on Amazon.de or eBay.de) but I am a little surprised by the price; sure, adapting PCIe signals requires a lot of engineering, but compared to the 60€ I paid for the SATA cage, this seems... a little excessive.
Are there other solutions for this that are hopefully less expensive?
I want to mount 4 PCIe Gen3 or Gen4 SSDs (probably the former for price) into that cage and then RAID them together (either through BTRFS or mdadm). I found a neat 1U-compatible SFF-8654 card and even an SFF-8654 8i to 2x SFF-8654 4i cable, but I only added them to my wishlist so I could re-find them later on.
I also looked into m.2 to U.2 adapters and cages, but putting those together almost had me at the same price. Perhaps it's just that expensive to do what I would like to, but before I overspend on something that I could've done for less, I'd just like to reaffirm.
A little detail on the host itself: It's a Milk-V Pioneer that comes with one x16 and one x8 (physical x16) slot, five SATA ports, and will primarily run anything related to storage - it's my NAS, after all - and with its many cores, will also handle CI/CD using the Concourse CI system. So, for all that, it needs disks. So I was looking to build three storage tiers:
- Hot: NVMe based (four)
- Warm: SATA SSD based (two)
- Cold: SATA HDD based (two)
And I am just trying to find a good way to properly put together the "hot" tier. :)
Thanks and kind regards!
https://redd.it/1pz8um6
@r_DataHoarder
IDE HDD for backups?
So, I have this old Samsung R40 laptop sitting around; it has like 1 GB of RAM and a 200-ish GB HDD in IDE format. A couple of months ago I booted it up and it took me a whole hour just to get my hands on the files I needed, mind you this was months ago.
I'm gonna throw it away and put that old IDE drive into my rig, but is it really worth it? I planned on using it as a backup drive, nothing too heavy, might as well just download Wikipedia onto it 😂. What do you guys think?
https://redd.it/1pzckwi
@r_DataHoarder
So, i have this old samsung r40 laptop sitting around, it has like 1gb of ram and a 200ish gb hdd in ide format, couple of months ago i booted and it took me a whole hour just to get my hands on the files i needed, mind you this was months ago.
Im gonna throw it away and throw that old ide into my rig, is it really worth it? I planned on use it as a backup drive, nothing too heavy, might as well just download wikipedia on it 😂. What do you guys think?
https://redd.it/1pzckwi
@r_DataHoarder
Reddit
From the DataHoarder community on Reddit
Explore this post and more from the DataHoarder community