I just finished rebuilding my home server, and it got me thinking: are there any data integrity scenarios I'm missing? All the files I care about live on the server, with Windows Offline Files mirroring them to the machines that need them. The server drives are mirrored, and so is the server backup drive.

I was recently reading at work about the non-zero occurrence of memory soft errors. Hilariously, it even happens on hosted compute platforms like Azure or EC2; they apparently use commodity hardware rather than expensive ECC memory, so it's surprisingly common. Similarly, the TCP checksum is rather weak; I recall Blizzard discussing how a terrifying number of users couldn't download World of Warcraft and get it to work simply because something kept corrupting packets in transit.

Hmm, what about hard drive bit rot? I haven't thought about it in years, but I saw it on hard disks multiple times in the 1990s: files would just mysteriously become corrupted even though no one had touched them. Alternatively it could have been firmware bugs or something, but the effect is the same.

I'm sure there are high-end server solutions for this, but what do you do on a commodity Windows PC? Is there software out there that checksums files as an automated detection solution?