
Applies to: Windows Server 2012 R2, Windows 10 - all editions
Original KB number: 159214

Summary

This article describes how to use the Windiff.exe utility, a tool that graphically compares the contents of two ASCII files, or the contents of two folders that contain ASCII files, to verify whether they are the same. Sometimes you may experience unusual program behavior and suspect that a file is damaged, or you may suspect that two copies of a file with the same byte count but different dates are different. The file byte count and the creation date are not reliable indications. If a file is suspect, the typical solution is to recopy it from a known good file; therefore, you want to make sure that the two files really are the same. Recopying may solve the problem, but it prevents you from knowing whether the original file was damaged. It can be important to determine this, as file damage can indicate an underlying network or system problem.

With this and your other thread (which are both really the same question, I think?), it seems like you are layering on more and more tools and complexity in an attempt to work around a fairly ridiculous limitation of the "database" tool you are using. It would make a lot more sense to replace that one tool with one that understands folders than to build a Rube Goldberg machine to auto-sync folders into a flat list.
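Not part of the original article, but the "are these two files really identical?" check that Windiff performs on file contents can also be scripted. A minimal sketch that compares two files by SHA-256 digest instead of relying on byte count or date (function name and chunk size are my own choices):

```python
import hashlib

def files_identical(path_a: str, path_b: str, chunk_size: int = 65536) -> bool:
    """Compare two files by content hash, not by byte count or timestamp."""
    digests = []
    for path in (path_a, path_b):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            # Read in chunks so large files don't have to fit in memory.
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        digests.append(h.hexdigest())
    return digests[0] == digests[1]
```

Unlike a date or size check, a mismatched digest proves the contents differ, which is exactly the question when you suspect a damaged copy.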


What would adding a TeraCopy copy-and-verify to an intermediate folder get you, other than more complexity? You'd just be duplicating your source folders and then synching the duplicate to your destination, which doesn't seem to get you anywhere. Synching the copies is the same job as synching the originals, and that's the part that's problematic. There's no underlying "basic compare command"; directory-aware comparison is just the way people want synching to work in almost all cases, and the only way synching can work in the general case, where two files may have the same name in different directories and thus cannot be resolved to a single flat destination.
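The name-collision problem above is easy to demonstrate. A minimal sketch (the example paths are hypothetical) that lists every basename that would collide if a source tree were flattened into one destination folder:

```python
from collections import Counter
from pathlib import PurePosixPath

def flatten_collisions(relative_paths):
    """Return basenames that would collide if all files were copied
    into a single flat destination folder."""
    names = Counter(PurePosixPath(p).name for p in relative_paths)
    return sorted(name for name, count in names.items() if count > 1)

# Hypothetical source tree: two different notes.txt files in different folders.
example = ["projects/a/notes.txt", "projects/b/notes.txt", "projects/a/todo.txt"]
print(flatten_collisions(example))  # ['notes.txt']
```

Any flattening tool has to either rename, overwrite, or skip one of the colliding files, which is why general-purpose sync tools keep the directory structure.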
