Mailing List Archive



[tlug] Seeking recommendations for file consolidation



TLUG,

I have a whole bunch of backup CD-ROMs from the last few years, mostly from my days as a Windows user.

There are so many of them now that I don't need them all, and I know most of the files are duplicates anyway, not to mention a lot of junk that simply isn't needed anymore.

But I can't quite bring myself to toss them out, for fear that some of them contain data that might be unique.

Now that my current computer has many gigabytes of free space, I'm copying the contents of all the CD-ROMs to a directory on my hard drive. Each CD-ROM's contents go into its own subdirectory to prevent accidental overwriting.
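The copying step looks roughly like this. (A sketch only: temporary directories stand in for the mounted disc and the backup directory, and the "disc-01" name is just an example; on a real system the source would be wherever the drive mounts, e.g. /media/cdrom.)

```shell
# Stand-ins so the sketch is self-contained:
src=$(mktemp -d)    # pretend this is the mounted CD, e.g. /media/cdrom
dest=$(mktemp -d)   # pretend this is the consolidation directory

echo "hello" > "$src/readme.txt"   # a pretend file on the disc

# Give this disc its own subdirectory so nothing from another
# disc can overwrite its files, then copy everything (-a keeps
# timestamps, which matters for keeping the most recent versions).
mkdir -p "$dest/disc-01"
cp -a "$src/." "$dest/disc-01/"
```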

Once all the data is in one place, I hope to find a way to weed out the duplicates and be left with a single set of just the most recent versions of the unique files.
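For instance, I imagine duplicates could be spotted by checksumming every file and grouping repeated digests. (A sketch with throwaway demo files standing in for the consolidated directory; it assumes GNU uniq, whose -w and --all-repeated options group lines by their first 32 characters, the length of an MD5 digest.)

```shell
# Demo data: three files, two of which have identical contents.
dir=$(mktemp -d)
echo "alpha" > "$dir/a.txt"
echo "alpha" > "$dir/b.txt"   # duplicate of a.txt
echo "beta"  > "$dir/c.txt"

# Hash every file, sort so identical digests are adjacent, then
# print only the groups whose digest (first 32 chars) repeats.
find "$dir" -type f -exec md5sum {} + \
    | sort \
    | uniq -w32 --all-repeated=separate
```

Only a.txt and b.txt should be listed, since c.txt's contents are unique.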

I know there's the command-line diff utility, but my understanding is that it compares two individual files and reports how they differ. That seems different from the more global comparison across multiple directories that I'm looking for.

I also downloaded and ran Kompare. Its web site says it can recursively compare subdirectories, but I can't find any such feature in the interface.

Beyond that, I haven't had much luck in discerning an appropriate application for this task.

Can anyone recommend something suitable?

Thank you for your support and advice.

--
Dave M G


