We have eight production servers. All do basically the same thing, but the loads/jobs are split based on customer location or some other business reason.
Each box has several 'root level' directories - one contains Perl scripts, another shell scripts, another Tcl, another crosswalk tables, etc.
We have a problem where a developer will promote a new version of a script up to the root-level directory on servers A, B, and C, but not D-H. Then someone else makes a change, and that gets promoted to B, G, and H, etc. All promotions are done by hand - i.e. someone going to a command line and typing "scp fileA serverB:/path/to/wherever/."
This, obviously, had consequences.
I don't want to automatically CHANGE anything - i.e. I don't want to use something like rsync to copy files from A to B. But is there a good way to get a report of what might be different across all our boxes?
I also don't care WHAT the differences are, I just want to know that there ARE differences.
The only thing I can think of is a Perl/Tcl/shell script on each box that dumps a listing of files to a common location, then something that compares them all. I don't think different dates matter... but a different size would be a red flag - something like the sketch below.
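Here's a minimal sketch of that idea in shell, assuming Linux boxes with md5sum available. The directory list, the report host name ("reporthost"), and the "serverA" baseline are all placeholders - substitute your own. Run this on each box:

    #!/bin/sh
    # manifest.sh - dump a checksum manifest of the root-level dirs.
    # The directory paths and 'reporthost' below are placeholders.
    HOST=$(hostname)
    find /root/perl /root/shell /root/tcl /root/tables -type f -exec md5sum {} + \
        | sort -k 2 > "/tmp/manifest.$HOST.txt"
    scp "/tmp/manifest.$HOST.txt" reporthost:/var/tmp/manifests/

Then, on the box collecting the reports, compare everything against one server:

    #!/bin/sh
    # compare.sh - report WHICH boxes differ, not what differs.
    cd /var/tmp/manifests || exit 1
    for m in manifest.*.txt; do
        # diff -q only prints "Files ... differ" and is silent on a match
        diff -q manifest.serverA.txt "$m"
    done

Checksums rather than sizes also catch the nasty case where two versions of a script happen to be the same length, and diff -q matches the "I just want to know there ARE differences" requirement; drop the -q when you want to see which individual files are out of step.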
There are only two hard things in computer science: cache invalidation, naming things, and off-by-one errors