Need some advice on how to proceed with this situation: my current setup is that I have several PostgreSQL boxes (50+) deployed across various locations and data centers, plus a beefy PostgreSQL box at a homebase location. All of the deployed boxes have identical database layouts. I'm looking for a solution that would allow for a few things. I realize some of these options overlap and some may call for mutually exclusive solutions, but I'm interested to hear your thoughts :)
- Remotely query the deployed boxes and pull the results back to the homebase box for processing (roughly the first sketch after this list).
- Nightly (remote) "sync" or dump of the deployed boxes' databases into a master database on the homebase box. Although each remote box has the same table layout, the actual data differs across the board, so being able to dump all of it remotely into the master DB would be ideal (second sketch below).
- Remotely push a table entry from the homebase box out to all of the deployed boxes (third sketch below).
- Ensure the security of data in transit and of the remotely deployed boxes themselves.
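
For concreteness, here's roughly what I mean by the first item. This is only an illustrative sketch, not what I actually run: the hostnames, database name, user, and query are placeholders, I'm assuming psycopg2 on the homebase box, and `sslmode="require"` is just one way to cover the in-transit part of the last item.

```python
# Illustrative sketch: run the same query on every deployed box and collect
# the rows on the homebase box. Hostnames, credentials, and schema are
# placeholders.
import psycopg2

DEPLOYED_HOSTS = ["box01.example.net", "box02.example.net"]  # 50+ in reality

def pull_results(query):
    all_rows = []
    for host in DEPLOYED_HOSTS:
        # sslmode="require" keeps the data encrypted in transit
        conn = psycopg2.connect(host=host, dbname="appdb", user="readonly",
                                sslmode="require")
        try:
            with conn.cursor() as cur:
                cur.execute(query)
                all_rows.extend((host,) + tuple(row) for row in cur.fetchall())
        finally:
            conn.close()
    return all_rows
```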
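
The nightly pull (second item) might look something like this from the homebase side. Again just a sketch under made-up assumptions (pg_dump available locally, placeholder hostnames, database name, and paths), and it only lands the data on the homebase box rather than merging it into the master DB:

```python
# Illustrative sketch: dump each remote database over an SSL connection and
# keep one custom-format archive per box per night on the homebase machine.
import datetime
import os
import subprocess

DEPLOYED_HOSTS = ["box01.example.net", "box02.example.net"]
DUMP_DIR = "/var/backups/remote-dumps"

def nightly_dump():
    stamp = datetime.date.today().isoformat()
    os.makedirs(DUMP_DIR, exist_ok=True)
    env = dict(os.environ, PGSSLMODE="require")  # encrypt in transit
    for host in DEPLOYED_HOSTS:
        out = os.path.join(DUMP_DIR, f"{host}-{stamp}.dump")
        subprocess.run(
            ["pg_dump", "-h", host, "-U", "backup", "-Fc", "-f", out, "appdb"],
            check=True, env=env,
        )
```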
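
And the third item, pushing a single row out to every box, is basically the same loop in reverse; the table and columns here are made up:

```python
# Illustrative sketch: insert one row into the same table on every deployed
# box. Table name, columns, and credentials are placeholders.
import psycopg2

DEPLOYED_HOSTS = ["box01.example.net", "box02.example.net"]

def push_row(name, value):
    for host in DEPLOYED_HOSTS:
        conn = psycopg2.connect(host=host, dbname="appdb", user="pusher",
                                sslmode="require")
        try:
            with conn, conn.cursor() as cur:  # commits on clean exit
                cur.execute("INSERT INTO settings (name, value) VALUES (%s, %s)",
                            (name, value))
        finally:
            conn.close()
```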
Up to this point I've been getting by with a homebrew multithreaded Python/Perl system that SSHes into these boxes remotely (they're ACL'ed off to the homebase server) and pulls (or pushes) the raw query results over the SSH connection. I haven't even touched the second item (remote syncing), as I know that would get nasty really quick. I'm interested in any ideas for a more elegant solution that can scale up and fits my FreeBSD/Linux environment.
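
For reference, a stripped-down, single-threaded sketch of what that homebrew system boils down to (hostnames, database, and query are placeholders):

```python
# Simplified sketch of the current approach: SSH into each box and run the
# query through psql, collecting the raw output per host.
import shlex
import subprocess

DEPLOYED_HOSTS = ["box01.example.net", "box02.example.net"]

def ssh_pull(query):
    results = {}
    remote_cmd = "psql -d appdb -At -c " + shlex.quote(query)
    for host in DEPLOYED_HOSTS:
        proc = subprocess.run(
            ["ssh", f"dbuser@{host}", remote_cmd],
            capture_output=True, text=True, check=True,
        )
        results[host] = proc.stdout.splitlines()
    return results
```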