
I have a local MySQL database called intranet_production, and on a remote server (to which I have root access, both on the server and in MySQL) there is a database called extranet.

Within the remote extranet database are dozens of tables. I only need around 5 of them, so importing the entire database seems excessive.

I am looking for the best way of importing those tables; once the import is working I will set it up as a cron job to run every 24 hours. For now, the remote data can simply overwrite the local copies, as we aren't writing anything to the imported tables locally.

So, given that the local database is called intranet_production and the remote database is called extranet, what's the best way to import tables called table1, table2 and table3 as an example?

I've come across the following method, but I believe that has to be run on the remote server to export the table. I need to run the cron job on the local server.

mysqldump db-name foo | ssh [email protected] mysql bar 

Any pointers would be appreciated!

2 Answers


You can use mysqldump with remote hosts:

mysqldump -h database.server.com -u username -pPassword dbName tblName > local.file.sql 
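mysqldump accepts several table names after the database name, so the handful of tables you need can be dumped in one pass. A minimal sketch, with a placeholder host and credentials:

mysqldump -h remote.example.com -u username -pPassword extranet table1 table2 table3 > extranet_tables.sql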

Instead of redirecting to a file, you can just pipe it to mysql:

mysqldump <args> | mysql -u uname -pPwd localDbName 
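Mapped onto the databases in the question, the piped form might look like this (the remote host name and both sets of credentials are placeholders):

mysqldump -h remote.example.com -u remote_user -pRemotePwd extranet table1 table2 table3 | mysql -u local_user -pLocalPwd intranet_production

Run from the local server via cron, this pulls the tables and rebuilds them locally in one step; mysqldump emits DROP TABLE/CREATE TABLE statements by default, so the local copies are simply overwritten on each run.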

You can write a small script that does the following:

ssh [email protected] "mysqldump db table1 table2 > /tmp/your_dump.sql"
rsync [email protected]:/tmp/your_dump.sql /tmp/
mysql db < /tmp/your_dump.sql

Optionally, you can compress the file during transfer for slow links.
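A cron-ready sketch that combines those steps and compresses the stream over the link; the host name, login, and the assumption that MySQL credentials come from ~/.my.cnf on both machines are placeholders, not part of the answer:

#!/bin/sh
# Dump the wanted tables on the remote host, gzip the stream over SSH,
# and load it straight into the local database.
# Assumes MySQL credentials are supplied via ~/.my.cnf on both machines.

REMOTE="admin@remote.example.com"   # placeholder remote login
REMOTE_DB="extranet"
LOCAL_DB="intranet_production"
TABLES="table1 table2 table3"

ssh "$REMOTE" "mysqldump $REMOTE_DB $TABLES | gzip" | gunzip | mysql "$LOCAL_DB"

Saved somewhere like /usr/local/bin and made executable, it can then be called from a daily crontab entry.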
