Conversation

davidkopp (Contributor)

There was already an import_backup.sh script in the folder. To be able to quickly create database backups and import them, I think it would make sense to also have a create_backup.sh script.

ArneTR (Member) commented Sep 17, 2025

Having a backup script like this is certainly nice.

However, in this form the implementation will only be of limited use in production.

At the moment we are running backups on our cluster daily, but we export per table.

The issue is that some SQL files become really large; at some point we had 80 GB text blobs, which were impossible to handle.
Can you change the script to create one file per table? This also means you have to build a table list beforehand, roughly as sketched below.
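
A per-table export could look roughly like this (a minimal sketch, assuming a PostgreSQL database reachable with psql and pg_dump; DB_NAME and BACKUP_DIR are illustrative placeholders, not names from this repo):

```bash
#!/usr/bin/env bash
set -euo pipefail

# Sketch only: connection details and paths are assumptions.
DB_NAME="${DB_NAME:-mydb}"
BACKUP_DIR="${BACKUP_DIR:-./backup}"
mkdir -p "$BACKUP_DIR"

# Build the table list first, then dump each table into its own file.
TABLES=$(psql -d "$DB_NAME" -At \
  -c "SELECT tablename FROM pg_tables WHERE schemaname = 'public';")

for table in $TABLES; do
    pg_dump -d "$DB_NAME" --table="public.$table" \
        --file="$BACKUP_DIR/$table.sql"
done
```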

davidkopp (Contributor, Author)

I enhanced the scripts to also support a table-based backup and import. The import script still supports the single-file import as before.

ArneTR (Member) commented Sep 17, 2025

I am not a fan of this rework.

This script has now become absurdly complex for such a small piece of functionality. I do not want to maintain such a big script.

Please reduce the functionality to a super simple variant:

  • Import all files from a directory
  • Export all tables into separate sql files

If the user wants to import only part of a backup, they can move files around in the filesystem or delete them; no backup manifest etc. is needed. The import side then boils down to something like the sketch below.
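
For illustration, the import side of that simple variant could be little more than a loop (a sketch under the same assumptions as above; DB_NAME and BACKUP_DIR are placeholders):

```bash
#!/usr/bin/env bash
set -euo pipefail

# Sketch only: DB name and directory are assumptions.
DB_NAME="${DB_NAME:-mydb}"
BACKUP_DIR="${BACKUP_DIR:-./backup}"

# Import every .sql file found in the backup directory.
# A partial restore works by simply removing unwanted files first.
for file in "$BACKUP_DIR"/*.sql; do
    psql -d "$DB_NAME" -v ON_ERROR_STOP=1 --file="$file"
done
```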

=> Please make this a task to be tackled after the carbon-aware testing platform. This is very low priority.
