Bug #221

Investigate size of data_xfer table in customer database

Added by admin 9 days ago. Updated 9 days ago.

Status: New        Start: 2025-08-30
Priority: Normal   Due date: -
Assigned to: -     % Done: 0%
Category: -
Target version: -
Votes: 0

Description

The content of the data_xfer table in the customer database is about 760 MiB gzipped, which is huge compared to the entire rest of the database. It is backed up every day, producing a delta of more than half a gigabyte of backup data daily.

The data is of limited value and certainly not worth burning somewhere between 0.5 and 1 GiB of disk space per day to back up, so one option is simply to stop backing it up. However, the size could also indicate that purging of old records from this table stopped working at some point. This should be investigated.
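One way to check whether purging has stopped is to look at the age of the oldest row. The real data_xfer schema is unknown, so the sketch below uses an in-memory SQLite table with a hypothetical created_at column and a hypothetical 90-day retention window, purely to illustrate the query:

```python
import sqlite3
from datetime import datetime, timedelta

# Toy stand-in for the customer database; schema and column names are guesses.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE data_xfer (id INTEGER PRIMARY KEY, created_at TEXT)")
now = datetime(2025, 9, 1)
rows = [(i, (now - timedelta(days=i)).isoformat()) for i in range(400)]
db.executemany("INSERT INTO data_xfer VALUES (?, ?)", rows)

# If a purge job (say, deleting rows older than 90 days) were working,
# MIN(created_at) should never be much older than the retention cutoff.
oldest, count = db.execute(
    "SELECT MIN(created_at), COUNT(*) FROM data_xfer").fetchone()
age_days = (now - datetime.fromisoformat(oldest)).days
print(f"{count} rows, oldest is {age_days} days old")
if age_days > 90:
    print("oldest row exceeds the retention window; purging may have stopped")
```

The same MIN/COUNT query against the real table would show at a glance whether rows are accumulating past whatever retention period is supposed to apply.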

If this really is the amount of useful data, it may be worth storing it uncompressed. Despite the use of gzip's --rsyncable option, the compressed nature of this file is causing a lot of daily churn in the backups. Storing it uncompressed will make the next snapshot of the file very large, but each subsequent delta should be much smaller, and the backups themselves are stored compressed.
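The churn can be illustrated with plain zlib (which, unlike gzip, has no --rsyncable mode): a one-byte change in the input perturbs the Huffman coding and bit alignment of the deflate stream, so far more than one byte of the compressed output changes. This toy demo is mine, not part of the actual backup pipeline:

```python
import zlib

# Highly compressible input, ~1 MiB, with a single byte flipped near the start.
data = b"transfer-record," * 65536
mod = b"X" + data[1:]               # same length, one-byte difference

c1 = zlib.compress(data, 6)
c2 = zlib.compress(mod, 6)

# Count how many byte positions of the two compressed streams differ.
n = min(len(c1), len(c2))
diff = sum(a != b for a, b in zip(c1, c2)) + abs(len(c1) - len(c2))
print(f"compressed size ~{len(c1)} bytes; {diff} compressed bytes differ")
```

A delta-based backup sees all those changed bytes as new data each day; with the file stored uncompressed, only the genuinely changed rows would appear in the delta.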

Until I find time to investigate all of this, I'm going to disable backups of this table's content.

History

Updated by admin 9 days ago

  • Subject changed from Investigate size of @data_xfer@ table in customer database to Investigate size of data_xfer table in customer database
