I have a table that is approaching 2 million records. This table stores a history of transactions. It sits on a high-traffic website, but the table itself is not accessed often. We currently have no slow queries caused by this table, but I am trying to anticipate when I will need to migrate its data to an archival method.
The server is an Amazon EC2 High-CPU Medium instance:
High-CPU Medium Instance
1.7 GB of memory
5 EC2 Compute Units (2 virtual cores with 2.5 EC2 Compute Units each)
350 GB of instance storage
32-bit platform
I/O Performance: Moderate
API name: c1.medium

SHOW TABLE STATUS for the table reports:

    Engine: InnoDB
    Version: 10
    Row_format: Compact
    Rows: 1677878
    Avg_row_length: 71
    Data_length: 120209408
    Max_data_length: 0
    Index_length: 246497280
    Data_free: 0
    Auto_increment: 1914179
    Create_time: 2011-08-07 20:15:29
    Update_time: NULL
    Check_time: NULL
    Collation: latin1_swedish_ci
    Checksum: NULL
    Create_options:
    Comment: InnoDB free: 7168 kB

A few questions:
At what point might I start seeing performance degrade due to the number of records in this table?
What settings should I be checking on the MySQL server (via SHOW ENGINE INNODB STATUS) and on the table itself (e.g., row format) to ensure I am getting the most performance out of it as I need it? A rough sketch of the checks I have in mind is below.
What metrics should I be gathering to track performance over time? A second sketch of the counters I would sample follows the first.
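For question 2, this is roughly what I am planning to run. It is only a sketch: the table name transactions is a placeholder for the real table, and the right thresholds would depend on the workload.

    -- Buffer pool sizing: Data_length + Index_length above is about 350 MB,
    -- so the whole table plus its indexes fits in memory only if the buffer
    -- pool is larger than that.
    SHOW VARIABLES LIKE 'innodb_buffer_pool_size';

    -- Full engine status; the BUFFER POOL AND MEMORY and ROW OPERATIONS
    -- sections show hit rates and row read/insert activity.
    SHOW ENGINE INNODB STATUS;

    -- Table-level settings: row format, collation, and defined indexes.
    -- 'transactions' is a placeholder name.
    SHOW TABLE STATUS LIKE 'transactions';
    SHOW CREATE TABLE transactions;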
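For question 3, these are standard SHOW GLOBAL STATUS counters I would sample on a schedule (say, every minute) and graph as deltas:

    -- Buffer pool misses vs. logical read requests.
    SHOW GLOBAL STATUS LIKE 'Innodb_buffer_pool_read%';

    -- Overall row activity and queries exceeding long_query_time.
    SHOW GLOBAL STATUS LIKE 'Innodb_rows_%';
    SHOW GLOBAL STATUS LIKE 'Slow_queries';

My assumption is that a rising ratio of Innodb_buffer_pool_reads (physical reads) to Innodb_buffer_pool_read_requests as the table grows would be an early sign that the working set no longer fits in memory, which is probably the right trigger for archiving.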