Hi all, does anyone know how to optimize a SELECT query when the daily MySQL data volume is more than 10 million rows?
I am selecting 8 columns to fetch the last 3 months of data with pagination, returning at most 100 rows per page.
Can you elaborate on what you are trying to fetch? There are many ways to fetch data from tables. For example, if you have an e-commerce store, try fetching data by product brand, or, to be more precise, go with the product ID.
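A minimal sketch of that idea, assuming a hypothetical `orders` table with an index on `(product_id, created_at)` (the table and column names are illustrative, not from the original question):

```sql
-- Hypothetical schema: orders(id, product_id, created_at, ...).
-- A composite index on (product_id, created_at) lets MySQL narrow the scan
-- to one product before applying the date range and pagination.
SELECT id, product_id, created_at
FROM orders
WHERE product_id = 12345
  AND created_at >= NOW() - INTERVAL 3 MONTH
ORDER BY created_at DESC
LIMIT 100;
```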
Run this in the mysql command-line client (not psql, which is the PostgreSQL client):
EXPLAIN [your SELECT query]
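For example, against a hypothetical query shaped like the one in the question (8 columns, last 3 months, pages of 100; the table and column names are assumptions):

```sql
-- EXPLAIN shows the chosen access plan without running the full query.
EXPLAIN
SELECT col1, col2, col3, col4, col5, col6, col7, col8
FROM big_table
WHERE created_at >= NOW() - INTERVAL 3 MONTH
ORDER BY created_at
LIMIT 100;
-- In the output, check the `key` column (which index, if any, is used)
-- and the `rows` column (roughly how many rows MySQL expects to examine).
-- If `key` is NULL, the query is doing a full table scan.
```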
Here are some InnoDB examples that work on large tables of approximately 60 to 500 million rows. They demonstrate the advantages of a well-designed InnoDB table and how best to use clustered indexes (only available with InnoDB):
- 60 million entries, select entries from a certain month. How to optimize database?
- Rewriting mysql select to reduce time and writing tmp to disk
You will also want to read the following:
- http://dev.mysql.com/doc/refman/5.0/en/innodb-index-types.html
- http://www.xaprb.com/blog/2006/07/04/how-to-exploit-mysql-index-optimizations/
Once you've sorted out your table designs, optimise your InnoDB config:
- http://www.mysqlperformanceblog.com/2006/09/29/what-to-tune-in-mysql-server-after-installation/
- http://www.mysqlperformanceblog.com/2007/11/03/choosing-innodb_buffer_pool_size/
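To make the clustered-index advice concrete for the workload in the question (last 3 months, pages of 100), here is a sketch. The `events` table and its columns are hypothetical, and the second query uses keyset (seek) pagination, a common alternative to large OFFSET values that is not from the links above:

```sql
-- With InnoDB, the PRIMARY KEY is the clustered index, so rows are stored
-- in (created_at, id) order and a 3-month range scan reads contiguous pages.
CREATE TABLE events (
  id BIGINT NOT NULL AUTO_INCREMENT,
  created_at DATETIME NOT NULL,
  -- ... the 8 columns being selected ...
  payload VARCHAR(255),
  PRIMARY KEY (created_at, id),
  UNIQUE KEY (id)
) ENGINE=InnoDB;

-- Keyset pagination: instead of OFFSET, seek past the last row of the
-- previous page, so deep pages stay as cheap as the first page.
SELECT created_at, id, payload
FROM events
WHERE created_at >= NOW() - INTERVAL 3 MONTH
  AND (created_at, id) > ('2024-01-15 00:00:00', 987654)  -- last row of previous page
ORDER BY created_at, id
LIMIT 100;
```

With OFFSET, MySQL must still read and discard all the skipped rows, which is why deep pages get slower; the seek predicate avoids that entirely.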
You can try something like:
start transaction;
-- copy rows in sorted order so the target table is populated sequentially
insert into target_table (x,y) select x,y from source_table order by x,y;
commit;
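This helps most when the ORDER BY matches the target table's primary key, since InnoDB clusters rows by primary key. A sketch of a matching target table (the table and column names mirror the snippet above but the definition itself is an assumption):

```sql
-- Hypothetical target: the primary key (x, y) is the clustered index,
-- so inserting rows already sorted by (x, y) appends to the end of the
-- index instead of splitting pages at random positions.
CREATE TABLE target_table (
  x INT NOT NULL,
  y INT NOT NULL,
  PRIMARY KEY (x, y)
) ENGINE=InnoDB;
```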
Hope this helps.
Thank you, I will go through it.