Large MySQL Databases Index Creation

    Posted on June 29th, 2009 by admin

    Hello,

    Maybe some of you guys know that on the gwebtools.com website we index millions and millions of websites every day. It's a really CPU- and memory-heavy process, and it takes some time.

    For the name server spy tool, which we update monthly, we need to import big files into our database every month: more than 10GB of domain and name server data.

    We have tested a lot of ways to speed this process up as much as we can. When a visitor types a name server address into the NS spy tool, we have to search a table with 70,000,000 rows, and a B-tree index is necessary to make that search fast. Another tool is the domain list, where we separate all domains by prefix.
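    As an illustration, this is roughly what that lookup looks like. The nameservers table, its columns, and the example address are invented for this post, not our real schema:

        -- Invented schema: one row per (domain, name server) pair
        CREATE TABLE nameservers (
            domain VARCHAR(255) NOT NULL,
            ns     VARCHAR(255) NOT NULL
        );

        -- B-tree index on the name server column; without it, MySQL
        -- would scan all 70,000,000 rows for every lookup
        CREATE INDEX idx_ns ON nameservers (ns);

        -- What the NS spy tool runs when a visitor types an address
        SELECT domain FROM nameservers WHERE ns = 'ns1.example.com';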

    If you create a table with just the columns and no indexes, inserting rows is really fast; but if you create the table with indexes, every insert also has to update those indexes, and the process gets slow.

    So our suggestion: if you have big tables and big data files to import into them, create your indexes after inserting all the data. This is what we do; we load everything first and build all the indexes at the end. It still takes a lot of hours, but overall it is much faster.
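    Here is a minimal sketch of that workflow, reusing the invented nameservers table from above; the file path and field delimiters are also made up:

        -- 1. Create the table with its columns but no indexes
        CREATE TABLE nameservers (
            domain VARCHAR(255) NOT NULL,
            ns     VARCHAR(255) NOT NULL
        );

        -- 2. Bulk-load the big data file; inserts are fast because
        --    there are no indexes to maintain row by row
        LOAD DATA INFILE '/data/ns_dump.txt'
        INTO TABLE nameservers
        FIELDS TERMINATED BY '\t'
        LINES TERMINATED BY '\n';

        -- 3. Build the B-tree index once, over the finished table;
        --    this still takes hours, but far less than updating the
        --    index on every single insert would have
        CREATE INDEX idx_ns ON nameservers (ns);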

    Just a suggestion from someone who has experience with large databases.