

Tips for large databases

181 bytes removed, 07:43, 12 October 2012
==Loading the file==
Initial import of a large (100,000+ person) database from either Gramps XML or GEDCOM format is tiresome and can take a few hours. You will need to adjust the number of allowable locks. For 140,000 people you should use ''max_locks'' 300000 and ''max_objects'' 300000.

The easiest way to do this is to create an empty database, then add a file '''DB_CONFIG''' to the database directory before importing (see the '''gramps -l''' output for the directory of a specific family tree). The contents of this DB_CONFIG file should be:

 # may want to fiddle with cachesize also
 # set_cachesize 0 200000000 2
 set_lk_max_locks 300000
 set_lk_max_objects 300000
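As a sketch, the file can be created from the command line. The directory path below is a placeholder, not a real Gramps path; substitute the family tree directory printed by '''gramps -l''':

```shell
# Create a DB_CONFIG raising the Berkeley DB lock limits
# before importing the large file.
TREE_DIR="${TMPDIR:-/tmp}/grampsdb/example-tree"   # hypothetical; use the directory from `gramps -l`
mkdir -p "$TREE_DIR"

cat > "$TREE_DIR/DB_CONFIG" <<'EOF'
# may want to fiddle with cachesize also
# set_cachesize 0 200000000 2
set_lk_max_locks 300000
set_lk_max_objects 300000
EOF
```

The settings take effect the next time the database environment is opened, so the file must be in place before the import starts.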
{{-}}
