Creating the take home database
This page is really only for Scott and Dror. It documents the procedure for creating RDF dumps from the Knot Atlas. If you're only interested in the actual data, go to The Take Home Database.
Quick start
- Go to /www/html/data/
- As root, run ./create-rdf.sh (a condensed shell version of these steps is sketched below)
- Get a coffee, or perhaps go to sleep, travel to a conference, etc.
- Eventually, check that http://katlas.org/data/katlas.rdf.gz has been created.
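For convenience, the same sequence as a single shell session. This is only a sketch: the documented procedure just says to run the script as root, so the use of sudo and the curl check are assumptions.

  cd /www/html/data/
  sudo ./create-rdf.sh                              # or run it from a root shell
  # ...come back much later; the dump takes a long time...
  curl -I http://katlas.org/data/katlas.rdf.gz      # confirm the gzipped dump now exists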
What does this do?
XML dump
First, we dump the current version of every page in the Knot Atlas to XML, using the /w/maintenance/dumpBackup.php script from MediaWiki. This produces the file katlas.xml.
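The create-rdf.sh script itself isn't reproduced on this page, but a typical dumpBackup.php invocation for a current-revisions-only dump looks like the following; the exact flags used here are an assumption.

  cd /w/maintenance/
  php dumpBackup.php --current > /www/html/data/katlas.xml   # dump only the latest revision of each page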
Convert to RDF
This is actually a two-step process. First, we create RDF statements in an on-disk RDF repository. Second, we dump these statements back to the file katlas.rdf.
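The page doesn't name the conversion tool, so the following is only a schematic sketch of the two steps, with hypothetical script names standing in for whatever create-rdf.sh actually calls.

  # Hypothetical commands; the real tool is whatever create-rdf.sh invokes.
  ./xml-to-rdf katlas.xml --store ./rdf-store    # step 1: assert RDF statements into an on-disk repository
  ./rdf-dump ./rdf-store > katlas.rdf            # step 2: serialize the repository back out to a single file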
Cleaning up
The file katlas.xml gets deleted, as it's not so useful, and the file katlas.rdf gets gzipped to katlas.rdf.gz. It's then available at http://katlas.org/data/katlas.rdf.gz.
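In shell terms, the cleanup amounts to something like this (assumed commands; again, create-rdf.sh itself is not shown on this page):

  rm katlas.xml        # the raw XML dump isn't kept
  gzip -f katlas.rdf   # leaves katlas.rdf.gz in the web-served data directory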