What is the best way to load a massive dataset (YAGO) into Stardog?

YAGO is 170 GB across 110 .ttl files. The console obviously doesn't handle that well.

The best way is to copy the files to the server and use the CLI command stardog-admin db create .... We also recommend zipping up the files to reduce I/O. The most important thing is to set memory.mode=bulk_load in the stardog.properties file in $STARDOG_HOME and to give the server enough memory. Some memory-sizing guidelines can be found in the Stardog documentation, but for YAGO it's probably better to go at least 1.5x above that.
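For concreteness, here is a minimal sketch of that workflow. The database name (yago), the file paths, and the memory sizes are illustrative assumptions, not values from this thread; tune them to your machine:

# In $STARDOG_HOME/stardog.properties, enable the bulk-load memory mode:
memory.mode=bulk_load

# Give the server generous heap and off-heap memory before starting it
# (the sizes below are placeholders):
export STARDOG_SERVER_JAVA_ARGS="-Xms8g -Xmx8g -XX:MaxDirectMemorySize=16g"
stardog-admin server start

# Gzip the Turtle files to reduce I/O; Stardog reads compressed RDF directly:
gzip /data/yago/*.ttl

# Create the database and bulk load all the files in one step:
stardog-admin db create -n yago /data/yago/*.ttl.gz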

Best,
Pavel

PS. Remove the memory.mode parameter after bulk loading and restart the server before resuming normal operations on the database.
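One possible cleanup sequence (the sed one-liner is just an illustration; editing the file by hand works equally well):

# Stop the server once bulk loading has finished:
stardog-admin server stop

# Remove the bulk-load setting from stardog.properties:
sed -i '/^memory.mode=bulk_load/d' $STARDOG_HOME/stardog.properties

# Restart the server for normal operations:
stardog-admin server start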

It was loading for 3 hours, then the server crashed in the background while the command line gave this response:

WARN 2018-08-24 22:16:57,680 [main] com.complexible.stardog.protocols.http.client.BaseHttpClient:retryRequest(176): No response from server, will retry (tries: 1)
Connection refused (Connection refused)

The laptop has a 6-core CPU but uses a 2 TB hybrid drive (i.e., a spinning drive with 32 GB of NAND used as cache).

Anything in Stardog.log?

A lot. Attaching the file. I had to add .log to the compressed file: stardog.last.zip.log (467.6 KB)
