Need help loading Wikidata dump locally

I've been trying to load a Wikidata dump locally so I can run queries that would time out on their public endpoint, and I have a few questions.
I've added Stardog to my PATH, created STARDOG_HOME, and put a stardog.properties file there with strict.parsing = false, so it ignores some errors in the dump, and memory.mode = bulk_load, just for loading the database.
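For reference, this is roughly what my setup looks like (the paths here are just placeholders, not my actual ones):

```sh
# Stardog CLI on the PATH, STARDOG_HOME on a separate partition (example paths)
export PATH="$PATH:/opt/stardog/bin"
export STARDOG_HOME=/mnt/data/stardog

# stardog.properties in $STARDOG_HOME with the two options mentioned above
cat > "$STARDOG_HOME/stardog.properties" <<'EOF'
strict.parsing = false
memory.mode    = bulk_load
EOF
```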
The docs say that "Stardog supports loading data from compressed files directly: there's no need to uncompress files before loading. Loading compressed data is the recommended way to load large input files. Stardog supports GZIP, BZIP2 and ZIP compressions natively." Is loading compressed files faster than loading decompressed ones, and does this work for .ttl, .nt and .json?
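For context, this is the kind of command I'm running (the database name and dump file name are just examples):

```sh
# Create the database and bulk-load the compressed dump in one step,
# passing the .gz file directly without decompressing it first.
stardog-admin db create -n wikidata latest-truthy.nt.gz
```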
In any case, STARDOG_HOME is set to a partition other than /, yet when I run db create my SSD space gets eaten up (files are being created in /tmp). Is there any way to change this?

See this section of the docs for setting the location of the tmp dir.
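In short, the temporary directory is controlled by the JVM's java.io.tmpdir property, so something along these lines should work (directory path is just an example, and the exact environment variable depends on how you start Stardog):

```sh
# Point Stardog's temporary files at a directory on the larger partition.
# STARDOG_SERVER_JAVA_ARGS is typically used when running the Stardog server.
mkdir -p /mnt/data/stardog-tmp
export STARDOG_SERVER_JAVA_ARGS="-Djava.io.tmpdir=/mnt/data/stardog-tmp"
```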
