We encountered an issue while creating a repository from the command line and populating it with triples. For a project I developed everything locally and put it in a local repository. Once it was complete, I exported the repository to a TRIG file and sent it to a remote server running Stardog. The idea was to create a new database and populate it with the TRIG file during creation. However, after a couple of minutes Stardog returns: “Bulk load was cancelled: java.lang.OutOfMemoryError: Java heap space”. Is there a way to get the bulk upload done without splitting the TRIG file into parts?
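For reference, the creation-and-load step was done roughly like this (server address, database name, and file path are placeholders):

```
# Create a new database on the remote server and bulk load the TRIG file at creation time
stardog-admin --server http://example-server:5820 db create -n mydb data.trig
```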
We use Stardog 5.0.4 Enterprise.
The TRIG file is 914 MB.
That's not a particularly large amount of data, although you didn't mention whether the 914 MB is compressed, which would make a big difference. How many triples are in there? How much memory do you have allocated to Stardog? Are you able to increase the heap space?
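If you can, the heap is typically raised by setting STARDOG_SERVER_JAVA_ARGS before restarting the server; a sketch, with purely illustrative values:

```
# Illustrative only: raise the JVM heap and direct memory, then restart Stardog
export STARDOG_SERVER_JAVA_ARGS="-Xms4g -Xmx4g -XX:MaxDirectMemorySize=2g"
stardog-admin server stop
stardog-admin server start
```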
The file is not compressed and contains 9,255,347 triples. The server has 2 cores and 4 GB of memory.
What would you suggest? Using a compressed version of the file?
Compressing the file won’t help you as far as memory is concerned, but it may speed up the bulk load. TRIG should compress well depending on the data (possibly as much as 10:1), so uploading it should be much faster as well.
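Stardog can load gzip-compressed RDF files directly, so you don't need to decompress on the server side; a sketch, reusing the placeholder names from above:

```
# Compress the export (-k keeps the original file) and load the .gz directly
gzip -k data.trig
stardog-admin --server http://example-server:5820 db create -n mydb data.trig.gz
```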