Out of memory while creating a database with a TriG file from the command line

Hi,

We encountered an issue while creating a repository from the command line and populating it with triples. For a project I developed some data locally and put everything in a local repository. After it was completed I exported the repository to a TriG file and sent it to a remote server running Stardog. The idea was to create a new database and populate it during creation with the TriG file. However, after a couple of minutes Stardog returns: “Bulk load was cancelled: java.lang.OutOfMemoryError: Java heap space”. Is there a way to get the bulk load done without splitting the TriG file into parts?

We use Stardog 5.0.4 Enterprise.
The TriG file is 914 MB.
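
For reference, the command we run is roughly the following (the database name and file path are placeholders):

```
stardog-admin --server http://remote-server:5820 db create -n mydb data.trig
```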

That's not a particularly large amount of data, although you didn't mention whether the 914 MB is compressed, which would make a big difference. How many triples are in there? How much memory do you have allocated to Stardog? Are you able to increase the heap space?
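
For reference, the server heap is controlled by the JVM arguments passed at startup, e.g. via the STARDOG_SERVER_JAVA_ARGS environment variable (the sizes here are just an example, not a recommendation for your hardware):

```
export STARDOG_SERVER_JAVA_ARGS="-Xms3g -Xmx3g"
stardog-admin server stop
stardog-admin server start
```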

You can try setting the memory allocation property memory.mode to bulk_load; see the server memory configuration section in the Stardog documentation.
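
If I remember correctly, memory.mode is a server-level property, so it goes into stardog.properties in your STARDOG_HOME and requires a server restart to take effect. A minimal sketch:

```
# stardog.properties — optimize memory layout for bulk loading
memory.mode=bulk_load
```

Don't forget to remove it (or set it back to the default) once the load is done, since it trades query performance for load performance.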

The file is not compressed and contains 9,255,347 triples. The server has 2 cores and 4 GB of memory.
What would you suggest? Using a compressed version of the file?

It’s a good suggestion to change the memory configuration to bulk_load. See if it helps (don’t forget to change it back afterwards).

Cheers,
Pavel

Compressing the file won't help as far as memory is concerned, but it may speed up the bulk load. TriG should compress well depending on the data (possibly as much as 10:1), so uploading it should be much faster as well.
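
For what it's worth, Stardog can load gzipped RDF files directly, so something along these lines should work (paths and names are placeholders):

```
gzip data.trig
stardog-admin --server http://remote-server:5820 db create -n mydb data.trig.gz
```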
