Issue: Problem while building index

Hello,

I am trying to create a database from the Billion Triples Challenge dataset (BTC500M.nt) using the command below:

stardog-admin db create -n btc_big /home/backup/sgao/dataset/BTC/btc500M.nt

I am getting a "problem while building index" error. When I tried to create a database with BTC150M instead, it worked and the database was created successfully. Can someone please help?

Below are the logs from when the error occurs:

INFO 2019-03-14 10:11:42,846 [Stardog.Executor-8] com.complexible.stardog.index.Index:printInternal(314): Parsing triples: 97% complete in 00:00:19 (2.9M triples - 150.8K triples/sec)
INFO 2019-03-14 10:11:44,642 [stardog-user-1] com.complexible.stardog.index.Index:printInternal(314): Parsing triples: 100% complete in 00:00:20 (3.0M triples - 141.1K triples/sec)
INFO 2019-03-14 10:11:44,642 [stardog-user-1] com.complexible.stardog.index.Index:stop(326):
INFO 2019-03-14 10:11:44,643 [stardog-user-1] com.complexible.stardog.index.Index:stop(329): Parsing triples finished in 00:00:20.948
WARN 2019-03-14 10:11:46,530 [stardog-user-1] com.complexible.stardog.index.AbstractIndexData:updateInPlace(348): Error updating index
java.util.concurrent.ExecutionException: com.complexible.stardog.index.IndexException: No space left on device
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:531) ~[guava-26.0-jre.jar:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:512) ~[guava-26.0-jre.jar:?]
    at com.google.common.util.concurrent.AbstractFuture$TrustedFuture.get(AbstractFuture.java:83) ~[guava-26.0-jre.jar:?]
    at com.complexible.common.util.concurrent.ExecutionGroup$1.executeAndWait(ExecutionGroup.java:78) ~[stardog-utils-common-6.1.0.jar:?]
    at com.complexible.stardog.index.AbstractIndexData.updateInPlace(AbstractIndexData.java:335) ~[stardog-6.1.0.jar:?]

The file size is just 500MB.

Hello and welcome!

Even though the file itself is only 500MB, Stardog needs an additional chunk of disk space to store the data and indexes, as well as some temporary space in java.io.tmpdir. Given that your trace shows java.util.concurrent.ExecutionException: com.complexible.stardog.index.IndexException: No space left on device, you are either running out of disk space entirely (most likely) or running out of temp space.
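To see which one it is, you can check free space on both locations from the shell. This is just a sketch; it assumes a stock Linux setup where java.io.tmpdir is /tmp and $STARDOG_HOME points at your Stardog Home directory, so adjust the paths for your machine:

df -h "$STARDOG_HOME"   # free space on the filesystem where Stardog keeps its databases
df -h /tmp              # free space for java.io.tmpdir (defaults to /tmp on Linux)

If the temp filesystem is the one that is full, you can point the JVM at a larger scratch directory when starting the server, e.g. (assuming your install picks up JVM options from the STARDOG_SERVER_JAVA_ARGS environment variable; adapt to your own startup scripts if not):

export STARDOG_SERVER_JAVA_ARGS="-Djava.io.tmpdir=/some/bigger/partition/tmp"
stardog-admin server start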

Hello Stephen,

Here is my server status,

[root@coul stardog]$ stardog-admin server status
Backup Storage Directory : .backup
CPU Load : 0.00 %
Connection Timeout : 10m
Export Storage Directory : .exports
Memory Direct : 400M (Max: 1.0G)
Memory Direct Buffers : 52K (Max: 269M)
Memory Direct Mapped : 8.0M (Max: 115M)
Memory Direct Pool : 0B (Max: 384M)
Memory Heap : 834M (Max: 1.8G)
Memory Mode : DEFAULT
Page Cache Count : 3
Page Cache Hit Ratio : 95.25 %
Page Cache Memory : 5.3K
Platform Arch : amd64
Platform OS : Linux 2.6.32-754.3.5.el6.x86_64, Java 1.8.0_181
Query All Graphs : false
Query Timeout : 5m
Security Disabled : false
Stardog Home : /var/opt/stardog
Stardog Version : 6.1.0
Strict Parsing : true
Uptime : 1 hour 24 minutes 34 seconds

It's disk space you're running out of.

You mean the disk space of the machine?

How much space does it require? Is it the size of the dataset file?

Yes, disk space on the machine that is running the database. It's hard to say exactly how much space you'll need; it depends on a lot of factors. The Stardog documentation gives a rough approximation.
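If you want a ballpark figure, one rough approach is to check how much disk the BTC150M database ended up using and extrapolate to the larger file. A sketch, assuming your databases live under your Stardog Home and guessing btc_150 as the name you gave the smaller database:

du -sh /var/opt/stardog/btc_150        # on-disk size of the database that loaded successfully
ls -lh /home/backup/sgao/dataset/BTC   # sizes of the source .nt files for comparison

Scaling that by the ratio of the input files (500M vs 150M, roughly 3.3x) gives you a lower bound; leave extra headroom on top of that for the temporary space used during bulk loading.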

As @stephen mentioned, you'll also need enough space in your temp directory.