Is it possible to split a large n-triple data file in multiple smaller ones and load them in parallel?
The best performance is obtained by bulk loading at database creation time; see the method described here: Database Administration | Stardog Documentation 7.6.0
And don't forget to keep blank nodes in mind when splitting N-Triples: blank-node labels are scoped to the file they appear in, so triples that share a blank node must end up in the same chunk. The Stardog CLI handles this for you via the
stardog file split <inputFile> command, which can also compress the resulting files.
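To see why the blank-node caveat matters, here is a minimal sketch (not part of Stardog, just an illustration) of a naive line-based split. Since N-Triples is one triple per line, chunking by line count works, but only when the data contains no blank nodes; otherwise a `_:b1` split across two chunks loads as two different nodes:

```python
import itertools

def split_ntriples(path, lines_per_chunk=1000):
    """Naively split an N-Triples file into fixed-size chunks.

    WARNING: blank-node labels (e.g. _:b1) are scoped per file, so if
    triples sharing a blank node land in different chunks they will load
    as *different* nodes. Only safe for blank-node-free data; otherwise
    use `stardog file split`, which accounts for this.
    """
    chunks = []
    with open(path) as f:
        while True:
            block = list(itertools.islice(f, lines_per_chunk))
            if not block:
                break
            out_path = f"{path}.part{len(chunks)}"
            with open(out_path, "w") as out:
                out.writelines(block)
            chunks.append(out_path)
    return chunks
```

The resulting chunk files can then be passed together to the bulk loader at database creation.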