Hello, I was trying to import a virtual graph from a PostgreSQL database into Stardog in order to gain better performance.
However, when importing the graph I got an OutOfMemoryError. I upgraded my machine to one with 128GB of memory, but the error persists.
Any idea how to fix this?
Could you supply the log files from your $STARDOG_HOME directory?
I can think of two possible cases off the top of my head:
- Stardog does not know you have 128GB of memory due to the Java configuration.
- Linux heap corruption is occurring (making it look like memory is exhausted).
Those logs will give a clear indication in both cases, and will give us hints if this is a completely different problem.
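To check the first case yourself, you can inspect the JVM arguments the running server was actually started with, e.g. via `ps -eo args | grep '[s]tardog'`. The sketch below extracts the `-Xmx` heap setting from a sample argument string (the process name and main class here are placeholders, not Stardog internals) so it is self-contained:

```shell
# Sample JVM argument string; in practice, capture this from ps output.
args="java -Xms4g -Xmx4g -server some.server.MainClass"

# Pull out the max-heap flag (-Xmx<size>) with grep.
heap=$(printf '%s\n' "$args" | grep -o '\-Xmx[0-9]*[gGmM]')
echo "$heap"   # -Xmx4g
```

If this prints a value far below your physical memory (e.g. `-Xmx4g` on a 128GB machine), the server simply was not configured to use the extra RAM.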
I received stardog.log through another channel. Your Java configuration is still set to 4G.
This link should help: Capacity Planning | Stardog Documentation Latest
Thanks, I added this line to /etc/stardog.env.sh and it worked:
export STARDOG_SERVER_JAVA_ARGS="-Xms16g -Xmx16g -XX:MaxDirectMemorySize=40g"
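A quick sanity check before restarting the server is to source the env file and confirm the variable is visible to the launching shell. The sketch below uses a temporary file as a stand-in for /etc/stardog.env.sh so it can run anywhere:

```shell
# Stand-in for /etc/stardog.env.sh in this sketch; use the real path on your system.
env_file=$(mktemp)
echo 'export STARDOG_SERVER_JAVA_ARGS="-Xms16g -Xmx16g -XX:MaxDirectMemorySize=40g"' > "$env_file"

# Source the file and confirm the setting took effect.
. "$env_file"
echo "$STARDOG_SERVER_JAVA_ARGS"
```

Note that heap (-Xmx) and direct memory (-XX:MaxDirectMemorySize) are separate budgets, so this configuration reserves roughly 56GB in total, leaving ample headroom on a 128GB machine.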
However, when I tried to import a second virtual graph, this time from Snowflake, I got another error related to disk usage. I will describe it in a new topic.