jasonlaz
(Iason Lazaridis)
July 18, 2022, 2:56pm
1
Hello, I was trying to import a virtual graph from a PostgreSQL database into Stardog in order to gain better performance.
However, when importing the graph I got an OutOfMemoryError. I upgraded my machine to one with 128GB of memory and the error still persists.
Any idea how to fix this?
matthewv
(Matthew Von-Maszewski)
July 18, 2022, 3:22pm
2
Would you supply the following log files from $STARDOG_HOME?
stardog.log
starrocks.log
data/LOG*
I can think of two possible cases off the top of my head:
Stardog does not know you have 128G memory due to Java configuration
Linux heap corruption is occurring (making it look like memory is gone).
The above logs will give a clear indication in both cases, and give us hints if this is a completely different case.
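As a rough illustration of the first case (not a Stardog-specific tool, just ordinary shell), you can check which memory flags the running server process was actually started with; the exact grep pattern below is only an example:
# Show the memory-related JVM flags of the running Stardog server process
# (the [s] trick keeps grep from matching itself)
ps -ef | grep -i '[s]tardog' | tr ' ' '\n' | grep -E 'Xms|Xmx|MaxDirectMemorySize'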
matthewv
(Matthew Von-Maszewski)
July 18, 2022, 4:18pm
3
I received stardog.log via another channel. Your Java config is still set for 4G.
This link should help: Capacity Planning | Stardog Documentation Latest
jasonlaz
(Iason Lazaridis)
July 19, 2022, 11:57am
4
Thanks, I added this line to /etc/stardog.env.sh and it worked:
export STARDOG_SERVER_JAVA_ARGS="-Xms16g -Xmx16g -XX:MaxDirectMemorySize=40g"
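In case it helps anyone else, this is roughly what picking up and confirming the new settings looks like; the systemd service name assumes a package install and may differ on your setup:
# Restart the server so the new STARDOG_SERVER_JAVA_ARGS value is picked up
sudo systemctl restart stardog
# server status reports, among other things, the server's memory settings
stardog-admin server status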
However, when I tried to import a second virtual graph from Snowflake I got another error related to disk usage. I will describe it in a new topic.