Stardog with FIBO

Hi Team,
I am working with the FIBO ontology. I saw on their website that the EDM Council also works with Stardog. I loaded the FIBO production ontology into a Stardog database, and now I am trying to query it with reasoning ON, since the ontology contains many DL axioms. However, queries take a lot of time, and once I also got a GC overhead limit exceeded error. Would you please help me with this?

Thank you!

Best Regards,
Smrati

Also, I imported FIBO plus one more small ontology, added an equivalence between the two, and tried running SPARQL with the reasoner ON. It takes a lot of time and then shows an error.
Is there another approach to doing this?
Thanks!

I’m familiar with FIBO but I’ve never really looked closely at it. What options are you using when you create your database? What reasoning profile are you using? DL? What are your memory settings? Are you using the Stardog default memory settings? What warnings and/or errors are you seeing in Stardog.log? What version of Stardog are you using?

I am using the following:

  1. Default reasoning settings in the browser.
  2. Stardog 5.2.2.
  3. Memory settings are default.
  4. To create the database, I used:
    stardog-admin db create -n FIBO path-to-FIBO rdfs (there are 418 files)
    Error in the logs:
    The server closes the connection due to 10m of inactivity and then says it cannot query a closed connection.

Thank you!!
Best,
Smrati

Interesting. Are you running this on the same server that’s hosting Stardog or a remote machine? Do you have any settings in your stardog.properties file?

I tried it both on the same server and from a different machine. I have default settings in stardog.properties.
Is there any way I can change:

  1. the timeout setting?
  2. the memory arguments?

Thanks!
Best,
Smrati

The error looks like:
ERROR 2018-05-01 14:51:10,659 [stardog-user-21] com.stardog.http.server.undertow.ErrorHandling:writeError(180): Unexpected error on the server
java.lang.OutOfMemoryError: Java heap space
at com.google.common.collect.Sets.newHashSet(Sets.java:164) ~[guava-18.0.jar:?]
at com.clarkparsia.pellet.tableau.branch.DisjunctionBranch.tryBranch(DisjunctionBranch.java:204) ~[stardog-reasoning-core-5.2.2.jar:?]
at com.clarkparsia.pellet.tableau.branch.Branch.tryNext(Branch.java:114) ~[stardog-reasoning-core-5.2.2.jar:?]
at com.clarkparsia.pellet.tableau.completion.rule.DisjunctionRule.applyDisjunctionRule(DisjunctionRule.java:95) ~[stardog-reasoning-core-5.2.2.jar:?]
at com.clarkparsia.pellet.tableau.completion.rule.DisjunctionRule.apply(DisjunctionRule.java:51) ~[stardog-reasoning-core-5.2.2.jar:?]
at com.clarkparsia.pellet.tableau.completion.rule.AbstractTableauRule.apply(AbstractTableauRule.java:65) ~[stardog-reasoning-core-5.2.2.jar:?]
at com.clarkparsia.pellet.tableau.completion.SROIQStrategy.doComplete(SROIQStrategy.java:141) ~[stardog-reasoning-core-5.2.2.jar:?]
at com.clarkparsia.pellet.tableau.completion.AbstractCompletionStrategy.complete(AbstractCompletionStrategy.java:387) ~[stardog-reasoning-core-5.2.2.jar:?]

Thanks and Regards,
Smrati

Smrati, I'm part of the FIBO team.
I'm puzzled that you say you have 418 files for FIBO Production; there are currently only 181, and 75 of those are "Aboutxxx" files used for loading into file-based tools such as Protégé, which you can safely delete.
To get the official version, go to the About FIBO page and download prod.rdf.zip near the bottom of the page.
If you're still having problems, it would be useful to know which SPARQL query you ran to get the error you listed, and to see the additional smaller ontology you added the equivalence to (ideally attach it).
Cheers
Pete

Hi Pete,

Thank you for getting back!!
I apologize for that number; I checked later and realized the same. I have now removed all the About*.rdf files and uploaded the rest again. I am trying to add an equivalence and run the reasoner over it. This is what I am trying to achieve:

  1. Create a new .rdf file: Test.rdf.
  2. Add a class under owl:Thing, named Loan.
  3. Import another file into it: C:\Users\smrat\Downloads\prod\fibo\ontology\master\2018Q1\FBC\DebtAndEquities\Debt.rdf
  4. Add Loan “isEquivalentTo” Debt (a sketch of what such a Test file might look like is shown after this list).
  5. Load the complete FIBO ontology into a Stardog database with the command:
    stardog-admin db create -n FIBOSample C:\prod\fibo\ontology\master\2018Q1
  6. Add the file Test.rdf to the same Stardog database.
  7. Use the Stardog console and run the following query with reasoning ON:
    select * where {
    new:Loan ?p ?o
    }
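
For reference, a minimal sketch of what such a Test file could look like in Turtle (the new: namespace and the FIBO prefix and class IRIs below are illustrative guesses only — check the ontology IRI and class IRIs actually declared in Debt.rdf and substitute your own namespace):

    @prefix owl: <http://www.w3.org/2002/07/owl#> .
    # Hypothetical namespace for the new ontology; replace with whatever new: resolves to in your file.
    @prefix new: <http://example.com/new#> .
    # Assumed FIBO Debt ontology namespace; confirm against your copy of Debt.rdf.
    @prefix fibo-debt: <https://spec.edmcouncil.org/fibo/ontology/FBC/DebtAndEquities/Debt/> .

    <http://example.com/new> a owl:Ontology ;
        # Step 3: import the FIBO Debt ontology (the IRI must match the owl:Ontology IRI in Debt.rdf).
        owl:imports <https://spec.edmcouncil.org/fibo/ontology/FBC/DebtAndEquities/Debt/> .

    # Steps 2 and 4: a new class Loan, declared equivalent to the (assumed) FIBO Debt class.
    new:Loan a owl:Class ;
        owl:equivalentClass fibo-debt:Debt .

If you save this as Turtle (e.g. Test.ttl), step 6 can be done with something like "stardog data add FIBOSample Test.ttl"; if you keep the .rdf extension, the content would need to be written in RDF/XML instead.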

The error shown is “Internal Server Error”.
The logs display:
WARN 2018-05-01 16:05:53,571 [stardog-user-27] com.clarkparsia.blackout.Saturator:saturate(104): Saturation was interrupted. Query results might be incomplete.
WARN 2018-05-01 16:10:00,013 [d05fe54b-0bef-472e-9bd2-93f52c89fca1_Worker-7] com.complexible.stardog.db.DatabaseImpl:lambda$null$0(1356): Closing connection to database FIBOSample5 due to inactivity for 10m
WARN 2018-05-01 16:10:00,013 [d05fe54b-0bef-472e-9bd2-93f52c89fca1_Worker-8] com.complexible.stardog.db.DatabaseImpl:lambda$null$0(1356): Closing connection to database FIBOSample5 due to inactivity for 10m
WARN 2018-05-01 16:14:56,612 [stardog-user-27] com.clarkparsia.blackout.Saturator:saturate(104): Saturation was interrupted. Query results might be incomplete.
ERROR 2018-05-01 16:14:56,613 [stardog-user-27] com.complexible.stardog.protocols.http.server.StardogHttpServiceLoader:accept(235): An unexpected exception was handled by the server
org.openrdf.query.QueryEvaluationException: Cannot use a closed connection
at com.complexible.common.rdf.query.IteratorAsTupleQueryResult.hasNext(IteratorAsTupleQueryResult.java:81) ~[stardog-utils-rdf-5.2.2.jar:?]
at org.openrdf.query.QueryResults.report(QueryResults.java:158) ~[sesame-query-4.0.0.jar:?]
at org.openrdf.query.resultio.QueryResultIO.writeTuple(QueryResultIO.java:449) ~[sesame-queryresultio-api-4.0.0.jar:?]
at com.complexible.stardog.protocols.http.server.ProtocolUtils.writeTupleResponse(ProtocolUtils.java:571) ~[stardog-protocols-http-server-5.2.2.jar:?]
at com.complexible.stardog.protocols.http.server.ProtocolUtils.executeReadQuery(ProtocolUtils.java:478) ~[stardog-protocols-http-server-5.2.2.jar:?]
at com.complexible.stardog.protocols.http.annex.QueryPanelService.executeQuery(QueryPanelService.java:179) ~[stardog-webconsole-annex-5.2.2.jar:?]
at com.complexible.stardog.protocols.http.annex.QueryPanelService.post(QueryPanelService.java:120) ~[stardog-webconsole-annex-5.2.2.jar:?]
at com.stardog.http.server.undertow.jaxrs.ExtractRoutes.lambda$handleIt$5(ExtractRoutes.java:196) ~[stardog-protocols-http-server-5.2.2.jar:?]
at org.apache.shiro.subject.support.SubjectRunnable.doRun(SubjectRunnable.java:120) ~[shiro-core-1.2.3.jar:1.2.3]
at org.apache.shiro.subject.support.SubjectRunnable.run(SubjectRunnable.java:108) ~[shiro-core-1.2.3.jar:1.2.3]
at com.stardog.http.server.undertow.ErrorHandling.lambda$safeDispatch$1(ErrorHandling.java:71) ~[stardog-protocols-http-server-5.2.2.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_131]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_131]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_131]
Caused by: com.complexible.stardog.plan.eval.ExecutionException: Cannot use a closed connection
at com.complexible.stardog.plan.eval.ExecutablePlanFactory.createOptimized(ExecutablePlanFactory.java:107) ~[stardog-5.2.2.jar:?]
at com.complexible.stardog.plan.eval.ExecutionContext.executable(ExecutionContext.java:346) ~[stardog-5.2.2.jar:?]
at com.complexible.stardog.plan.eval.ExecutionContext.exec(ExecutionContext.java:321) ~[stardog-5.2.2.jar:?]
at com.complexible.stardog.plan.eval.ExecutionContext.exec(ExecutionContext.java:307) ~[stardog-5.2.2.jar:?]
at com.complexible.stardog.reasoning.blackout.PropertyOp.getSolutions(PropertyOp.java:101) ~[stardog-reasoning-core-5.2.2.jar:?]
at com.complexible.stardog.reasoning.blackout.PropertyOp.lambda$getSolutions$0(PropertyOp.java:81) ~[stardog-reasoning-core-5.2.2.jar:?]
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193) ~[?:1.8.0_131]
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193) ~[?:1.8.0_131]
at java.util.HashMap$KeySpliterator.tryAdvance(HashMap.java:1569) ~[?:1.8.0_131]
at java.util.stream.StreamSpliterators$WrappingSpliterator.lambda$initPartialTraversalState$0(StreamSpliterators.java:294) ~[?:1.8.0_131]
at java.util.stream.StreamSpliterators$AbstractWrappingSpliterator.fillBuffer(StreamSpliterators.java:206) ~[?:1.8.0_131]
at java.util.stream.StreamSpliterators$AbstractWrappingSpliterator.doAdvance(StreamSpliterators.java:169) ~[?:1.8.0_131]
at java.util.stream.StreamSpliterators$WrappingSpliterator.tryAdvance(StreamSpliterators.java:300) ~[?:1.8.0_131]
at java.util.Spliterators$1Adapter.hasNext(Spliterators.java:681) ~[?:1.8.0_131]
at com.complexible.common.base.Streams$1.computeNext(Streams.java:269) ~[stardog-utils-common-5.2.2.jar:?]
at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143) ~[guava-18.0.jar:?]
at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138) ~[guava-18.0.jar:?]
at com.complexible.stardog.plan.eval.operator.SolutionIterator$2.hasNext(SolutionIterator.java:63) ~[stardog-5.2.2.jar:?]
at com.complexible.stardog.reasoning.blackout.AbstractReasoningOperator.computeNext(AbstractReasoningOperator.java:131) ~[stardog-reasoning-core-5.2.2.jar:?]
at com.complexible.stardog.reasoning.blackout.AbstractReasoningOperator.computeNext(AbstractReasoningOperator.java:142) ~[stardog-reasoning-core-5.2.2.jar:?]
at com.complexible.stardog.reasoning.blackout.AbstractReasoningOperator.computeNext(AbstractReasoningOperator.java:33) ~[stardog-reasoning-core-5.2.2.jar:?]
at com.complexible.common.collect.AbstractSkippingIterator.tryToComputeNext(AbstractSkippingIterator.java:143) ~[stardog-utils-common-5.2.2.jar:?]
at com.complexible.common.collect.AbstractSkippingIterator.hasNext(AbstractSkippingIterator.java:130) ~[stardog-utils-common-5.2.2.jar:?]
at com.complexible.stardog.plan.eval.operator.impl.SingleProjectionOp.computeNext(SingleProjectionOp.java:70) ~[stardog-5.2.2.jar:?]
at com.complexible.stardog.plan.eval.operator.impl.SingleProjectionOp.computeNext(SingleProjectionOp.java:29) ~[stardog-5.2.2.jar:?]
at com.complexible.common.collect.AbstractSkippingIterator.tryToComputeNext(AbstractSkippingIterator.java:143) ~[stardog-utils-common-5.2.2.jar:?]
at com.complexible.common.collect.AbstractSkippingIterator.hasNext(AbstractSkippingIterator.java:130) ~[stardog-utils-common-5.2.2.jar:?]
at com.complexible.stardog.plan.eval.operator.impl.DistinctOp.computeNext(DistinctOp.java:53) ~[stardog-5.2.2.jar:?]
at com.complexible.stardog.plan.eval.operator.impl.DistinctOp.computeNext(DistinctOp.java:21) ~[stardog-5.2.2.jar:?]
at com.complexible.common.collect.AbstractSkippingIterator.tryToComputeNext(AbstractSkippingIterator.java:143) ~[stardog-utils-common-5.2.2.jar:?]
at com.complexible.common.collect.AbstractSkippingIterator.hasNext(AbstractSkippingIterator.java:130) ~[stardog-utils-common-5.2.2.jar:?]
at com.complexible.stardog.plan.eval.operator.impl.SliceOp._hasNext(SliceOp.java:87) ~[stardog-5.2.2.jar:?]
at com.complexible.stardog.plan.eval.operator.impl.SliceOp.computeNext(SliceOp.java:95) ~[stardog-5.2.2.jar:?]
at com.complexible.stardog.plan.eval.operator.impl.SliceOp.computeNext(SliceOp.java:26) ~[stardog-5.2.2.jar:?]
at com.complexible.common.collect.AbstractSkippingIterator.tryToComputeNext(AbstractSkippingIterator.java:143) ~[stardog-utils-common-5.2.2.jar:?]
at com.complexible.common.collect.AbstractSkippingIterator.hasNext(AbstractSkippingIterator.java:130) ~[stardog-utils-common-5.2.2.jar:?]
at com.complexible.stardog.plan.eval.operator.util.AutoCloseOperator.computeNext(AutoCloseOperator.java:112) ~[stardog-5.2.2.jar:?]
at com.complexible.stardog.plan.eval.operator.util.AutoCloseOperator.computeNext(AutoCloseOperator.java:25) ~[stardog-5.2.2.jar:?]
at com.complexible.common.collect.AbstractSkippingIterator.tryToComputeNext(AbstractSkippingIterator.java:143) ~[stardog-utils-common-5.2.2.jar:?]
at com.complexible.common.collect.AbstractSkippingIterator.hasNext(AbstractSkippingIterator.java:130) ~[stardog-utils-common-5.2.2.jar:?]
at com.complexible.stardog.plan.eval.operator.util.OpBasedBindingSetIteration.computeNext(OpBasedBindingSetIteration.java:110) ~[stardog-5.2.2.jar:?]
at com.complexible.stardog.plan.eval.operator.util.OpBasedBindingSetIteration.computeNext(OpBasedBindingSetIteration.java:34) ~[stardog-5.2.2.jar:?]
at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143) ~[guava-18.0.jar:?]
at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138) ~[guava-18.0.jar:?]
at com.complexible.common.rdf.query.IteratorAsTupleQueryResult.hasNext(IteratorAsTupleQueryResult.java:77) ~[stardog-utils-rdf-5.2.2.jar:?]
… 13 more
Caused by: java.lang.IllegalStateException: Cannot use a closed connection
at com.google.common.base.Preconditions.checkState(Preconditions.java:173) ~[guava-18.0.jar:?]
at com.complexible.stardog.index.IndexReaderImpl.getOrders(IndexReaderImpl.java:204) ~[stardog-5.2.2.jar:?]
at com.complexible.stardog.index.IndexOrders.selectOrder(IndexOrders.java:75) ~[stardog-5.2.2.jar:?]
at com.complexible.stardog.index.IndexOrders.selectIterator(IndexOrders.java:59) ~[stardog-5.2.2.jar:?]
at com.complexible.stardog.index.statistics.EstimateCardinality.getBinaryCardinality(EstimateCardinality.java:165) ~[stardog-5.2.2.jar:?]
at com.complexible.stardog.index.statistics.EstimateCardinality.getCardinality(EstimateCardinality.java:114) ~[stardog-5.2.2.jar:?]

Kindly let me know if something is wrong in the steps above.
I am just trying to import the FIBO ontology into my own ontology and add some DL axioms to it.

Thank you!
Best Regards,
Smrati

You can try setting the query.timeout property, but I'm a little confused because it should default to 5 minutes, not the 10 minutes you're seeing (see the Stardog documentation).
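
If you do want to raise it, a sketch of how that might look for an existing database (assuming the FIBOSample database name used earlier; depending on the option, the database may need to be taken offline first):

    stardog-admin metadata set -o query.timeout=30m FIBOSample

The option can also be passed at creation time via db create's -o flag, e.g. stardog-admin db create -o query.timeout=30m -n FIBOSample <files>.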

That’s part of the Pellet reasoner. OWL DL reasoning can become very expensive even for small ontologies; the types of axioms matter at least as much as the number of axioms. The OutOfMemoryError clearly means that you’d have to increase the heap space and hope that it’s enough.

Given that he got an OutOfMemoryError, I don’t think the timeout would change anything. OWL DL reasoning is complex, and everything is done in memory via Pellet. The only way forward would be to increase the Java heap space.

You are correct that DL reasoning can be complex and is done in memory. He seems to be having several problems, and this would be one of them. He mentioned DL axioms but then said that he created his database using the defaults, which would be SL.

Stardog server memory settings can be set using the STARDOG_SERVER_JAVA_ARGS environment variable.
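
For example, something like the following before starting the server (the sizes below are placeholders — set them according to the RAM available on the machine, and make sure the variable is set in the environment of the shell that launches the server):

    export STARDOG_SERVER_JAVA_ARGS="-Xms8g -Xmx8g -XX:MaxDirectMemorySize=16g"
    stardog-admin server start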

@smrati.hans You should verify both that you actually need DL and that the database is configured to use DL as its reasoning level.
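
If DL is really required, a sketch of how the reasoning level could be inspected and changed from the CLI (assuming the FIBOSample database from above; reasoning.type is the database option that controls the reasoning level):

    stardog-admin metadata get -o reasoning.type FIBOSample
    stardog-admin metadata set -o reasoning.type=DL FIBOSample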

Thank you, Zachary and Lorenz. I will try allocating more memory to the process and setting the reasoning level to DL.

Best Regards,
Smrati

Hi Team,

I tried setting the memory arguments for Stardog as described in the manual, but now I am facing some issues.
As soon as the server starts, I see the following message:

Address already in use
Waiting for running tasks to complete…done. Executor service has been shut down.
Stardog server 5.2.2 shutdown on Mon May 07 15:45:16 EDT 2018.

Also, I am not able to access localhost:5820.

Kindly help me with this; I am not sure what went wrong.

Thank you!
Best Regards,
Smrati

It sounds like another Stardog instance is tying up the port. If stardog-admin server stop doesn’t kill it, you could use a program like lsof (assuming *nix) to see what else might be holding the port.
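
For example, assuming the default Stardog port 5820:

    lsof -i :5820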

Or, if you restart the machine, this will almost certainly go away.
