What type of virtual graph did you create? We don't currently support variable-type or variable-predicate queries on virtual graphs with reasoning. In that case, though, an error should be returned rather than the query silently failing. Can you share the query plan?
A workaround is to ask the SERVICE endpoint to do the reasoning remotely. Let's re-use the queries from my previous post.
If I do a virtual import into database example3, I can use the following query:
PREFIX vcard: <http://www.w3.org/2006/vcard/ns#>
SELECT * {
  SERVICE <http://admin:admin@localhost:5820/example3/query/reasoning> {
    ?s a vcard:Individual
  }
}
Reasoning works because I added /reasoning to the end of my SERVICE endpoint. The reasoning is done on the remote endpoint, but that's fine.
My question is: is there a way to achieve the same thing with a virtual graph created via virtual add? I understand performance would probably be impacted, but I'm only looking at using T-box type inferencing.
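For reference, the pattern I'd like to get working looks something like this (a sketch: `example3_vg` is a made-up name for a graph registered with virtual add, using the `virtual://` naming Stardog gives those graphs):

```sparql
# Hypothetical: "example3_vg" stands in for whatever name the virtual graph was registered under
PREFIX vcard: <http://www.w3.org/2006/vcard/ns#>
SELECT * {
  GRAPH <virtual://example3_vg> {
    # Matching subclasses of vcard:Individual here is exactly
    # the T-box inferencing I'm after
    ?s a vcard:Individual
  }
}
```

I'd run this with reasoning enabled on the query itself rather than delegating it through a SERVICE clause.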
No worries, I was hoping there was an undocumented option.
Those are the main limitations, but we do support reasoning over virtual graphs in general. Can you share which database is underlying your virtual graph? That might shed some light on things. If you're running a query with reasoning that is not producing the expected results (and not using a variable type or property), we can help you debug that.
Also check that you haven't somehow disabled the database option "Flag to enable reasoning over virtual graphs and SERVICE clauses." (reasoning.virtual.graph.enabled)
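In case it helps, checking and resetting that option from the CLI would look roughly like this (a sketch: I'm assuming the stardog-admin metadata subcommand, with example3 as a placeholder database name):

```
# Check the current value of the option on database "example3"
stardog-admin metadata get -o reasoning.virtual.graph.enabled example3

# Re-enable it if it has been turned off
stardog-admin metadata set -o reasoning.virtual.graph.enabled=true example3
```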
Confirm that reasoning is enabled for the virtual graph.
I'm using the out-of-the-box config. I loaded my rule from a Turtle file. It is loaded into a named graph by itself; however, let me see what happens if I load it into the default graph instead.
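For completeness, the two loading approaches I'm comparing look roughly like this (a sketch: rules.ttl and urn:rules are placeholder names, and I'm assuming the usual stardog data add syntax):

```
# Current setup: rule file loaded into its own named graph
stardog data add --named-graph urn:rules example3 rules.ttl

# What I'll try next: load the same file into the default graph
stardog data add example3 rules.ttl
```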
Can you try inserting the data locally and see if the reasoning query works? It's gotta be something simple somewhere. This is a pretty straightforward query.
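Something like the following minimal local test should tell us whether reasoning itself is working (a sketch: ex:Employee and the subclass axiom are made-up stand-ins for your actual schema):

```sparql
# Hypothetical T-box and A-box standing in for your real data
PREFIX ex:    <http://example.org/>
PREFIX vcard: <http://www.w3.org/2006/vcard/ns#>
PREFIX rdfs:  <http://www.w3.org/2000/01/rdf-schema#>

INSERT DATA {
  ex:Employee rdfs:subClassOf vcard:Individual .
  ex:jane a ex:Employee .
}
```

With reasoning enabled, `SELECT * { ?s a vcard:Individual }` against the local data should then return ex:jane.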
Before doing anything, I deleted the database and the virtual graph. During my tests, I often reset either the database or the virtual graph, but never both at the same time. I found it weird that there was no query plan, so I wanted to start from a clean slate, thinking maybe something wasn't being cleaned up properly.
Anyway, after resetting my installation and reloading exactly the same data, inference is now working. Maybe a simple server restart would have sufficed, and I should have tried that first, but I'm just happy it works now.