Hello, I have a question about the much-discussed new version of Stardog, Stardog 5. My team is interested in using Stardog in conjunction with TinkerPop, but so far Stardog 4 is only compatible with TinkerPop up to version 3.0.2-incubating. We need TinkerPop 3.2.4 (the latest version) for features that do not exist in the older version. Therefore, I am wondering:
- Is there an estimated release date for Stardog 5? I don’t need an exact date, even the anticipated month would be helpful for us.
- Will Stardog 5 have compatibility with TinkerPop 3.2.4? If this is something that is still being considered and not fully decided, please allow me to make a formal feature request for this compatibility.
Stardog 5 will be out this month and we’ll probably do a series of beta and RC releases before doing a final release this spring.
We’re not considering upgrading to TinkerPop 3.2.4 for 5.0; any upgrades of that support would take place during the 5.x release cycle.
What’s the reason to prefer an ad-hoc API like TinkerPop over something standards-based?
My team is exploring Stardog/TinkerPop possibilities because we are interested in building a graph traversal system. We feel that the Gremlin language provides a better framework for graph traversals than SPARQL. Because we will be creating algorithms that crawl through our graph database and send back pathway information, an iterative traversal is necessary.
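To make the requirement concrete, the kind of iterative traversal we need resembles a breadth-first enumeration of paths between two nodes. A minimal sketch in plain Python, with a small in-memory adjacency map and hypothetical node names standing in for the graph database:

```python
from collections import deque

def find_paths(graph, start, goal, max_depth=5):
    """Breadth-first enumeration of simple paths from start to goal.

    graph: adjacency dict mapping a node to an iterable of neighbors.
    Returns a list of paths (each a list of nodes), shortest first.
    """
    paths = []
    queue = deque([[start]])
    while queue:
        path = queue.popleft()
        if len(path) > max_depth:
            continue
        node = path[-1]
        if node == goal:
            paths.append(path)
            continue
        for neighbor in graph.get(node, ()):
            if neighbor not in path:  # skip cycles
                queue.append(path + [neighbor])
    return paths

# Toy graph with hypothetical labels
graph = {
    "a": ["b", "c"],
    "b": ["d"],
    "c": ["d"],
    "d": [],
}
print(find_paths(graph, "a", "d"))  # → [['a', 'b', 'd'], ['a', 'c', 'd']]
```

This is roughly what a Gremlin `repeat(...).until(...).path()` traversal expresses declaratively, and it is the step-by-step, stateful iteration that is awkward to phrase in a single SPARQL query.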
We are also interested in executing graph traversals from Python using the Gremlin-Python language or the Goblin framework, which connects to the Gremlin Server.
However, we have had to explore other options due to the compatibility issues between Stardog and TinkerPop.
I would be curious to hear whether you have any other solutions that might be helpful for our use case, beyond what I have described above.
We’re in the process of extending the SPARQL language to add traversals. We recently blogged about the first part of this work, which requires that we modify how solutions are returned. We’ll be talking about the second part of the work in an upcoming post. That would be the best fit for you: it should support your use case, and you’ll get the best performance because it’s our stack top to bottom.
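For context on what the extension adds: standard SPARQL 1.1 property paths can already express reachability, but a matched path only binds its endpoints, not the intermediate nodes, which is why the change to how solutions are returned matters for sending back pathway information. A sketch, assuming a hypothetical `:connectedTo` predicate:

```sparql
# Everything reachable from :start via one or more :connectedTo hops.
# Note: the property path binds only ?end; the intermediate nodes of
# each match are not returned, which is the gap the extension addresses.
SELECT ?end WHERE {
  :start :connectedTo+ ?end .
}
```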
Thank you for sending me this blog post; it is certainly relevant to the work we are trying to do. I will show it to my team and we will discuss our options moving forward in light of this new information.