Open Sparql endpoint for external access (Comunica, LDKit)

Currently we are building a Node.js (TypeScript) backend that uses the Stardog SDK to run queries on a Stardog database. For the time being, we are building the queries manually (assisted by [LDflex](https://github.com/LDflex/LDflex), a JavaScript DSL for querying Linked Data on the Web) to prevent SPARQL injection attacks. We want to abstract the data layer with LDKit (DTO-like) to facilitate development (we are using Stardog as an operational database in our experiment), but I have not been able to point LDKit at a Stardog SPARQL endpoint through its context so that LDKit can run the queries. Is there any way to make that happen, or am I limited to the REST API and the Stardog SDK for running SPARQL queries? Will third parties be able to run queries through tools like Comunica without having to use the REST API?
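For readers wondering what the injection concern looks like in the manual-query path: the idea is to never splice raw user input into a SPARQL string. A minimal sketch (a hypothetical helper for illustration, not LDflex itself):

```typescript
// Hypothetical helper (not LDflex): escape a user-supplied string for use
// as a SPARQL string literal so it cannot break out of the quotes.
function sparqlLiteral(value: string): string {
  return '"' + value
    .replace(/\\/g, "\\\\")
    .replace(/"/g, '\\"')
    .replace(/\n/g, "\\n")
    .replace(/\r/g, "\\r") + '"';
}

// A malicious value stays inside the literal instead of ending the pattern:
const name = 'Alice" } DELETE WHERE { ?s ?p ?o';
const query =
  `SELECT ?s WHERE { ?s <http://schema.org/name> ${sparqlLiteral(name)} }`;
```

LDflex and similar DSLs handle this kind of escaping for you, which is the point of using them instead of string concatenation.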


I feel a little awkward replying to your post w/o any sort of answer .. but I wanted to thank you for the interesting references.

I think that these forums are often as much about spreading knowledge and stimulating ideas as they are about answering questions.

I've been able to send SPARQL queries to a Stardog Free endpoint by creating a new user, giving them the cloud and reader roles, and then using that new user's username/password as credentials to send the SPARQL queries to an endpoint of the form `[host]/[database]/query`.

But I could not authenticate using my account credentials; I had to create the new user and associate the roles in the Studio Security tab.
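The steps above can be sketched as a plain HTTP request with Basic auth (all names here are placeholders, not real credentials or hosts):

```typescript
// Build the HTTP Basic auth header for the dedicated reader user.
function basicAuthHeader(user: string, pass: string): string {
  return "Basic " + Buffer.from(`${user}:${pass}`).toString("base64");
}

// Send a SELECT query to a SPARQL endpoint and return the parsed JSON
// results (application/sparql-results+json).
async function runSelect(
  endpoint: string,
  query: string,
  user: string,
  pass: string
): Promise<unknown> {
  const res = await fetch(`${endpoint}?query=${encodeURIComponent(query)}`, {
    headers: {
      Authorization: basicAuthHeader(user, pass),
      Accept: "application/sparql-results+json",
    },
  });
  if (!res.ok) throw new Error(`SPARQL query failed: ${res.status}`);
  return res.json();
}

// Usage (placeholders):
// runSelect("https://[host]/[database]/query",
//           "SELECT * WHERE { ?s ?p ?o } LIMIT 1", "reader", "secret");
```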

Thanks. Just in case it helps anyone: I managed to connect to the [host]/[database]/query endpoint. Sadly, LDKit only works with rdf+json, and Stardog does not support that as a response format for the endpoint. I will work on doing the manual mapping and will post a follow-up if I manage to make it work.
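One shape the manual mapping could take (a hypothetical stand-in, not LDKit's actual mapper): flatten the standard `application/sparql-results+json` response, which Stardog does support, into plain objects.

```typescript
// Minimal types for the standard SPARQL JSON results format.
interface Binding {
  type: string;
  value: string;
}
interface SparqlJsonResults {
  head: { vars: string[] };
  results: { bindings: Array<Record<string, Binding>> };
}

// Flatten each bindings row into a plain { variable: value } object.
function toObjects(res: SparqlJsonResults): Array<Record<string, string>> {
  return res.results.bindings.map((b) =>
    Object.fromEntries(Object.entries(b).map(([k, v]) => [k, v.value]))
  );
}
```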

Thank you! That is indeed the way to do it. It also works with Comunica and LDKit, but the latter needs a workaround to handle the incompatible response format (rdf+json vs. the formats Stardog supports). It also needs an additional workaround because the query and update/post endpoints are different.
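The second workaround can be sketched as follows (the host is a placeholder): Stardog serves the SPARQL Protocol query and update operations on different paths, so the Comunica-style context reads from `/query` and, via the `destination` entry, writes to `/update`.

```typescript
// Placeholder base URL; replace with your Stardog host and database.
const base = "https://[host]/[database]";

// Read queries go to /query, update queries to /update.
const context = {
  sources: [{ type: "sparql", value: `${base}/query` }],
  destination: { type: "sparql", value: `${base}/update` },
};
```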

As a follow-up, I managed to make it work. I changed the query engine to Comunica and set up the context in the following way:

```typescript
const context: Context = {
    sources: [
        {
            type: "sparql",
            value: "", // your endpoint, i.e. [host]/[database]/query
        },
    ],
};
```
Can confirm it works for reading, writing, updating and deleting. I cannot express how much this helps our codebase and development. Highly recommended!