SPARQL Update format error

I’m just starting with Stardog.

I tried to send


to the http://localhost:5820/test1/update endpoint,
which resulted in

{"message":"No known format matches the file: /tmp/data6682508545189935642.scr"}

I also tried appending ?format=RDF/XML to the URL, but it didn’t change anything.

Should this work?

I was able to reproduce the error you’re seeing. It should work, and I was able to successfully load other resources. It looks like an issue with the content type: the server is returning text/xml rather than application/rdf+xml.
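You can check what content type a server actually sends with a HEAD request. A minimal sketch (the URL is a placeholder for the resource you tried to LOAD; the second part runs the same check over a captured response so the snippet is self-contained):

```shell
# With a live server you would inspect the headers directly
# (placeholder URL — substitute the resource you tried to LOAD):
#   curl -sI "http://example.org/data.rdf" | grep -i '^content-type'
#
# The same check over a captured response; Stardog's RDF/XML parser
# only accepts application/rdf+xml and application/xml, so text/xml fails:
response='HTTP/1.1 200 OK
Content-Type: text/xml; charset=UTF-8
Content-Length: 1234'
printf '%s\n' "$response" | grep -i '^content-type'
# prints: Content-Type: text/xml; charset=UTF-8
```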

Thanks a lot for investigating!

No known format matches the file

The error message seems to be a bit misleading.
It sounds like it tried several formats, but if it had actually tried rdf+xml it should have succeeded (I assume rdf+xml is a known format).

What about supporting a parameter that allows overriding the format (like the query parameter I tried)?

Zach is correct here. Stardog uses the Content-Type of the response to determine the file format, and the only acceptable MIME types for RDF/XML are, according to our parser, application/rdf+xml and application/xml.

I will open up a ticket for us to implement some sort of workaround for this case, but in the meantime I would suggest manually downloading the RDF/XML and adding it to Stardog.

I don’t think that it tries formats until it finds one that succeeds. I believe it tries the content type first; if that’s not one that is supported, it might fall back to guessing from the file extension. In this case there is no extension and the server is returning the file as text/xml. With that type it could be either TriX or RDF/XML, and the specs don’t say much about how this should be handled.

Adding a parameter to the URL would be problematic: you wouldn’t know whether it should be used for dereferencing the file and stripped off, or whether it was actually part of the URL and should be passed along. Unfortunately the endpoint isn’t providing enough information to parse the file, and some guessing would be involved.

…bash to the rescue

stardog data add --named-graph mydb <(curl -s
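A hypothetical completion of that one-liner (the database name, graph URI, and URL below are placeholders, not values from this thread): process substitution hands curl’s output to the CLI as a local file, so Stardog never sees the server’s misleading Content-Type. The last line demonstrates the mechanism itself so the snippet runs standalone:

```shell
#!/usr/bin/env bash
# Hypothetical completion (db name, graph URI, and URL are placeholders):
#   stardog data add --named-graph urn:mygraph mydb <(curl -s "http://example.org/data.rdf")
#
# Process substitution <(...) exposes a command's output as a readable file,
# which is why the CLI can ingest it like any local file:
wc -l < <(printf 'a\nb\nc\n')
```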

I messed around a while ago with writing a custom function to allow you to make http calls. Something along the line of

SELECT (url:get(?url) AS ?file) WHERE { ... }

Where it would hash the file and return a local file: URL like file:///home/myhome/stardog-files/4A/B3/983ABF382

I’m not sure how good an idea it was but it was fun to play with. If anyone thinks it’s not a dumb idea I’d be happy to polish it up.
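The layout in that example path can be sketched like this (the base directory and the two-level split are assumptions read off the example above, and sha1sum stands in for whatever hash the function actually used — hash the payload, then fan the digest out into subdirectories so no single directory grows too large):

```shell
#!/usr/bin/env bash
# Content-addressed naming sketch (base dir and split are assumptions):
# hash the payload, then split the digest into two-character subdirectories.
hash=$(printf 'example payload' | sha1sum | cut -c1-40)
echo "file:///home/myhome/stardog-files/${hash:0:2}/${hash:2:2}/${hash:4}"
```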

Thanks a lot for all the support!
I was able to upload the file using a shell command.

I still think the error message could be more helpful.
But otherwise you can consider the case closed.

Is there any other HTTP API to upload such data (besides SPARQL Update)?
I’d like to avoid having to use the command line, so that I can do it from a remote client application.

Otherwise I’d download it into memory, transform it, and upload it using INSERT DATA.

You can POST the data over HTTP via our API:

That’s great, exactly what I was looking for.
I thought I read somewhere that this HTTP API was removed or is deprecated.
I’m new to Stardog. I might have mixed something up.

This worked fine. Thanks again for the pointer.

I can load the data into the default graph, but haven’t yet been able to load it into a named graph:

curl --trace-ascii trace.txt \
     --include \
     --request POST \
     --header "Accept: text/plain" \
     --header "Content-Type: application/rdf+xml" \
     --data-binary "@data/qudt_dimensionalunit.rdf" \
     -u admin:admin \

This doesn’t seem to be the correct way to pass the graph-uri.
Any hint what I’m doing wrong?


did the trick.
I found this quite difficult to decipher from the format in which the URL is shown in the docs.
A concrete example without the {} replacements would have been very useful.
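For anyone finding this later, a sketch of the working shape (reconstructed, not my exact command — the graph URI and transaction id are placeholders; only the graph-uri parameter name comes from this thread). The target graph goes in a percent-encoded graph-uri query parameter on the add endpoint:

```shell
# Percent-encode the target graph URI (placeholder value):
graph='urn:test:mygraph'
encoded=$(python3 -c 'import sys, urllib.parse; print(urllib.parse.quote(sys.argv[1], safe=""))' "$graph")
echo "$encoded"   # prints: urn%3Atest%3Amygraph
# Then pass it as the graph-uri query parameter, e.g. with the tx-based
# add endpoint (<txid> is whatever your begin-transaction call returned):
#   curl -u admin:admin -X POST \
#        -H "Content-Type: application/rdf+xml" \
#        --data-binary "@data/qudt_dimensionalunit.rdf" \
#        "http://localhost:5820/test1/<txid>/add?graph-uri=$encoded"
```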

We are currently in the process of updating/redoing the apiary docs; however, this is the standard way Apiary denotes query params, as stated in their docs.

For a simple load such as this, we also support the SPARQL Graph Store Protocol, so instead of beginning a tx, adding the data, committing the tx, you could simply PUT http://localhost:5820/test1?graph= with your Content-Type and data set appropriately.
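Spelled out concretely, with the database name from this thread (the graph URI and file path are placeholders; run the commented curl against your own server):

```shell
# Graph Store Protocol PUT (db name from this thread; graph and file are placeholders):
db=test1
graph='urn:test:mygraph'
url="http://localhost:5820/${db}?graph=${graph}"
echo "$url"   # prints: http://localhost:5820/test1?graph=urn:test:mygraph
# curl -u admin:admin -X PUT \
#      -H "Content-Type: application/rdf+xml" \
#      --data-binary "@data/qudt_dimensionalunit.rdf" \
#      "$url"
```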

I see. Thanks a lot for all the pointers.
It’s a lot of info to process at once at the beginning :wink:

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.