I am looking to programmatically download the model (specifically, what's shown in the 'Text Editor' in Stardog Studio, as in the screenshot below). Is there a way to do this through pystardog or the HTTP API?
(Ultimately, what I am interested in is downloading the Turtle file generated by clicking the download button shown in that screenshot.)
Using the "Generate model" GET endpoint from the HTTP API gets me close but it doesn't include the SHACL constraints at the bottom of the text editor.
I have scoured the API Reference website a few times now; this feature doesn't appear to be available as of Stardog 9.1.0.
https://stardog-union.github.io/http-docs/
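For reference, here is a minimal sketch of roughly how I am calling it. I am assuming the GET /{db}/model path and the output parameter here; the exact path and parameter names are in the API reference linked above, so treat these as assumptions.

import requests

endpoint = "https://sd-######.stardog.cloud:5820"
database = "my_database"

# Assumed "Generate model" call: GET /{db}/model with an output format parameter.
# Confirm the path and parameter names against the API reference.
resp = requests.get(
    f"{endpoint}/{database}/model",
    params={"output": "text"},
    auth=("someuser", "somepassword"),
)
resp.raise_for_status()
print(resp.text)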
Hi Jamie.
I think what you're looking for is to export a named graph contained in a database which is what that "download" button is doing in Studio.
REST API: API Reference | ReDoc
Note: I think what may be missing from this documentation is the graph_uri query parameter, which will allow you to export just a named graph from the database instead of the entire database.
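For example, here is a rough HTTP sketch of exporting just a named graph. The graph-uri parameter spelling and the Accept header are my assumptions here, so please double-check them against the API reference:

import requests

endpoint = "https://sd-######.stardog.cloud:5820"
database = "my_database"
model_uri = "urn:example:starwars:model"

# Assumed export call: GET /{db}/export with a graph-uri query parameter.
# The Accept header controls the serialization (Turtle here).
resp = requests.get(
    f"{endpoint}/{database}/export",
    params={"graph-uri": model_uri},
    headers={"Accept": "text/turtle"},
    auth=("someuser", "somepassword"),
)
resp.raise_for_status()
print(resp.text)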
pystardog docs: Modules — pystardog documentation
Below is an example using pystardog that shows how to export a graph/model in two different ways:
- by streaming the contents of the model/graph back (this avoids reading the graph into memory all at once, which is useful for large graphs)
- by getting the contents of the graph back all at once
import stardog

connection_details = {
    "endpoint": "https://sd-######.stardog.cloud:5820",
    "username": "someuser",
    "password": "somepassword",
    "database": "my_database",
}

model_uri = "urn:example:starwars:model"

# streams an export of the graph/model serialized as Turtle
# a generator that yields bytes making up the graph/model is returned
# avoids reading the graph/model into memory all at once for large graphs/models
# a `chunk_size` parameter is available to control the amount of bytes to read per chunk when streaming
model = ""
with stardog.Connection(**connection_details) as conn:
    with conn.export(
        graph_uri=model_uri, stream=True, content_type=stardog.content_types.TURTLE
    ) as stream:
        for chunk in stream:
            model += chunk.decode("utf-8")

print(f"---MODEL (STREAMED)---\n{model}\n\n")
# exports the graph/model serialized as Turtle
# sends the whole response (graph/model) back as bytes
with stardog.Connection(**connection_details) as conn:
    model = conn.export(graph_uri=model_uri, content_type=stardog.content_types.TURTLE)

print(f"---MODEL---\n{model.decode('utf-8')}")
Hope that helps!
Cheers,
Noah
Hi Noah,
Thank you. With your help, I was able to export a named graph from a database.
I am now attempting to import a named graph into a database, so far unsuccessfully. Below is the code I am using, but the named graph doesn't populate in Stardog Studio under the target database.
source_model = ''
with stardog.Connection(database=source_database, **conn_details) as conn:
    with conn.export(graph_uri=source_model_uri, stream=True, content_type=stardog.content_types.TURTLE) as stream:
        for chunk in stream:
            source_model += chunk.decode('utf-8')

target_model = stardog.content.Raw(source_model, content_type=stardog.content_types.TURTLE)

with stardog.Connection(database=target_database, **conn_details) as conn:
    conn.begin()
    conn.add(target_model, graph_uri=target_model_uri, server_side=False)
    conn.commit()
Is there something I am missing or doing incorrectly?
I found a way to load the named graph into the "Models tab" in Stardog Studio.
Is there a programmatic way to go about this?
I haven't found a programmatic way to go about this. My solution will likely be to use the UI to load it into the Models tab.