TTL or CSV data size to load via Studio

Dear Stardog Engineering Support Team,

So, there are many ways to load CSV and JSON data into Stardog: the CLI, curl, Studio, Designer, etc.

I am using a cloud instance and would like to build a robust ETL pipeline. My main constraint: no Java … which means no Stardog CLI.

So, I could use curl for the 80+ CSVs. I see the documentation here: Adding Data | Stardog Documentation Latest

  • But something tells me this will be an error-prone effort, and limiting in terms of size or other functionality that the CLI provides but that is not available via curl. Can you please provide your guidance? For scripting it myself, I am considering something like the sketch below.
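
To make the curl route less error-prone, my idea is to script the loop instead of hand-writing 80+ requests. A minimal sketch, assuming the pystardog Python client (which wraps the same HTTP API that curl would hit), a database named mydb, and TTL files already converted from the CSVs; the endpoint, credentials, and paths are placeholders:

```python
import glob

import stardog

# Placeholder connection details for a Stardog Cloud instance.
conn_details = {
    'endpoint': 'https://cloud.stardog.com:5820',
    'username': 'admin',
    'password': 'admin',
}

# Load every converted TTL file in one transaction, so a failure
# midway leaves the database unchanged.
with stardog.Connection('mydb', **conn_details) as conn:
    conn.begin()
    for path in sorted(glob.glob('converted/*.ttl')):
        conn.add(stardog.content.File(path))
    conn.commit()
```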

Loading the same 80+ CSVs manually via Studio or Designer would be a pain, right? Or shall I transform my data.csv files to data.ttl and load them manually via Studio? (A conversion sketch follows the bullet below.) Again:

  • Are there size or other limitations and problems down the road with such an approach?
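
If I go the CSV-to-TTL route, the transformation itself is easy to script. A minimal sketch, assuming the rdflib Python library and hypothetical 'id' and 'name' columns in data.csv (the namespace is a placeholder too):

```python
import csv

from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

EX = Namespace('http://example.com/')  # placeholder namespace

g = Graph()
g.bind('ex', EX)
with open('data.csv', newline='') as f:
    for row in csv.DictReader(f):
        subject = EX['item/' + row['id']]        # assumes an 'id' column
        g.add((subject, RDF.type, EX.Item))
        g.add((subject, EX.name, Literal(row['name'])))  # assumes a 'name' column

g.serialize(destination='data.ttl', format='turtle')
```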

Thanks,
Radu

Hi Radu,

Were you successful before with the CSV JDBC driver (http://csvjdbc.sourceforge.net/doc.html)?

If so, using Designer you can create your JDBC connection and virtual graph(s). Then in Studio you can materialise your virtual graphs. There should be no size (within reason) or other limitations.
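
If you later want to script the materialisation rather than clicking through Studio, it can also be done with a SPARQL Update. A rough sketch, assuming pystardog and a virtual graph named virtual://csv_vg (the virtual graph and target graph names are placeholders):

```python
import stardog

with stardog.Connection('mydb', endpoint='https://cloud.stardog.com:5820',
                        username='admin', password='admin') as conn:
    # COPY replaces the contents of the target named graph with the
    # triples exposed by the virtual graph.
    conn.update('COPY <virtual://csv_vg> TO <urn:materialized>')
```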

Alternatively if you can create .ttl data, you can load using Studio. How many triples are you loading?

Thanks

Joe

Joe,

Thanks for providing your answers. I have never tried the CSV JDBC driver, and that would require Java on my end …

The 1.2MB data.ttl import via Studio worked though. Thank you.

Thanks,
Radu

Also, since my hacky ETL pipeline relies on TTL import via Studio (for now), my subsequent questions are:

  • How should I update data.ttl?
  • Should I unload/delete it first and then load the new data.ttl?
  • What if my model.ttl changes?

I need some best practices and guidelines; the drop-and-reload sketch below is what I have in mind for now. Meanwhile, I am getting Java and the Stardog CLI …
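
For reference, here is the drop-and-reload pattern I have in mind. A sketch, assuming pystardog again and that the data and model live in separate named graphs (so a model.ttl change only means reloading the model graph); the graph names are placeholders:

```python
import stardog

with stardog.Connection('mydb', endpoint='https://cloud.stardog.com:5820',
                        username='admin', password='admin') as conn:
    conn.begin()
    # Replace only the data graph; the model graph is untouched.
    conn.clear('urn:data')
    conn.add(stardog.content.File('data.ttl'), graph_uri='urn:data')
    conn.commit()
```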