We are considering using the HTTP API to build ETL pipelines in Stardog (we prefer to avoid NiFi, for internal reasons). According to the documentation, the API supports atomic insertion and update of data, which suits us, but we wonder how it behaves under concurrency. Can we safely run several threads of atomic inserts/updates against the same database? Or could parallel inserts run into deadlocks or data corruption?
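To make the question concrete, here is a minimal sketch of the pattern we have in mind: each worker thread performs one independent begin/add/commit transaction via the HTTP API. The server URL, database name (`etl_db`), and credentials are placeholders, not from any real deployment; the endpoint paths follow the transaction-based pattern described in the Stardog HTTP API docs.

```python
# Sketch of parallel atomic inserts against Stardog's HTTP API.
# Hypothetical server, database name, and credentials -- adjust to taste.
import base64
import concurrent.futures
import urllib.request

STARDOG = "http://localhost:5820"
DB = "etl_db"  # hypothetical database name
AUTH = base64.b64encode(b"admin:admin").decode()

def build_turtle(rows):
    """Render (subject, predicate, object-literal) rows as Turtle."""
    return "\n".join(f'<{s}> <{p}> "{o}" .' for s, p, o in rows)

def atomic_insert(rows):
    """One atomic insert: begin a transaction, add data, commit.
    Each call is an independent transaction; running many of these
    from separate threads is exactly the concurrency we ask about."""
    def post(path, body=b"", ctype="text/plain"):
        req = urllib.request.Request(
            f"{STARDOG}{path}", data=body, method="POST",
            headers={"Authorization": f"Basic {AUTH}", "Content-Type": ctype},
        )
        with urllib.request.urlopen(req) as resp:
            return resp.read().decode()

    tx = post(f"/{DB}/transaction/begin")
    post(f"/{DB}/{tx}/add", build_turtle(rows).encode(), "text/turtle")
    post(f"/{DB}/transaction/commit/{tx}")

def run_parallel(batches, workers=4):
    """Fire one atomic insert per batch from a thread pool."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(atomic_insert, batches))
```

The question, then, is whether `run_parallel` is safe as written, or whether concurrent transactions on the same database can deadlock or corrupt data.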
Thank you in advance.