Suppose a data extraction that comes from a fairly complex query involving many table relations. The DB extract weighs 5 MB or more in plain text (a CSV generated by a SQL client).
How feasible is it to use SOAP and a Clarity web service for massive data extraction?
I know that XOG write files have to be chunked when they are quite big, so I suppose the same limitation applies to a Clarity web service.
Even though this depends on the environment, hardware, and configuration, what do you think about the following points?
The possibility of serving this amount of data (probably more than 10 MB including XML tags)
Impact on Clarity performance
Impact on Java heap space
I'd chunk files at 3 MB max...
XOGging consumes APP JVM memory, so yes, it will affect performance if you don't have a cluster or a second APP (one app dedicated to XOGging).
Also, if you don't assign enough memory, it will time out without completing.
My suggestion is to test in a lower environment and monitor APP JVM consumption (assign at least 2 GB for XOGging). You can easily do that via the "security.caches" URL, refreshing the screen to watch the memory percentage update.
Also benchmark how long it takes.
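To illustrate the 3 MB chunking suggestion, here is a minimal sketch (Python) that groups record elements of a large XOG write file into chunks that each stay under a size limit. The <Project> element name and the record list are assumptions for illustration; a real XOG write file nests its records under header and content elements.

```python
from xml.etree import ElementTree as ET

def chunk_records(records, max_bytes=3 * 1024 * 1024):
    """Group XML record elements into chunks whose combined serialized
    size stays under max_bytes. A single oversized record still gets
    its own chunk rather than being dropped."""
    chunks, current, size = [], [], 0
    for rec in records:
        rec_bytes = ET.tostring(rec, encoding="utf-8")
        # Start a new chunk when adding this record would exceed the limit
        if current and size + len(rec_bytes) > max_bytes:
            chunks.append(current)
            current, size = [], 0
        current.append(rec)
        size += len(rec_bytes)
    if current:
        chunks.append(current)
    return chunks

# Hypothetical usage: split fake <Project> records into small chunks
records = [ET.Element("Project", id=str(i)) for i in range(5)]
chunks = chunk_records(records, max_bytes=40)
```

Each chunk would then be wrapped in its own XOG envelope and written separately, so no single request blows out the APP JVM.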
We use Clarity web services to extract large amounts of data (creating multiple XLSX spreadsheets of ~1 MB, an Access database of ~10 MB, and interfaces to other systems).
To reduce the payload size, I typically have one query that gives me a list of IDs for the items I want to extract. Another query gives me the details for a given record ID or set of IDs. In the program that uses the Clarity web services, I reconstitute the response payloads into the desired format.
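The two-query pattern above could be sketched like this (Python; `list_ids_query` and `fetch_details` are hypothetical stand-ins for the actual Clarity Query web-service calls, not real API names):

```python
def batched(ids, batch_size):
    """Yield successive slices of the ID list."""
    for i in range(0, len(ids), batch_size):
        yield ids[i:i + batch_size]

def extract_all(list_ids_query, fetch_details, batch_size=100):
    """Two-step extraction: first get all IDs with a cheap query,
    then pull details in small batches so each SOAP response
    payload stays small."""
    ids = list_ids_query()                    # first query: IDs only
    results = []
    for batch in batched(ids, batch_size):
        results.extend(fetch_details(batch))  # second query: details per batch
    return results
```

Tuning `batch_size` lets you trade the number of round trips against the size of each response, which keeps the server-side memory footprint predictable.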