Hello, I'm currently attempting to store a 37 MB JSON payload in the Store to Cache assertion. However, when I attempt to do this, I receive an error message stating:
Unable to read stream: the specified maximum data size limit would be exceeded
I've adjusted the Maximum entry size property on the Store to Cache assertion to accommodate this, but I still receive the error message above. I assume there is a cluster-wide property that I need to configure to override this. Could someone please tell me what this is?
I suspect the answer to this question here will help: https://communities.ca.com/thread/241769078-message-size-limit
Specifically, I think the io.xmlPartMaxBytes cluster-wide property (CWP) may be the trick in this case.
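For reference, that CWP takes a value in bytes. A quick sanity check on the number you'd need for a ~37 MB payload (the 25% headroom factor below is just an illustrative choice, not a gateway recommendation):

```python
# Compute a byte value suitable for the io.xmlPartMaxBytes CWP.
# The payload in question is ~37 MB; headroom is an arbitrary cushion.
payload_mb = 37
headroom = 1.25  # illustrative ~25% margin for growth

needed_bytes = int(payload_mb * 1024 * 1024 * headroom)
print(needed_bytes)  # → 48496640
```

Any value at or above the actual expanded payload size should do; the point is simply that the default is far below 37 MB.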
Additionally, I'm not certain yet, but my gut tells me that if the above doesn't work (the data from the answer on that other question, or the CWP above), the message may be compressed on the wire but expanded when stored into the cache, and that expanded/original size may exceed the configured limit. I know the Gateway does this when running messages through threat protection assertions or enforcing size limits in policy, but I'm not entirely sure whether that behaviour also applies to the Store to Cache assertion. I'll need to double-check that part.
I hope the above helps point you in the right direction.
Thanks Dustin, I will see if setting this CWP does the trick.
This did the trick, thanks again!
Dear mrutherford,
The maximum size for Store to Cache is 1 GB,
but the error you got is due to what DustinDauncey said: the request/response payload has a separate restriction, io.xmlPartMaxBytes, which is 2.5 MB by default.
That said, I have to remind you that caching large data in memory is not good practice; you should make sure the Gateway has a large enough heap size configured.
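A rough back-of-envelope for that heap warning (all numbers here are illustrative assumptions, not Gateway defaults): multiply the entry size by the number of entries you expect to hold, plus a factor for transient copies made while the payload is processed.

```python
# Back-of-envelope heap budget for caching large payloads in memory.
# Every figure below is a hypothetical assumption for illustration.
entry_size_mb = 37   # size of one cached payload
max_entries = 10     # hypothetical number of concurrent cache entries
overhead = 2.0       # assume ~2x for transient copies during processing

cache_heap_mb = entry_size_mb * max_entries * overhead
print(f"budget roughly {cache_heap_mb:.0f} MB of heap for the cache alone")
```

Even a modest number of 37 MB entries consumes heap quickly, which is why storing payloads this large in an in-memory cache deserves a second look.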
Dustin's suggestion ended up resolving the issue. Thanks!