Our company has been using the above tools for a long time. These days most of our applications use dynamic SQL, so the data inside DETECTOR and SUBSYSTEM Analyzer is growing fast. We extract that data with CA standard processes out of the internal structures into Db2 tables, where we keep it for a period of one month. Today we only have some SQL queries to filter valuable information out of those Db2 tables.
Our first questions are:
- what are other customers (with very intensive usage of dynamic SQL) doing with that data?
- do you use any further CA tools to derive trends / information from that data?
- do you use other tools / methods to aggregate that data in some way to get information out of it?
I would be very interested in your feedback!
We will run this past product management. We would also be interested in hearing from other community members regarding this post.
We define our datastores to hold one month's data – we then create multiple datastores for history:
Datastor – live
Datasth1 – previous month
Datasth2 – 2 months back
Datasth3 – 3 months back
Each month we do the following:
Rename sth3 datasets to temp
Rename sth2 datasets to sth3
Rename sth1 datasets to sth2
Rename temp datasets to sth1
Then use Detector copy to copy live to sth1
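The monthly steps above amount to a cyclic rename followed by a copy. A minimal sketch of that logic (illustrative Python only – the real process uses z/OS dataset renames and the Detector copy utility, and the datastore names/month values here are made up):

```python
def monthly_rotation(ds):
    """Shift each history datastore one month back, then refresh sth1.

    ds maps a datastore name to the month of data it currently holds.
    """
    ds["temp"] = ds.pop("datasth3")      # Rename sth3 datasets to temp
    ds["datasth3"] = ds.pop("datasth2")  # Rename sth2 datasets to sth3
    ds["datasth2"] = ds.pop("datasth1")  # Rename sth1 datasets to sth2
    ds["datasth1"] = ds.pop("temp")      # Rename temp datasets to sth1
    # Detector copy overwrites the recycled sth1 datasets from live
    ds["datasth1"] = ds["datastor"]
    return ds

# hypothetical starting state just before the monthly run
stores = {"datastor": "2024-05", "datasth1": "2024-04",
          "datasth2": "2024-03", "datasth3": "2024-02"}
monthly_rotation(stores)
# sth1 now holds the freshly copied live month; older months shifted back
```

The oldest month (the original sth3 contents) drops off each cycle, so four datastores always give the live month plus three months of history.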
Our performance team runs daily extracts to look for anomalies, and when needed, pulls data from the “archives”
This seems to have worked for us
Technology Architect, Database Infrastructure Services
Technology Solution Services
123 East Main Street
Louisville, KY 40202
(502) 476-2538 or 407-7266
Thank you very much for your comments! Our Db2 transaction workload is dominated by dynamic SQL. As a result, the accumulated data in DETECTOR grows very fast, and it is also not easy to identify "one statement" as the root cause of a problem. Do you have similar problems?
Regarding dynamic SQL, most customers offload the Detector metrics using the "NORMALIZE" parameter so that "identical" statements are aggregated. This allows them to handle dynamic SQL pretty much like static SQL thanks to the contoken, a hash routine that replaces literals and constants with "?".
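A rough sketch of the idea behind literal normalization and the contoken (illustrative Python, not CA's actual NORMALIZE/contoken algorithm – the regexes, hash choice, and function names are assumptions):

```python
import hashlib
import re

def normalize_sql(stmt):
    """Replace literals with '?' so textually different but logically
    identical dynamic statements collapse into one normalized form."""
    s = re.sub(r"'[^']*'", "?", stmt)        # string literals -> ?
    s = re.sub(r"\b\d+(\.\d+)?\b", "?", s)   # numeric literals -> ?
    return re.sub(r"\s+", " ", s).strip().upper()

def contoken(stmt):
    # a hash of the normalized text serves as the aggregation key
    return hashlib.sha1(normalize_sql(stmt).encode()).hexdigest()[:16]

# two dynamic statements that differ only in their literals
a = "SELECT name FROM emp WHERE id = 42"
b = "select name from emp where id = 7"
# both normalize to the same text, so they share one contoken
# and their metrics aggregate under a single statement entry
```

With this kind of key, millions of one-off dynamic statements collapse into a manageable set of normalized statements that can be trended like static SQL.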
If you have a high number of infrequently executed dynamic statements, you could consider not loading these into the Db2 tables.
In terms of trending, we are working on an MOI/Db2 solution (Mainframe Operational Intelligence) that will analyze static SQL for anomalies, but the first cut doesn't provide support for dynamic SQL. However, you could import the most resource-intensive / most-executed dynamic SQL statements into a spreadsheet and graph them from there.
Hope this helps /Steen Rasmussen.