I believe it will be too difficult to get into a concrete design in this thread.
The final design might best utilize a combination of scripting, customization, and other features.
If I understand the inputs correctly, your input sheet contains:
- The data to identify the ESB and method to execute
- Data identifying a variable number of filters (i.e., 0, 1, or many) to extract from the ESB method response
- Data identifying a variable number of assertions (i.e., 0, 1, or many) that apply to the filtered ESB response elements
- It appears that the filters and assertions for a specific ESB method call can vary based on the input request
Plus, these additional requirements:
- You need to execute the test 1m times -- presumably in parallel using multiple DevTest vUsers for performance and runtime reasons
- You need to record the results of the entire execution in a single, XML formatted result document
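To make the second requirement concrete, the single result document might look something like the following. This shape is purely illustrative -- the element and attribute names are assumptions, not anything DevTest produces on its own:

```xml
<TestRun key="run-20240101-001" total="1000000" passed="999998" failed="2">
  <Result testId="t1" time="1704067200000" status="PASS"/>
  <Result testId="t2" time="1704067200150" status="FAIL"/>
  <!-- ... one Result element per test instance ... -->
</TestRun>
```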
Just "thinking out loud":
- Standardize the definition of each assertion and filter such that they can be programmatically determined and executed. You would need to do this under most circumstances regardless of the implementation.
- Allow a 'driver' test case to consume the ESB selection sheet and build an array of the assertions and filters for the ESB request method -- or use an XML document to provide the definitions (see below)
- Use the subprocess approach in the previous post to initiate whichever ESB contains the method call. Pass as input the request to the method call. Have the subprocess return the response from the ESB method call to the 'driver'.
- Implement a JSR step that calls a custom class that you create (compiled into a JAR), passing the ESB response data and the array of filters and assertions that apply to that specific ESB method call
- Have the custom class implement the behavior to extract the filters and apply the assertions
- Have the custom class return a pass/fail response to the 'driver'
- Have the 'driver' evaluate the pass/fail and commit pass/fail information into a database table for this instance of the test (perhaps set up a unique key for this test run and separate entries using the Java epoch date/time)
- The test 'driver' always branches to 'End the Test' -- no pass/fail is recorded here, as the pass/fail report is generated as part of the tear-down process
- Set up a Staging Document to run the test 1m times with however many instances of vUsers you want to use
- Set up a Test Suite to execute the test. Add a 'Start Up' test case to clear out the database table, and a 'Tear Down' test case that runs at the end of the 1m test run. The tear-down test case queries the database table and produces the XML result for the 1m run. This result is most likely written to the file system in a directory such that previous run data is not overwritten.
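The tear-down step above could assemble the single result document from the rows in the database table. Here is a minimal sketch, assuming a hypothetical result row of (test ID, epoch timestamp, pass/fail); the JDBC query that reads the rows back from the table is omitted:

```java
import java.util.List;

public class TearDownReport {

    // Hypothetical shape of one row read back from the results table
    public static class Result {
        final String testId;
        final long ts;       // Java epoch millis recorded by the 'driver'
        final boolean passed;

        public Result(String testId, long ts, boolean passed) {
            this.testId = testId;
            this.ts = ts;
            this.passed = passed;
        }
    }

    // Build the single XML result document for the run. The element and
    // attribute names here are illustrative, not a DevTest convention.
    public static String toXml(String runKey, List<Result> rows) {
        StringBuilder sb = new StringBuilder();
        sb.append("<TestRun key=\"").append(runKey).append("\">\n");
        for (Result r : rows) {
            sb.append("  <Result testId=\"").append(r.testId)
              .append("\" time=\"").append(r.ts)
              .append("\" status=\"").append(r.passed ? "PASS" : "FAIL")
              .append("\"/>\n");
        }
        sb.append("</TestRun>\n");
        return sb.toString();
    }
}
```

The tear-down test case would then write this string to a file named with the unique run key so earlier runs are never overwritten.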
Perhaps each row on the ESB selection sheet contains instructions in an XML document of some sort, which might make processing easier. This is hypothetical, but a single row in the ESB selection sheet might look something like this:
<ESB_Control_Block>
  <ESB_data>
    <ESB_Type>JCAPS</ESB_Type>
    <ESB_QueueName>/myJcapsQueue</ESB_QueueName>
    <ESB_Method>getCustomerData</ESB_Method>
    <Assertions>
      <Assertion type="equals" name="fl_field1">SUCCESS</Assertion>
      <Assertion type="numequals" name="fl_field_count">3</Assertion>
    </Assertions>
    <Filters>
      <Filter type="xpath" name="fl_field1">/some/data/text()</Filter>
      <Filter type="xpath" name="fl_field_count">count(/some/data)</Filter>
    </Filters>
    <input_request>HTML_ENCODED_REQUEST_TO_ESB_HERE_IF_DATA_IN_XML_FORMAT</input_request>
  </ESB_data>
</ESB_Control_Block>
The custom class can iterate this XML to determine the assertions and filters to apply. The attributes included with each element provide instructions as to how the class extracts and evaluates the response content.
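That iteration could be sketched as follows. This is one possible implementation, assuming the element and attribute names from the example above and only the two assertion types shown ("equals" and "numequals"):

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

// Hypothetical custom class the JSR step would call: it walks the
// <Filters> and <Assertions> sections of the control block, runs each
// XPath filter against the ESB response, then checks the assertions
// against the filtered values by name.
public class ControlBlockEvaluator {

    public static boolean evaluate(String controlBlockXml, String esbResponse)
            throws Exception {
        Document ctrl = parse(controlBlockXml);
        Document resp = parse(esbResponse);
        XPath xpath = XPathFactory.newInstance().newXPath();

        // Run each filter and keep the extracted value under its name
        java.util.Map<String, String> values = new java.util.HashMap<>();
        NodeList filters = ctrl.getElementsByTagName("Filter");
        for (int i = 0; i < filters.getLength(); i++) {
            Element f = (Element) filters.item(i);
            values.put(f.getAttribute("name"),
                       xpath.evaluate(f.getTextContent(), resp));
        }

        // Apply each assertion to its filtered value; any miss fails the test
        NodeList asserts = ctrl.getElementsByTagName("Assertion");
        for (int i = 0; i < asserts.getLength(); i++) {
            Element a = (Element) asserts.item(i);
            String actual = values.get(a.getAttribute("name"));
            String expected = a.getTextContent();
            String type = a.getAttribute("type");
            if (actual == null) return false;
            if ("equals".equals(type) && !actual.equals(expected)) return false;
            if ("numequals".equals(type)
                    && Double.parseDouble(actual) != Double.parseDouble(expected))
                return false;
        }
        return true;
    }

    private static Document parse(String xml) throws Exception {
        return DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
    }
}
```

The boolean it returns is what the 'driver' would commit to the database table as pass/fail. Additional assertion or filter types would just become new branches keyed off the type attribute.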