Service Virtualization

  • 1.  ESB integration

    Posted Nov 27, 2017 12:53 AM

    How can I communicate between multiple ESBs using a DevTest subprocess, or is there any other way to do it in DevTest 9.5?



  • 2.  Re: ESB integration

    Posted Nov 27, 2017 10:27 AM

    I do not understand what you mean by 'communicate between multiple ESBs'.

    Can you describe the architecture of the environment a bit further and provide an example? 

    1) Do you need DevTest to submit transactions into the ESB? (e.g., act as the System-Under-Test)

    2) Do you need an approach where DevTest provides virtual responses on behalf of 'provider' applications subscribed to queues or invoked by the ESB?

    3) Are you trying to virtualize certain activities within the ESB? (e.g., virtualize a Tibco BW Process or Java class running inside an ESB)

    4) Is there a particular transport protocol (JMS, HTTP, Java, TCP, etc.) involved?



  • 3.  Re: ESB integration

    Posted Nov 29, 2017 12:02 AM

    Hi Joel,

     

    Thanks for the reply. I just want to create multiple subprocesses for multiple ESBs and run them according to the testing flow. I already know how it will flow, but is it possible to create a regression suite for the same, including 1m test cases?

    We are using multiple protocols according to the ESB service methods that we need to trigger.



  • 4.  Re: ESB integration
    Best Answer

    Posted Jan 04, 2018 11:48 AM

    Yes, it is possible. Keep in mind that we do not know all of your requirements or what processing, if any, is needed after the ESB call.

     

    Your test needs some sort of test driver (perhaps a dataset or a query against a database table) to drive the test.

    The dataset might specify which ESB to call and provide a column containing request payload data. Maybe there is also a column for expected response payload data.

     

    Assertions or some other type of mechanism determine which subprocess to invoke based on the value in the test data.
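    As a rough illustration (not part of the original reply), a scripted step in the driver could surface the dataset's ESB selector so that assertions on that step branch to the matching subprocess. The column name "esbName" is an assumption; "testExec" is the object DevTest exposes to scripted steps:

      // Hypothetical BeanShell/JSR-223 step in the driver test case.
      // Reads the ESB selector column from the current dataset row; assertions on
      // this step's result can then branch to the subprocess for that ESB.
      String esbName = testExec.getStateString("esbName", "UNKNOWN");  // dataset column (assumed name)
      testExec.log("Routing this row to ESB: " + esbName);
      return esbName;  // e.g. "JCAPS" or "TIBCO" -- assert on this value to pick the subprocess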

     

    Each subprocess accepts the payload data as an input parameter, invokes the ESB and maybe evaluates the ESB response. 
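    A minimal sketch of the subprocess side, assuming the driver passes the payload in a property named "requestPayload" and an earlier step in the subprocess stored the ESB reply in "esbRawResponse" (both property names are assumptions):

      // Hypothetical scripted step at the end of the subprocess.
      // The ESB-specific send/receive step(s) (JMS, HTTP, etc.) run before this step.
      String requestPayload = testExec.getStateString("requestPayload", "");  // input from the driver

      // Assumption: a previous step saved the ESB reply under "esbRawResponse".
      String esbReply = testExec.getStateString("esbRawResponse", "");

      // Hand a single "response" value back to the driver, as described above.
      testExec.setStateValue("response", esbReply);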

     

    Some type of response from the subprocess could be used to determine what error / warning conditions should be output. Failing the test causes this instance of the test to stop.

     

    Create a Suite to execute the test 1m times.

    Make the dataset "global" (i.e., uncheck Local).

    If you have 1m rows of data in a table, you can execute the test until the dataset is exhausted and set "At End" to "Execute End the Test".

    If you do have 1m rows of input data, I would consider acquiring the data from an enterprise database table rather than using a CSV file to drive input.

    If you have a limited number of rows, set "At End: Start Over" so the data in the dataset can be reused.

     

    A basic pattern (having no clear understanding of your specific requirements) might look like this:

      

    The subprocess accepts the input and sends "response" back to the driver.

    Perhaps the subprocess has the behavior to determine if the call is successful or not. Maybe the driver test case needs to pass an "expected response" for the comparison.

    Once you decide on what information is passed in the input and response to the subprocess, you can determine how you want the driver to handle error conditions.
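    One possible driver-side check, assuming the subprocess returns its result in "response" and the dataset carries an "expectedResponse" column (both names are hypothetical):

      // Hypothetical scripted step in the driver, placed after the subprocess call.
      // Compares the subprocess result against the expected value from the dataset
      // row and records a simple pass/fail flag for later branching.
      String actual   = testExec.getStateString("response", "");
      String expected = testExec.getStateString("expectedResponse", "");

      boolean passed = expected.equals(actual);
      testExec.setStateValue("rowResult", passed ? "PASS" : "FAIL");
      return passed;  // assert on this result to branch to error handling or fail the test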



  • 5.  Re: ESB integration

    Posted Jan 04, 2018 10:36 PM

    Hi Joel,

    Thanks for the details, but I didn't get the payload option. What do we have to give in that one?

    In my case there are multiple and different types of assertions and filters in the ESB methods for each scenario (or, you could say, for different test cases). Can we put the filters or assertions in a data sheet and fetch them according to the scenario? It would look like this:

     

    I haven't found a way to add filters and assertions automatically using a datasheet or database.

    We are using multiple transport protocols for each ESB. 



  • 6.  Re: ESB integration

    Posted Jan 05, 2018 11:21 AM

    I believe it will be too difficult to get into a concrete design in this thread.

    The final design might best utilize a combination of scripting, customization, and other features.

    If I understand the inputs correctly, your input sheet contains:

    - The data to identify the ESB and method to execute

    - Data identifying a variable number of filters (i.e., 0, 1 or many) to extract from the ESB method response

    - Data identifying a variable number of assertions (i.e., 0, 1, or many) that apply to the filtered ESB response elements

    - It appears that the filters and assertions for a specific ESB method call can vary based on the input request

    Plus, these additional requirements:

    - You need to execute the test 1m times -- presumably in parallel using multiple DevTest vUsers for performance and runtime reasons

    - You need to record the results of the entire execution in a single, XML formatted result document

     

    Just "thinking out loud":

    - Standardize the definition of each assertion and filter such that they can be programmatically determined and executed. You would need to do this under most circumstances regardless of the implementation.

    - Allow a 'driver' test case to consume the ESB selection sheet and create an array of the assertions and filters for the ESB request method -- or use an XML to provide the definition (see below)

    - Use the subprocess approach in the previous post to initiate whichever ESB contains the method call. Pass as input the request to the method call. Have the subprocess return the response from the ESB method call to the 'driver'.

    - Call a custom class passing the ESB response and the array of filters and assertions that apply to that specific ESB method call

    - Implement a JSR step that calls the custom class that you create (compiled in a JAR) passing the ESB response data and array of filters and assertions.

    - Have the custom class implement the behavior to extract the filters and apply the assertions (a sketch of such a class follows the XML example at the end of this post)

    - Have the custom class return a pass/fail response to the 'driver'

    - Have the 'driver' evaluate the pass/fail and commit pass/fail information into a database table for this instance of the test (maybe set up a unique key for this test run and separate entries using a Java epoch date/time; see the JDBC sketch after this list) 

    - The test 'driver' always branches to 'End the Test' - no pass / fail is recorded here as the pass/fail report is generated as part of the tear down process

    - Set up a Staging Document to run the test 1m times with however many instances of vUsers you want to use

    - Set up Test Suite to execute the test. Add a 'Start Up' test to clear out the database table. Add a 'Tear Down' test case that runs at the end of the 1m test run. The tear down test case queries the database table and produces the XML response for the 1m run. This response is most likely written to the file system in a directory such that previous run data is not overwritten.
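    A hedged sketch of the "commit pass/fail information into a database table" idea from the list above; the class, table, and column names are hypothetical, and the JDBC connection details would come from your environment:

      import java.sql.Connection;
      import java.sql.DriverManager;
      import java.sql.PreparedStatement;

      public class ResultWriter {

          // Inserts one row per test instance, keyed by a run id plus a Java epoch
          // timestamp so entries from the same run stay distinct.
          public void writeResult(String jdbcUrl, String user, String password,
                                  String runId, String esbName, boolean passed) throws Exception {
              Connection conn = DriverManager.getConnection(jdbcUrl, user, password);
              try {
                  PreparedStatement ps = conn.prepareStatement(
                      "INSERT INTO test_results (run_id, entry_ts, esb_name, status) VALUES (?, ?, ?, ?)");
                  ps.setString(1, runId);
                  ps.setLong(2, System.currentTimeMillis());  // epoch millis separates entries
                  ps.setString(3, esbName);
                  ps.setString(4, passed ? "PASS" : "FAIL");
                  ps.executeUpdate();
                  ps.close();
              } finally {
                  conn.close();
              }
          }
      }

    The tear-down test case could then query this table for the run id and build the single XML result document for the 1m run.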

     

    Perhaps, each row on the ESB selection sheet contains instructions in an XML document of some sort which might make processing easier.  This is hypothetical, but a single row in the ESB selection sheet might look something like this:

    <ESB_Control_Block>
      <ESB_data>
        <ESB_Type>JCAPS</ESB_Type>
        <ESB_QueueName>/myJcapsQueue</ESB_QueueName>
        <ESB_Method>getCustomerData</ESB_Method>
        <Assertions>
          <Assertion type="equals" name="fl_field1">SUCCESS</Assertion>
          <Assertion type="numequals" name="fl_field_count">3</Assertion>
        </Assertions>
        <Filters>
          <Filter type="xpath" name="fl_field1">/some/data/text()</Filter>
          <Filter type="xpath" name="fl_field_count">count(/some/data)</Filter>
        </Filters>
        <input_request>HTML_ENCODED_REQUEST_TO_ESB_HERE_IF_DATA_IN_XML_FORMAT</input_request>
        <expected_response>HTML_ENCODED_EXPECTED_RESPONSE_HERE_IF_NEEDED</expected_response>
      </ESB_data>
    </ESB_Control_Block>

    The custom class can iterate this XML to determine the assertions and filters to apply. Attributes included with each provide instructions as to how the class extracts and evaluates the response content.
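
    A rough Java sketch of such a class, assuming the control block format above and standard JAXP/XPath APIs (the class and method names are hypothetical, and only "xpath" filters with "equals"/"numequals" assertions are handled):

      import java.io.StringReader;
      import java.util.HashMap;
      import java.util.Map;
      import javax.xml.parsers.DocumentBuilderFactory;
      import javax.xml.xpath.XPath;
      import javax.xml.xpath.XPathConstants;
      import javax.xml.xpath.XPathFactory;
      import org.w3c.dom.Document;
      import org.w3c.dom.Element;
      import org.w3c.dom.NodeList;
      import org.xml.sax.InputSource;

      public class EsbResponseEvaluator {

          // Parses the ESB_Control_Block XML, applies each <Filter> XPath to the ESB
          // response, then evaluates the matching <Assertion> values.
          // Returns true only if every assertion passes.
          public boolean evaluate(String controlBlockXml, String esbResponseXml) throws Exception {
              Document control = parse(controlBlockXml);
              Document response = parse(esbResponseXml);
              XPath xpath = XPathFactory.newInstance().newXPath();

              // 1. Run every filter against the ESB response, keyed by its "name".
              Map<String, String> filtered = new HashMap<String, String>();
              NodeList filters = (NodeList) xpath.evaluate("//Filters/Filter", control, XPathConstants.NODESET);
              for (int i = 0; i < filters.getLength(); i++) {
                  Element f = (Element) filters.item(i);
                  filtered.put(f.getAttribute("name"), xpath.evaluate(f.getTextContent(), response));
              }

              // 2. Apply every assertion to the filtered values.
              NodeList asserts = (NodeList) xpath.evaluate("//Assertions/Assertion", control, XPathConstants.NODESET);
              for (int i = 0; i < asserts.getLength(); i++) {
                  Element a = (Element) asserts.item(i);
                  String actual = filtered.get(a.getAttribute("name"));
                  String expected = a.getTextContent();
                  boolean ok;
                  if ("numequals".equals(a.getAttribute("type"))) {
                      ok = actual != null && Double.parseDouble(actual) == Double.parseDouble(expected);
                  } else {  // default: string "equals"
                      ok = expected.equals(actual);
                  }
                  if (!ok) {
                      return false;  // the first failed assertion fails this row
                  }
              }
              return true;
          }

          private Document parse(String xml) throws Exception {
              return DocumentBuilderFactory.newInstance().newDocumentBuilder()
                      .parse(new InputSource(new StringReader(xml)));
          }
      }

    In the driver, a JSR-223 step could instantiate this class (loaded from a JAR on the DevTest classpath), pass in the control block for the current row along with the subprocess response, and return the boolean so the driver can branch or write the pass/fail row to the database table.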