We're using Excel data files to drive our Application Test scripts. Data changes are causing a lot of churn and file management overhead.
I'm investigating moving our data sheets to MongoDB to centralize our test collateral. So far, I don't see any blockers, but I wanted to check if someone has already done this. I've imported a sample data sheet into Mongo and am looking for the best way to query the data and use it in our existing scripts with minimal changes.
For context, our existing data sheet/script approach is as follows:
- data sheet rows get loaded by the script; each row has columns for things like host, port, method type (for REST calls), and the logging path for HP ALM.
- each of the cells becomes a variable the script uses to make the call and log the result.
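For illustration, a row in such a data sheet might look something like this (column names and values below are made up, based on the columns described above):

```
host              port  method  almPath
api.example.test  8443  POST    ALM/ProjectX/TestSet1
```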
I've looked at using the MongoDB Java classes and writing a custom extension; both seem pretty straightforward, but I'm looking for the simplest and most efficient method possible. I'd welcome input from anyone with similar experience.
Sorry, I haven't done this, but I'd love to find out if anyone can provide guidance here. I know there was a similar question about MongoDB on here last year...
I did a quick search both on this community and the Test Data Manager community for "Mongo" but wasn't able to find any previous questions that might help answer Jason's question. Is there another term I could look for to try and find that question?
Melanie - oh...I did not answer this from the TDM perspective but from DevTest
Jason - can you please confirm?
Hi Koustubh and Melanie,
My question is from the DevTest perspective: how to consume data in a MongoDB from a DevTest automated script. I'm able to connect from DevTest to MongoDB, but I'm looking for guidance on retrieving data and using that data inside a DevTest automated script.
Writing your custom class as a dataset is probably the only option available. Curious, though: why go for a NoSQL DB? Importing your data into a regular RDBMS would let you use the DB Dataset out of the box, without having to take care of iterating etc. I'm sure you have reasons to go this route (huge data volume, perhaps? Or no data relationships across your spreadsheets?)
Another option, if you do choose to go the NoSQL route, might be something like Apache Drill (schema-free SQL for Hadoop, NoSQL, and cloud storage), which provides a SQL-like interface over NoSQL stores. You can import the driver into your lib folder and treat it just like a regular RDBMS.
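If anyone wants to try the Drill route, the DevTest-side setup might look roughly like the fragment below. The host, port, and file path are placeholders, and I haven't verified this inside DevTest; it just shows the general shape of pointing a JDBC data set at Drill:

```
# Hypothetical JDBC data set settings for Apache Drill (values are placeholders)
Driver class: org.apache.drill.jdbc.Driver
JDBC URL:     jdbc:drill:drillbit=drill-host:31010
Query:        SELECT host, port, method FROM dfs.`/testdata/env.json`
```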
Thanks for the reply. We want to import parameterized JSON request files, all of the Excel data sheets that drive our automation, and several other file types. NoSQL seemed like the best approach to quickly migrate and centralize our current data files. We do have a pretty large data volume and wanted a solution that scales well and is easy to query. Writing the custom solution is what I was expecting; I was just trying to see if someone came up with an alternate approach or was able to do something clever we hadn't tried.
The most straightforward way of interacting with MongoDB might be to ignore its NoSQL capabilities and treat it as a JDBC data source, using a JDBC driver so you can use all the normal DevTest data set capabilities. I haven't tried it, but I assume a good place to start would be GitHub - erh/mongo-jdbc: JDBC Driver for MongoDB
I had the same idea initially and tried a number of different JDBC driver packages; none worked inside DevTest to connect to MongoDB via any JDBC UI. All of these packages are usable as classes via Dynamic Java Execution or JSR-223 steps, so that's where I'm currently investigating. If you're able to get an actual JDBC driver connection, I'd love to hear how you got it to work, because it has eluded me.
Thanks for the response,
I now have a reliable and working solution using CA DevTest & MongoDB.
I used the MongoDB Java Driver APIs and wrote custom Java/Beanshell code in DevTest scripted assertion steps. Basically, the flow is:
- create a MongoDB session and save the session object for reuse in later steps
- query a collection, iterate over the results, and set them as DevTest variables
I took the standard web-services Excel-based solution, deconstructed it into functional CSVs, and imported those into MongoDB as collections (like tables). I query each collection for things like environment info (host, port, etc.), test cases, test data, and request payloads (I import Postman collections right into MongoDB; it supports JSON natively). Each call builds on the previous data; once I have all the call info I need, I make the service call, do the validation, and log the results back to MongoDB. An independent service picks up the results from MongoDB and logs them where/as needed (this helps manage server load for DevTest and HP ALM, in my case).
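The "query and set results as DevTest variables" step might be sketched roughly as below. This is illustrative only: the `envDoc` map stands in for a document fetched with the MongoDB driver (e.g. `collection.find(...).first()`), and the `testExec` map stands in for DevTest's `testExec` property store, neither of which is available outside a running test; the point is just the iterate-and-set-variables pattern.

```java
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Map;

public class MongoFlowSketch {
    public static void main(String[] args) {
        // Stand-in for a document fetched from e.g. an "environments" collection.
        // With the real driver this would come from collection.find(eq(...)).first().
        Map<String, Object> envDoc = new LinkedHashMap<>();
        envDoc.put("host", "api.example.test"); // placeholder values
        envDoc.put("port", 8443);
        envDoc.put("method", "POST");

        // Stand-in for DevTest's testExec property map.
        Map<String, Object> testExec = new HashMap<>();

        // The pattern: copy every field of the query result into a test variable,
        // so later steps can reference {{mongo.host}}, {{mongo.port}}, etc.
        for (Map.Entry<String, Object> e : envDoc.entrySet()) {
            testExec.put("mongo." + e.getKey(), e.getValue());
        }

        System.out.println(testExec.get("mongo.host")); // api.example.test
        System.out.println(testExec.get("mongo.port")); // 8443
    }
}
```

In a real scripted assertion step you would call `testExec.setStateValue(...)` instead of putting into a plain map, but the shape of the loop is the same.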
The performance is great, and this has enabled us to move all our Excel-driven tests into MongoDB and only make changes in the one spot needed, vs. across many single files. This approach served as a forcing function to use a standard data format for our tests, something we struggled with across many (diverging) Excel-based files. We can update our MongoDB data independently from our DevTest scripts, and we're now easily able to execute across many different environments by defining new environments and the specific test data needed. Since the solution is fully data-driven, new tests don't require new scripts; they just get added to the CSV and Postman files, and once imported into Mongo, they "just work". I build/deploy Suite and Test .mar files to our DevTest server and use CVS (Continuous Validation Service) to execute Suites on a static schedule, 2x/day; then we use Jenkins + a script + Lisa-Invoke to query/execute individual Test .mar files.
I just wanted to share that it is possible and there are some very big wins going this route.
Could you please attach a sample project of your implementation?
We are interested in implementing the same.
Thanks & regards
Would you please attach a sample project for our implementation?
We have a requirement for this where we are lagging.
I have tried your flow and was able to establish the connection between CA DevTest and MongoDB.
I used the MongoDB Java Driver and wrote custom Groovy code using DevTest scripted assertion steps (Execute script (JSR-223)). Basically, the flow is:
- Add the driver JARs to the hotDeploy folder.
- Run a JSR-223 script like the one below, which just makes the connection and fetches one record from a collection in JSON format.
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;
import com.mongodb.ServerAddress;
import com.mongodb.MongoCredential;
import com.mongodb.MongoClientOptions;
import com.mongodb.client.MongoDatabase;
import com.mongodb.client.MongoCollection;
import static com.mongodb.client.model.Filters.*;
import java.util.Arrays;

// Connect using a connection-string URI (user:password@host:port)
MongoClient mongoClient = new MongoClient(new MongoClientURI("mongodb://mymongodb:test123@host:port"));
MongoDatabase database = mongoClient.getDatabase("mymongodb");
MongoCollection collection = database.getCollection("test");
// Fetch the first matching document and return it as a JSON string
record = collection.find(eq("accountNumber", 123456)).first().toJson();
MongoDB Java Driver
Hi Viruba, can you please elaborate on the steps and share sample JSR-223 script steps so that we can use them? We are trying to connect to MongoDB collections and query them for DevTest automation purposes.