We are in the process of planning an upgrade from 13.0 to 13.2 (or 13.3), which may involve a move to On Demand (and, if so, a switch from MSSQL to Oracle). We have several processes that currently use sql:update commands, which, as I understand it, are no longer supported and will need to be rewritten using XOG.
Rather than re-adding the same block of GEL script to invoke the SOAP calls to XOG in each and every script, I was thinking about creating a generic process and script that would house all of the SOAP-related GEL, with the other, specific scripts passing their data to that one to do the XOGging. The "main" scripts would then only create the XML that needs to be XOGged. I figure this would make things easier to maintain if changes are ever needed (changing the XOG user account/password, for example).
Has anyone ever done anything like this, or are there reasons not to proceed down this path? If others have done it previously, any recommendations? My initial thought was to have the "main" process dump the formatted XML to a file on the server, using the instance ID as an identifier, and then have the XOG process read that file in and XOG it, but I was wondering if there is a better way to accomplish this. gel:persist has a character limit, so I thought it best to avoid that, but I'm open to ideas.
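For what it's worth, the file hand-off I have in mind would look something like this in GEL (a sketch only, not tested: the directory path, the variable names, and the instanceId parameter are my own assumptions, and gel:serialize is the tag I've seen used for writing a document to disk):

```xml
<gel:script xmlns:gel="jelly:com.niku.union.gel.GELTagLibrary">
  <!-- instanceId is assumed to be passed in as a process parameter -->
  <gel:parameter var="instanceId"/>

  <!-- the "main" script builds the XOG payload as an in-memory document -->
  <gel:parse var="xogDoc">
    <NikuDataBus>
      <!-- ... Header and object content built by the main script ... -->
    </NikuDataBus>
  </gel:parse>

  <!-- hand-off file named by instance id, for the generic XOG process to pick up -->
  <gel:serialize fileName="/clarity/xogdrop/xog_${instanceId}.xml" var="${xogDoc}"/>
</gel:script>
```

The generic process would then gel:parse the file back in by the same instance-ID naming convention before making the SOAP call.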
I have taken a slightly different approach. Instead of building up a single XOG request, I break it up into a separate soap:invoke for each object.
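For anyone following along, the per-object call shape is roughly this (a sketch: the endpoint, credential variables, and payload are placeholders, though the Login/SessionID handshake is the standard XOG SOAP pattern):

```xml
<gel:script xmlns:gel="jelly:com.niku.union.gel.GELTagLibrary"
            xmlns:soap="jelly:com.niku.union.gel.SOAPTagLibrary"
            xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
            xmlns:xog="http://www.niku.com/xog">

  <!-- log in once and keep the session id for all subsequent invokes -->
  <soap:invoke endpoint="${XOGURL}/niku/xog" var="auth">
    <soap:message>
      <soapenv:Envelope>
        <soapenv:Header/>
        <soapenv:Body>
          <xog:Login>
            <xog:Username>${XOGUsername}</xog:Username>
            <xog:Password>${XOGPassword}</xog:Password>
          </xog:Login>
        </soapenv:Body>
      </soapenv:Envelope>
    </soap:message>
  </soap:invoke>
  <gel:set asString="true" select="$auth//SessionID/text()" var="sessionID"/>

  <!-- one soap:invoke per object instance -->
  <soap:invoke endpoint="${XOGURL}/niku/xog" var="xogResult">
    <soap:message>
      <soapenv:Envelope>
        <soapenv:Header>
          <xog:Auth><xog:SessionID>${sessionID}</xog:SessionID></xog:Auth>
        </soapenv:Header>
        <soapenv:Body>
          <!-- NikuDataBus payload for a single object goes here -->
        </soapenv:Body>
      </soapenv:Envelope>
    </soap:message>
  </soap:invoke>
</gel:script>
```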
I have also been working on eliminating the sql:update commands within our GEL scripts – we have over a hundred scripts that contain sql:update.
To that end, I have a simple program that extracts all the current GEL scripts and places them into a file structure on my disk. With this I can run text searches to identify which scripts have sql:update statements.
I have started to use gel:include tags for common functions: login, logout, XOG object (read/write) templates, read properties, and dump context. To do this, I have defined a common set of variables for use across the context; for example, XogSessionId holds the session ID, and the name of each XOG object include file matches the name of the variable holding its parsed XML.
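The include pattern looks roughly like this (a sketch under my own naming assumptions – the includeDir variable and file name are hypothetical; the gel:parse-then-gel:include pairing is the pattern I have seen used):

```xml
<!-- load the common login logic, then include it where needed -->
<gel:parse var="loginScript" file="${includeDir}/xog_login.xml"/>
<gel:include select="$loginScript"/>

<!-- by convention, the include leaves XogSessionId set for the rest of the script -->
<gel:log>Logged in, session ${XogSessionId}</gel:log>
```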
In addition to include files, I have set up a common template. Within this template there is a defined approach to handling exceptions, running the script in debug mode, and breaking out of the script on a given condition. I have also added logging and a debug:break taglib to the command-line XOG client (which is where I do my development), which lets me use gel:log instead of gel:out and run a debugger during development.
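A minimal skeleton of that kind of template might look like this (a sketch only – the variable names and the debug flag are my conventions; core:catch is the standard Jelly tag for trapping exceptions):

```xml
<gel:script xmlns:gel="jelly:com.niku.union.gel.GELTagLibrary"
            xmlns:core="jelly:core">
  <!-- flip this on to get extra gel:log output -->
  <core:set var="debug" value="true"/>

  <core:catch var="scriptError">
    <!-- main body of the script goes here -->
    <core:if test="${debug}">
      <gel:log level="INFO">debug: reached main body</gel:log>
    </core:if>
  </core:catch>

  <!-- scriptError is only set if the body threw an exception -->
  <core:if test="${scriptError != null}">
    <gel:log level="ERROR">Script failed: ${scriptError}</gel:log>
    <!-- set a status flag here so the surrounding process can react -->
  </core:if>
</gel:script>
```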
My goal is to be able to develop scripts in a development environment, test them within the test environment and deploy them to production without any script modification between environments.
My next task is an include function that parses a SQL recordset into an ArrayList => Hashtable structure (the first pair in the ArrayList is the code, the second pair is the parent code, and pairs 3–n are the attribute names and values). With this type of object definition it would be easy to build a flexible routine for walking the object and calling soap:invoke for a given object write format. I am doing this today within individual scripts, but I haven't pulled it out into an include file yet.
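The recordset-to-collection part can be done with the Jelly core and sql taglibs along these lines (a sketch: the dbId, table, and column names are placeholders, and the key names are my own):

```xml
<gel:script xmlns:gel="jelly:com.niku.union.gel.GELTagLibrary"
            xmlns:core="jelly:core"
            xmlns:sql="jelly:sql">

  <gel:setDataSource dbId="niku"/>
  <sql:query var="rows">
    select code, parent_code, attr_name, attr_value from my_table
  </sql:query>

  <!-- build an ArrayList of Hashtables, one per row -->
  <core:new className="java.util.ArrayList" var="records"/>
  <core:forEach items="${rows.rows}" var="row">
    <core:new className="java.util.Hashtable" var="rec"/>
    <core:invoke on="${rec}" method="put">
      <core:arg type="java.lang.Object" value="code"/>
      <core:arg type="java.lang.Object" value="${row.code}"/>
    </core:invoke>
    <core:invoke on="${rec}" method="put">
      <core:arg type="java.lang.Object" value="parent"/>
      <core:arg type="java.lang.Object" value="${row.parent_code}"/>
    </core:invoke>
    <core:invoke on="${records}" method="add">
      <core:arg type="java.lang.Object" value="${rec}"/>
    </core:invoke>
  </core:forEach>
</gel:script>
```

A generic XOG-write routine could then iterate the ArrayList and issue one soap:invoke per Hashtable entry.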
On a side note, 13.3 has a setting for "Maximum XML Nodes", which specifies the number of XML nodes that can be imported or exported. The default value is 150,000. So if you do head down the path of creating an XML file to be XOGged, you will need to make sure you don't exceed this setting.
A while ago I pondered the same sort of question here: GEL Script: "Subprocedures"?
I never tried implementing anything, though, and I think my usage of GEL/XOG is a lot less mature than Gene's.
So the one useful comment I will make is just to be careful that any complex solution you come up with is workable on On Demand (i.e. without deploying anything to the server itself).