I need help deploying many Smart Office Mashups to multiple environments, fast, several times a day. Think continuous integration. Currently, it takes cubic time. Ideally, there would be a solution in linear time.
Here is a typical scenario: a user reports an error with a Mashup, I fix the error in the XAML file, and I propagate the fix to the server. For that, I generate the Lawson package in Mashup Designer in Smart Office (ISO), I upload the package to LifeCycle Manager (LCM), and I upgrade the Mashup in each environment.
My best case scenario is one Mashup and two environments once a day. My average scenario is three Mashups and three environments three times a day. My worst case scenario is 13 Mashups and five environments seven times a day.
(Note: normally we would develop in the DEV environment and push to the TST environment for users to test, but we are in a transition period between on-premise and Infor Cloud, and somehow we have five environments to maintain. Also, we could tell users to install Mashups locally, or we could share Mashups with a role, but users got confused by versions and roles, so we have to do global deployments only.)
I count mouse clicks, mouse moves, and keystrokes as clickks. To optimize as much as possible, I assume that ISO, Mashup Designer, the LCM Client, the Manage Products page, and the Applications tab are all launched and ready to use, and that I don’t close them during the day. The clickk counts are approximately the following:
- 7 clickks to generate the Lawson package in Mashup Designer
- 20 clickks to upload the package in LCM
- 11 clickks to upgrade the Mashup in each environment
- x: number of Mashups
- y: number of environments
- z: number of times per day
The formula is:
(7x + 20x + 11xy)z
- Best case: (7*1 + 20*1 + 11*1*2) * 1 = 49 clickks
- Average: (7*3 + 20*3 + 11*3*3) * 3 = 540 clickks
- Worst case: (7*13 + 20*13 + 11*13*5) * 7 = 7,462 clickks
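To make the formula concrete, here it is as a tiny Python function (the function name is mine), reproducing the three scenarios:

```python
def clickks(x, y, z):
    # (pack in Mashup Designer + upload in LCM) per Mashup,
    # plus one upgrade per Mashup per environment, z times a day
    return (7 * x + 20 * x + 11 * x * y) * z

print(clickks(1, 2, 1))   # best case: 49
print(clickks(3, 3, 3))   # average: 540
print(clickks(13, 5, 7))  # worst case: 7462
```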
The exact numbers are not important. What matters is that if the number of Mashups, the number of environments, and the number of times per day all grow like n, the dominant term 11xyz grows like n³, i.e. it takes cubic time to deploy many Mashups on multiple environments, several times a day!!! In reality the number of environments is pretty constant, so the result tends to n², but that’s still quadratic, not linear. Also, the number of Mashups and the number of times per day will eventually reach a limit, and the result will become constant, but an insanely high constant (around 7,462 clickks in my case).
Command line?
The goal is to reduce the number of clickks to a matter of a few double-clicks, ideally via the command line. How? The Smart Office step can easily be scripted: the package is simply an archive of an archive (we can use ZIP tools on the command line, .NET System.IO.Packaging.Package, or the Pack.exe tool in the Smart Office SDK). But the LCM steps have no command line. They are Apache Velocity scripts, compiled into Apache Ant tasks, that execute Java code remotely to upload files to the server and add records to the Apache Derby database (lcmdb); in addition, the Mashup contents are saved as blobs in the MangoServer Grid database (GDBC), which is a distributed H2 database (h2db). I think. There is probably a distributed in-memory Grid cache as well. I could not find documentation or a quick hack for any of this.
Ideally there would be a command line like this fake screenshot:
[fake screenshot: a hypothetical single deploy command]
Do you have any suggestions? Please let me know in the comments below.
- Continuous integration of Mashups #1 – HELP WANTED!!
- Continuous integration of Mashups #2 – Reverse engineering
- Continuous integration of Mashups #3 – HELP GOTTEN!!
- Continuous integration of Mashups #4 – Command line
UPDATE: Added some details like ZIP/Pack.exe for the ISO step, and added the time complexity tending more to n2 and eventually reaching high constant.
UPDATE: Added note about the five environments + added note about the local vs. shared vs. global + re-structured the Scenarios chapter.
UPDATE: Mentioned the Grid database, H2, and in-memory cache.
UPDATE: Corrected the time complexity from polynomial to cubic.