MQ Configuration tools

Having worked in the MQ labs at IBM Hursley for many years, I was always struck by how complex MQ appeared. As time has gone by, little has changed my mind on this. Recently I was asked to create a solution for a customer that required repetitive creation of an MQ environment. We chose to create a tool using JSON as a configuration language. I'll describe the tool here and welcome feedback on it.

The Problem with dmpmqcfg

The customer's situation was that they were given a brand new golden image by their Ops team every few months. They then re-installed and configured all of their software on the new image from scratch - they put this under the umbrella of "DevOps". They were Windows users who liked PowerShell, and MQ was one of the products they needed to install on each image.

Fundamentally, the customer had a complete lack of skills in MQ and wasn't looking to build them.

Usually, we would go to the likes of dmpmqcfg and simply recreate the Queue Manager objects from there. dmpmqcfg is an IBM-supplied tool that can dump the configuration of an existing Queue Manager (QM) and produce an MQSC file. This MQSC file is then run against the new Queue Manager and, voila, the QM is configured. So, why did we not use this method?
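
For context, that workflow would look something like this from PowerShell (the queue manager names here are purely illustrative):

# Dump the configuration of an existing queue manager to an MQSC file
dmpmqcfg -m SOURCE.QM -a | Out-File source_qm.mqsc -Encoding ascii

# Replay that configuration against the newly created queue manager
Get-Content source_qm.mqsc | runmqsc TARGET.QM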

As I've already said, the customer had limited MQ skills. This is relevant because, although dmpmqcfg would mean the system should be created accurately every time, it wouldn't help the customer in the long run. For instance, say they needed the odd new queue here or there, or perhaps to change the name of an SSL certificate. They would have to understand the MQSC file - which would be quite long and written in MQ definition language. Not only that, but their Ops folk wanted transparency over what was being installed on MQ and, given their lack of MQ skills, a solution had to be found that didn't involve them reading (usually) out-of-date documentation (we've all been there, I'm sure).

The solution? Create a configuration tool whereby the configuration of MQ was written in JSON. Now, don't get me wrong, this was not as easy as I had hoped, but that was mainly due to writing the tool in PowerShell - a new language to me at the time. Overall, writing a set of PowerShell scripts that read JSON and then configure a QM was not rocket science. For the Ops folk, reading JSON config files is "immediate and obvious" compared with reading MQ definition files.

Breaking down MQ configuration

In order to avoid one of the problems that dmpmqcfg has - one, very long, file - we decided to produce a number of scripts that read different schemas.

Logically they broke down into:
  • Installing MQ
  • Creating Queue Manager instances
  • Creating MQ queues and channels
  • Securing MQ queues and channels
  • Monitoring queues
  • Creating services

We decided to write the installation scripts in PowerShell with no JSON configuration; they had only a few basic commands which were unlikely to change. We did the same for the creation of the Queue Managers - just scripts and no JSON configuration files - although we did need a switch to say whether the script was running as the main QM or the standby QM in a multi-instance environment.
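
As a minimal sketch of the sort of switch we mean (not the customer's actual script - the queue manager name, file-server paths and MQ data prefix are all illustrative):

param(
    [string]$QMName = "DEMOQM",
    [switch]$Standby
)

if ($Standby) {
    # Standby node: the queue manager data already exists on shared storage,
    # so just register a reference to it. (In practice the exact stanza can
    # be generated with 'dspmqinf -o command' on the active node.)
    & addmqinf -s QueueManager -v "Name=$QMName" -v "Directory=$QMName" `
        -v "Prefix=C:\ProgramData\IBM\MQ" -v "DataPath=\\fileserver\mqdata\$QMName"
} else {
    # Active node: create the queue manager with its data and logs on shared storage.
    & crtmqm -md "\\fileserver\mqdata" -ld "\\fileserver\mqlogs" $QMName
}

# Either way, start this instance permitting a standby (-x).
& strmqm -x $QMName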

When it came to creating the queues and channels we used JSON files. In a good instance of being agile we started with a slightly monolithic schema but, as the types of objects became more varied, we broke them up into different types. As an example, part of the queue schema is outlined below:

"Queues": {
    <queueType>: [
        {
            "Name": <String>,
            "Description": <String>,
            <otherProperties>: <String>
        }
    ]
}

Breaking each <queueType> out into its own array made the configuration a lot easier to read - one of the advantages we found with this method.

The <queueType>s we supported were the usual suspects: local, alias, remote and transmission.

The <otherProperties> changed according to the type of queue being described - for instance, properties such as "RemoteQueue" and "RemoteQM" for a queueType of "remote".


Here's an example file, using a couple of basic queues:


"Queues": {
    "local": [
        {
            "Name": "MY.FIRST.LOCAL.QUEUE",
            "Description": "Example of a local queue created using JSON configuration"
        },
        {
            "Name": "ANOTHER.LOCAL.QUEUE",
            "Description": "Showing how an array of queues work"
        }
    ],
    "remote": [
        {
            "Name": "A.REMOTE.QUEUE",
            "Description": "Example of a remote queue created using JSON configuration",
            "RemoteQueue": "THE.REMOTE.QUEUE",
            "TransmissionQueue": "THE.TX.QUEUE",
            "RemoteQM": "THE.REMOTE.QM"
        }
        ..
        ..
        ..


One of the other advantages of this approach is that we could make the property names as obvious as possible for the context, rather than being constrained by the IBM definitions. This made it easier for the customer to understand and change the configuration.
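
To give a flavour of how the scripts consume these files, here is a minimal sketch (not the customer's actual tool) that reads a hypothetical queues.json shaped like the example above and maps the friendly property names onto MQSC attributes before handing them to runmqsc:

param(
    [string]$QMName = "DEMOQM",
    [string]$ConfigFile = "queues.json"
)

# Read the JSON configuration into an object.
$config = Get-Content $ConfigFile -Raw | ConvertFrom-Json

$mqsc = @()

# Local queues: "Name" and "Description" map onto a QLOCAL definition and its DESCR.
foreach ($q in $config.Queues.local) {
    $mqsc += "DEFINE QLOCAL('$($q.Name)') DESCR('$($q.Description)') REPLACE"
}

# Remote queues: "RemoteQueue", "RemoteQM" and "TransmissionQueue" map onto
# the RNAME, RQMNAME and XMITQ attributes of a QREMOTE definition.
foreach ($q in $config.Queues.remote) {
    $mqsc += "DEFINE QREMOTE('$($q.Name)') DESCR('$($q.Description)') " +
             "RNAME('$($q.RemoteQueue)') RQMNAME('$($q.RemoteQM)') " +
             "XMITQ('$($q.TransmissionQueue)') REPLACE"
}

# Feed the generated MQSC into the queue manager.
$mqsc | & runmqsc $QMName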


Channels, listeners and even triggers were also defined in this schema.
For the triggers we could add checks to ensure that all the objects involved in the trigger had been defined correctly - something that can be a little awkward in MQ.
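
As a sketch of the kind of check we mean (the "triggered" queue type and its "InitiationQueue" and "Process" property names here are hypothetical, not the customer's schema): a triggered queue names an initiation queue and a process definition, so the script can verify both exist elsewhere in the configuration before generating any MQSC.

$config = Get-Content "queues.json" -Raw | ConvertFrom-Json

# Names of all local queues and process definitions declared in the config.
$localNames   = @($config.Queues.local | ForEach-Object { $_.Name })
$processNames = @($config.Processes    | ForEach-Object { $_.Name })

foreach ($q in $config.Queues.triggered) {
    if ($localNames -notcontains $q.InitiationQueue) {
        Write-Error "Triggered queue $($q.Name): initiation queue '$($q.InitiationQueue)' is not defined"
    }
    if ($processNames -notcontains $q.Process) {
        Write-Error "Triggered queue $($q.Name): process '$($q.Process)' is not defined"
    }
}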

Conclusion

We created a JSON-based configuration language, run by PowerShell, to configure Queue Managers. This approach helped the customer understand their MQ system without having to learn the more complex MQ definition language. They can also make basic changes to their system in the future simply by editing the JSON.

Creating a simple configuration layer like this could help many people in their adoption of MQ, and we have used this solution with other customers. However, we understand that it's not right for everyone.
