
It won’t come as a surprise to you that Catalog integrates seamlessly with ITSM: it’s easy to create work order and incident records. As a backend fulfilment system, integration is key, and Catalog interacts with various systems using out-of-the-box connectors like Active Directory, Flexera or MS Office. For most other needs Catalog uses Integration Service, which makes it possible to interact with a whole range of other connectors and even allows you to write your own.

 

This should take care of the majority of your integration needs, so is there really a need for the ability to integrate directly with remote servers? It’s a recent feature, added in 18.11, which makes it possible for Catalog to connect to external servers using an HTTP-based interface. Is there any benefit to this? I’d argue there is: while more work is involved in building your own interface, there is also greater flexibility.

 

I really like Integration Service for connecting different cloud-based applications, but it has its drawbacks: it’s a cloud-based solution, so even if you want to connect your on-premises Catalog server to a local application, you have to do so via the cloud. It also requires a license, and it might not be the right fit if you’re only looking for a small integration or extension.

 

Direct integration with remote servers resolves this. It puts you in control of the integration, both in terms of design and development and in terms of hosting location, without any need for an intermediate connection to the cloud. It also suits small-scale extensions: if you want to add some specific functionality or integrate with a local application, this might be the solution for you.

 

True, in most cases the combination of out-of-the-box connectors plus Integration Service is enough. But for a backend fulfilment system, integration is paramount and we’ll make every effort to ensure you’re successful.

 

Does this mean that it’s going to be easy to set up? I can’t promise you that. The truth is that you have to build everything yourself: there’s no framework and there are no connectors. You build the server, you design the API, and all of this has to adhere to a strict standard. Daunting? I agree, but still doable. And I’m here to help.

 

Let’s first talk about the various components. I’m going to deviate slightly from the official documentation as I find it easier to explain it this way. Here’s how I see the integration:

DWP uses Catalog as its backend fulfilment system: users submit requests and Catalog handles the workflow process as part of the service. Catalog has the ability to integrate using out-of-the-box components, Integration Service and Remote Connectors. The Remote Connector is a web-based application which runs on a web server. The key part of this is the HTTP-based API; that’s the part Catalog interacts with. The interface adheres to a specific standard: certain requests will be made and Catalog expects responses in a certain format. The API in turn uses the connector logic to do whatever processing is required: calculate something, return data, create requests, etc. It’s up to you how you want to do this. Notice I used a dashed line for the external system: the primary reason for a connector would be to communicate with an external system, but this doesn’t necessarily have to be the case. If you want to return week numbers based on a date, that’s absolutely possible.
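To make that week-number idea concrete: it needs no external system at all. The connector logic could be as small as this hypothetical helper (the class and method names are my own, not part of the Catalog API):

```java
import java.time.LocalDate;
import java.time.temporal.WeekFields;

public class WeekNumber {

  // Returns the ISO-8601 week number for a date in yyyy-MM-dd format.
  public static int weekOf(String isoDate) {
    LocalDate date = LocalDate.parse(isoDate);
    return date.get(WeekFields.ISO.weekOfWeekBasedYear());
  }
}
```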

 

As far as Catalog is concerned it’s communicating with the API; it doesn’t really matter to Catalog what’s happening further upstream. So what we need to do first is define this interface. I’m going to show you how to build a full connector, but I will spread this over two articles: in this first article I’m only going to build the API part. I’m not actually building the connector logic or integrating with an external system, as this would just overcomplicate things and stand in the way of explaining the connector properly. We’ll leave that to the second article. This is what we’re going to build:

We have to do everything from scratch: to get this API working we need to write a web-based application which accepts certain HTTP requests and responds in a certain way. These are the calls we need to get the service registered:

 

GET /rcf/description

 

This request is sent when you set up a remote connector in the Catalog configuration; it’s the first request that goes out. If you’re familiar with SOAP web services, this is the equivalent of a WSDL: it describes the service, lists what is on offer, what it expects and what it will send back in return. Catalog will use this information to build the service. As with a WSDL, the format is specific and you need to make sure you follow it exactly.

 

GET /rcf/health

 

This is the second call that goes out when registering the server; it validates that the server is up. It should respond with a simple OK.

 

Those are all the calls we need to get everything registered, so before we go any further let’s get these calls defined. I’m using the Spring framework for the web server: it’s easy enough to spin up and it offers the flexibility I am looking for. I am using Maven as my build tool of choice, but just to clarify, there are no requirements here; if you want to use something else, you can. My interface communicates via JSON, and to keep it simple and readable I decided to use the org.json library; there are of course other options.
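To underline that there really are no framework requirements: if you just want to experiment with the two registration endpoints before committing to Spring, even the JDK’s built-in com.sun.net.httpserver is enough. A minimal sketch (the empty connectors array is just a placeholder, not a valid descriptor):

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;

import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class RcfSketch {

  // Registration endpoints only; the descriptor body here is an
  // empty placeholder, not a full connector description.
  public static String health() {
    return "OK";
  }

  public static String description() {
    return "{\"connectors\":[]}";
  }

  static void reply(HttpExchange exchange, String body) throws IOException {
    byte[] bytes = body.getBytes(StandardCharsets.UTF_8);
    exchange.sendResponseHeaders(200, bytes.length);
    try (OutputStream out = exchange.getResponseBody()) {
      out.write(bytes);
    }
  }

  public static HttpServer start(int port) throws IOException {
    HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
    server.createContext("/rcf/health", exchange -> reply(exchange, health()));
    server.createContext("/rcf/description", exchange -> reply(exchange, description()));
    server.start();
    return server;
  }

  public static void main(String[] args) throws IOException {
    start(8080);
    System.out.println("Listening on http://localhost:8080/");
  }
}
```

I’ll stick with Spring for the rest of this article, but the HTTP contract is identical whichever server you choose.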

 

If you decide to follow along, download and install Maven and set up an empty project with the appropriate Spring references (see Further Reading below). Then run these commands to build and start everything:

 

mvn clean package
java -jar target/gs-rest-service-0.1.0.jar 

 

This starts a web server which I can access via http://myserver:8080/

 

Here are the files I am using:

  • src\main\java\rcf\RCFController.java: main controller, contains request definitions
  • src\main\java\rcf\Application.java: Boot application, used by Spring to start application

 

In Spring, HTTP requests are handled by a controller, identified by the @RestController annotation. My controller handles the various requests for the RCF web application. Let’s start with the simplest call, /rcf/health:

 

@RequestMapping(value = "/rcf/health", method = RequestMethod.GET)
@ResponseBody
public String getHealth() {
  return "OK";
}

 

I specify that I accept a GET request, I define the URL and I set the response: in our case a simple ‘OK’. That’s enough for this to work.

Let’s turn our attention to /rcf/description. This is an important call, here’s the overview of what we need:

 

{
  "connectors": [
    {
      "name": "Machine Connectivity",
      "version": "0.1",
      "type": "com.example.jb.connectivity",
      "path": "jbconnectivity",
      "capabilities": [],
      "activeConnectionInstances": [],
      "actions": [
        {
          "name": "pingMachine",
          "displayName": "pingMachine",
          "path": "pingMachine",
          "inputs": [
            {
              "name": "machineName",
              "type": "String",
              "required": true
            }
          ],
          "outputs": [
            {
              "name": "Result",
              "type": "String"
            }
          ]
        }
      ]
    }
  ]
}

 

We start with an element called connectors which is an array of the different connectors. Each connector translates to a category in the workflow palette.

 

There are a few properties you need to set; notice the path, which you’ll need to use later at run-time. capabilities is a list of tags describing what the API offers; we’ll look into this later. activeConnectionInstances is a list of the available instances, like Dev and Prod, in case you use different environments. Next we’ve got the actions; this is an array, so each connector can have more than one action. This part is fairly self-explanatory: there’s the name, and you’ve got the inputs and the outputs. Here’s what this looks like on the workflow panel:

 

 

Like the health check, we need to define the request as part of the controller. Since we’re dealing with JSON I am using org.json. Here’s the code:

 

@RequestMapping(value = "/rcf/description", method = RequestMethod.GET)
@ResponseBody
public String description() {

  JSONObject jsonDescription = new JSONObject();
  JSONArray jsonArrConnectors = new JSONArray();

  JSONObject jsonConnectionInstance = new JSONObject();
    jsonConnectionInstance.put("id", "jbconnectivity-1");
    jsonConnectionInstance.put("name", "Production");
  JSONArray jsonArrConnectionInstance = new JSONArray();
    jsonArrConnectionInstance.put(jsonConnectionInstance);

  JSONArray jsonArrCapabilities = new JSONArray();
    jsonArrCapabilities.put("com.bmc.dsm.catalog:datasetProvider");

  JSONObject jsonAction = new JSONObject();
    jsonAction.put("name", "pingMachine");
    jsonAction.put("displayName", "pingMachine");
    jsonAction.put("path", "pingMachine");
  JSONArray jsonArrAction = new JSONArray();
    jsonArrAction.put(jsonAction);

  JSONObject jsonActionInput = new JSONObject();
    jsonActionInput.put("name", "machineName");
    jsonActionInput.put("type", "String");
    jsonActionInput.put("required", true);
  JSONArray jsonArrActionInput = new JSONArray();
    jsonArrActionInput.put(jsonActionInput);
  JSONObject jsonActionOutput = new JSONObject();
    jsonActionOutput.put("name", "Result");
    jsonActionOutput.put("type", "String");
  JSONArray jsonArrActionOutput = new JSONArray();
    jsonArrActionOutput.put(jsonActionOutput);

  jsonAction.put("inputs", jsonArrActionInput);
  jsonAction.put("outputs", jsonArrActionOutput);

  jsonAction.put("com.bmc.dsm.catalog", JSONObject.NULL);

  JSONObject jsonIndividualConnector = new JSONObject();
    jsonIndividualConnector.put("name", "Machine Connectivity");
    jsonIndividualConnector.put("version", "0.1");
    jsonIndividualConnector.put("type", "com.example.jb.connectivity");
    jsonIndividualConnector.put("path", "jbconnectivity");
    jsonIndividualConnector.put("capabilities", jsonArrCapabilities);
    jsonIndividualConnector.put("activeConnectionInstances", jsonArrConnectionInstance);
    jsonIndividualConnector.put("actions", jsonArrAction);

  jsonArrConnectors.put(jsonIndividualConnector);
  jsonDescription.put("connectors", jsonArrConnectors);

  return jsonDescription.toString();
}

 

I’m hardcoding a lot of the values; that’s a deliberate choice to keep it readable. In the next blog post we’ll add the connector logic. For now, let’s have a look at some of the details of the API:

 

If we access the URL via /rcf/description, this is the JSON I get back:

 

{
  "connectors": [
    {
      "name": "Machine Connectivity",
      "version": "0.1",
      "type": "com.example.jb.connectivity",
      "path": "jbconnectivity",
      "capabilities": [
        "com.bmc.dsm.catalog:datasetProvider"
      ],
      "activeConnectionInstances": [
        {
          "id": "jbconnectivity-1",
          "name": "Production"
        }
      ],
      "actions": [
        {
          "name": "pingMachine",
          "displayName": "pingMachine",
          "path": "pingMachine",
          "inputs": [
            {
              "name": "machineName",
              "type": "String",
              "required": true
            }
          ],
          "outputs": [
            {
              "name": "Result",
              "type": "String"
            }
          ],
          "com.bmc.dsm.catalog": null
        }
      ]
    }
  ]
}

 

That’s enough for Catalog to identify the connector and add it to the workflow. Obviously it won’t do anything yet, as we haven’t defined any requests for the actions, but let’s add it anyway to see what it does:

The name is just for you to identify the server; notice that the URL uses the root. The username and password are used in case of Basic HTTP authentication; if you’re not using it, just enter some dummy values. If all goes well, the actions are added to the palette and I can use them in my workflow. You might have noticed the field Connection Instance Id, a required field which is added automatically. This has to contain the ID of the active connection instance (not the path) as defined in the descriptor. It needs to be an exact match, otherwise the requests will not be sent. In my example it’s jbconnectivity-1.
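If you do enable Basic authentication, the server-side check is straightforward once you decode the Authorization header. A dependency-free sketch (the credential values are placeholders of my own):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class BasicAuth {

  // Verifies an HTTP Basic Authorization header against expected credentials.
  public static boolean isAuthorized(String authorizationHeader, String user, String password) {
    if (authorizationHeader == null || !authorizationHeader.startsWith("Basic ")) {
      return false;
    }
    String decoded = new String(
        Base64.getDecoder().decode(authorizationHeader.substring("Basic ".length())),
        StandardCharsets.UTF_8);
    return decoded.equals(user + ":" + password);
  }
}
```

In Spring you would typically let a filter or Spring Security handle this rather than checking it per request, but the principle is the same.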

 

That takes care of the design: Catalog can read everything and we can start building our workflow.

 

 

But we’re not there yet: we haven’t defined anything that happens at run-time. When Catalog processes the workflow it executes the actions we defined; this results in Catalog sending out requests to our API, and we need to make sure the API responds appropriately. Here are the calls:

 

POST /{connector_path}/com.bmc.dsm.catalog:checkHealth

 

This is called periodically on each connection to verify its availability. The server is supposed to run a check to see if everything is okay and report back to Catalog. An administrator can trigger the check manually as well.

 

POST /{connector_path}/{action_path}

 

When the action is executed, this call goes out. The action_path matches the path you defined in the description JSON for the respective action.

 

Let's look at this in more detail. checkHealth is a POST request, so there’s an HTTP body with a payload:

 

POST /jbconnectivity/com.bmc.dsm.catalog:checkHealth 

{  
  "connectionInstanceId": "jbconnectivity-1",  
  "request": {}  
}  

 

Notice that this isn't part of /rcf anymore. The server is supposed to check the connectionInstanceId (you defined this in the description) and verify that everything is okay. Always respond with HTTP 200 and with the following JSON:

 

{
  "connectionInstanceId": "jbconnectivity-1",
  "response": {
    "status": "CONNECTION_SUCCESS",
    "message": null
  }
}

Use CONNECTION_FAILURE as the status if there’s a problem.

 

Here’s how I coded this in the controller. To keep it simple I’m always returning CONNECTION_SUCCESS; we’ll work this out in more detail later.

 

@RequestMapping(value = "/jbconnectivity/com.bmc.dsm.catalog:checkHealth", method = RequestMethod.POST)
@ResponseBody
public String checkHealth() {

  JSONObject jsonResponse = new JSONObject();
  jsonResponse.put("status", "CONNECTION_SUCCESS");
  jsonResponse.put("message", JSONObject.NULL);

  JSONObject jsonHealth = new JSONObject();
  jsonHealth.put("connectionInstanceId", "jbconnectivity-1");
  jsonHealth.put("response", jsonResponse);

  return jsonHealth.toString();
}
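When we do flesh this out, the handler will need to echo whatever connectionInstanceId arrives in the request rather than a fixed value, and report a real status. The response-building step boils down to something like this hypothetical helper (plain strings so the shape is obvious; in the controller you’d keep using org.json):

```java
public class HealthResponse {

  // Builds the checkHealth reply body for a given connection instance.
  // healthy selects CONNECTION_SUCCESS or CONNECTION_FAILURE; message may be null.
  public static String build(String connectionInstanceId, boolean healthy, String message) {
    String status = healthy ? "CONNECTION_SUCCESS" : "CONNECTION_FAILURE";
    String msg = (message == null) ? "null" : "\"" + message + "\"";
    return "{\"connectionInstanceId\":\"" + connectionInstanceId + "\","
        + "\"response\":{\"status\":\"" + status + "\",\"message\":" + msg + "}}";
  }
}
```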

 

Which leaves us with the connector actions. This is what my ping action request looks like; it will be generated by Catalog:

 

POST /jbconnectivity/pingMachine

{
  "inputs": {
    "machineName": "clm-aus-12345"
  },
  "connectionInstanceId": "jbconnectivity-1"
}

 

Notice again that there's no /rcf prefix. Catalog populates the inputs with the values of the workflow’s input fields. What I need to do is respond with the following JSON:

 

{
  "outputs": {
    "Result": "OK"
  }
}

 

The exact format depends on what you defined in the descriptor. I have just one field defined here: a field of type String called Result.

 

Here’s the code in the controller. Again, the hardcoded values are purely for illustrative purposes; we’ll make this more dynamic in the next article.

 

@RequestMapping(value = "/jbconnectivity/pingMachine", method = RequestMethod.POST)
@ResponseBody
public String checkPing() {

  JSONObject jsonOutputs = new JSONObject();
  jsonOutputs.put("Result", "OK");

  JSONObject jsonPing = new JSONObject();
  jsonPing.put("outputs", jsonOutputs);

  return jsonPing.toString();
}

 

That’s all you need to get this working. My examples are all fairly straightforward, but I hope you appreciate the flexibility and, more importantly, the possibilities the connector offers. It can be as simple as a TCP ping or as complicated as the front end for a SOAP integration with a custom payroll system. The principles remain the same.

 

What would you do with Remote Server Integration? It’s very versatile: if you define the API correctly you can get it to execute your Java code. It doesn’t necessarily have to be a connection to an external system; it could also be an extension of existing functionality. All you need to define are inputs and outputs.

 

The API we’ve written so far is basic and mostly contains hardcoded values; we’d need to build the logic to get it to do anything. Let’s continue with the example we’ve been using so far: a network ping to determine whether machines are alive. I’ve used this example before for an Innovation Suite action; let’s check how this would work for a Remote Server integration. But we need to leave this for the next article.

 

I am interested in your feedback. What do you think of the integration possibilities? Would you use this? Does this article help you on your way? Maybe you have different or better use cases. I’d love to hear from you; leave a comment below.

 

Best Wishes,

Justin

 

Further Reading: