
Digital Workplace

4 Posts authored by: Justin Bakker

This article follows up on a previous article where we looked at setting up the API to integrate DWP Catalog with remote servers. We looked at the connector description and the various calls that need to be defined. If you followed along you hopefully ended up with a working interface.

 

But that’s all it is for now: an interface. A lot of the values we’re returning are hardcoded and it doesn’t do anything at the moment. What I want to do next is explain how to build a full integration. We’ll use the same interface, so if you haven’t already done so, make sure to read my first blog article.

 

What are we going to build? It can’t be too complicated, so what I propose building is a simple ping activity. The basic idea is that you can invoke the action to check if a server is online: it sends out a ping request and reports back whether the server can be reached. You could use this in your workflow and, depending on the outcome, engage a specific team or escalate further. We’re going to output a Boolean, so it will work nicely with Exclusive Gateways.

 

Just to remind ourselves, this is how the integration works:

 

 

What we are interested in is the connector logic. Our action will return a Boolean indicating if the machine is alive or not. Using the same overview, these are the relevant components:

 

 

Notice I’m not going to an external server; I’m just executing Java code to do the ping. Let’s first write a simple Java class, just so we know what the code looks like and how it works. This is what I came up with:

 

package pingservicetest;

import java.io.IOException;
import java.net.InetAddress;
import java.net.Socket;


public class PingServiceTest {

  public static boolean isMachineAlive(String machineName, int machinePort) {
    // First try a reachability check via InetAddress.
    try {
      InetAddress inet = InetAddress.getByName(machineName);
      if (inet.isReachable(500)) {
        return true;
      }
    } catch (IOException e) {
      // Unknown host or I/O problem; fall through to the socket check.
    }

    // Fall back to a plain TCP connection; default to port 111 if none given.
    if (machinePort == 0) {
      machinePort = 111;
    }

    try (Socket s = new Socket(machineName, machinePort)) {
      return true;
    } catch (IOException ex) {
      return false;
    }
  }


  public static void main(String[] args) {
    System.out.println(isMachineAlive("clm-aus-01234", 0));
  }
}

 

I want to check if servers are available, so I need to execute some form of ping to confirm that they're alive. There's no ping command in Java's standard library, but I can just use InetAddress. If I don’t get a response I try an additional Socket connection. If I get a successful connection I return true; if not I return false. I included the example at the end of this article, which you can run if you want to. It’s nothing fancy, but it’s a good example of the way you can extend Catalog’s functionality.

 

I don’t like to put everything in one big file, so to make it manageable (and mirror the original diagram) I’m splitting this into two classes:

 

  • RCFController handles the interface. Its primary role is to accept HTTP requests from Catalog and respond with the correct JSON.
  • RCFLogic is new; this class contains the code that does the actual processing. In our case that’s the isMachineAlive method.
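
As a rough sketch, the controller shell looks like this (the Spring Web and org.json imports are assumptions based on what the code below uses):

package rcf;

import org.json.JSONException;
import org.json.JSONObject;
import org.springframework.web.bind.annotation.*;

@RestController
public class RCFController {
  // The /rcf/description and /rcf/health mappings from the first article
  // and the action mapping we build below all live in here.
}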

 

 

Let’s start with RCFLogic. We already wrote and tested our code so we just need to define the class:

 

package rcf;

import java.io.IOException;
import java.net.*;

public class RCFLogic {

  public static boolean isMachineAlive(String machineName, int machinePort) {
    // … implementation as written and tested above
  }
}

 

Nothing new there: it’s a typical Java class with no references to Spring or Catalog. All of this is separate, but it needs to be referenced properly: we need to know what the service accepts, what goes out and what format it’s in. The interface we used in the first article can stay largely unchanged, but I need to add the port as an input and return a Boolean instead of a String. Here’s the action part of the descriptor:

 

"name": "isMachineAlive",
"displayName": "isMachineAlive",
"path": "isMachineAlive",
"inputs": [
  {
    "name": "machineName",
    "type": "String",
    "required": true
  }
  {
    "name": "tcpPort",
    "type": "Integer",
    "required": false
  }
],
"outputs": [
  {
    "name": "Result",
     "type": "Boolean"
  }
]
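
In the descriptor() method from the first article this translates to a renamed action, one extra input object and a changed output type. Here's a sketch using org.json, reusing the variable names from that code:

// Rename the action to match the new descriptor
jsonAction.put("name", "isMachineAlive");
jsonAction.put("displayName", "isMachineAlive");
jsonAction.put("path", "isMachineAlive");

// Add the optional machinePort input
JSONObject jsonActionInput2 = new JSONObject();
  jsonActionInput2.put("name", "machinePort");
  jsonActionInput2.put("type", "Integer");
  jsonActionInput2.put("required", false);
jsonArrActionInput.put(jsonActionInput2);

// The action now returns a Boolean instead of a String
jsonActionOutput.put("type", "Boolean");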

 

I accept machineName and machinePort, and I return a Boolean called Result. I use the same org.json-based code as last time. Based on this I know I’ll get this HTTP request in from Catalog:

 

POST http://server:8080/jbconnectivity/isMachineAlive HTTP/1.1

{
  "inputs": {
    "machinePort": null,
    "machineName": "clm-test-12345"
  },
  "connectionInstanceId": "jbconnectivity-1"
}

 

The request is generated by Catalog, but it bases this on the definition which I supplied and uses the values set during workflow creation. I write my code in the controller to handle this. I also know what the HTTP response should be:

 

HTTP/1.1 200

{
  "outputs":
  {
    "result":true
  }
}

 

With some help from org.json and Spring this is the code I came up with:

 

@RequestMapping(value = "/jbconnectivity/pingMachine", method = RequestMethod.POST)
@ResponseBody
public String checkPing(@RequestBody String payload) {
  JSONObject jsonPayload = new JSONObject(payload);
  String machineName = jsonPayload.getJSONObject("inputs").getString("machineName");
  int machinePort = 0;

  try {
    machinePort = jsonPayload.getJSONObject("inputs").getInt("machinePort");
  } catch (JSONException e) {}

  JSONObject jsonInputs = new JSONObject();
  jsonInputs.put("result", RCFLogic.isMachineAlive(machineName, machinePort));
  JSONObject jsonPing = new JSONObject();
  jsonPing.put("outputs", jsonInputs);


  return jsonPing.toString();
}

 

I first accept the payload from the POST request and convert it into a JSON object (jsonPayload). I then extract the machine name and the port (the try/catch is just there to deal with a null value). Once I have all the values I build the JSON I want to respond with (jsonPing and jsonOutputs). The call to RCFLogic.isMachineAlive is where the controller hands over to the connector logic.
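
If you want to test this without going through Catalog, you can simulate the request with curl (hypothetical host and machine name):

curl -X POST http://myserver:8080/jbconnectivity/isMachineAlive \
  -H "Content-Type: application/json" \
  -d '{"inputs":{"machinePort":null,"machineName":"clm-test-12345"},"connectionInstanceId":"jbconnectivity-1"}'

The response should be the outputs JSON with result set to true or false.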

 

That’s all I’m going to do. Because I don’t connect to an external system I don’t have to do any health checks, so I’m just going to keep the hardcoded values for checkHealth. Sure, there are things that could be improved (I’ve done some housekeeping in the actual example attached at the end of the article), but I hope you get the principle.

Okay, let’s give it a go! I’m keeping the workflow as simple as possible, no fancy conditions, just two questions and one activity.

 

 

Curious if it’s going to work? Let’s log into DWP and create a new service request:

 

 

Because my workflow doesn’t actually create any requests in a backend system, I’m just going to have a look at the Service Requests report on the Catalog side:

And lo and behold, it works! Catalog executed the action and sent a POST request to the API as per the definition. My code picked up the POST request and called the isMachineAlive method in the RCFLogic class. And that was all passed back to DWP.

 

I know, this is fairly basic, but I hope you agree that it works quite well. We’re separating Catalog from the external integration by using an API which adheres to strict standards. As long as I define the interface correctly, it’s up to me to respond to the incoming HTTP requests; it doesn’t really matter what you do in the backend to process them. It can be as simple as extending the functionality with a few custom actions that calculate a week number based on a date, or as complicated as integrating with SOAP-based web services. But keep in mind that this doesn’t allow you to interact with DWP: you can’t write a service which pushes values to DWP directly, it all happens in the background. Beyond that it’s up to you, and I look forward to seeing what you will come up with. Any questions or feedback? Leave a comment below.

 

Best wishes,

Justin


It won’t come as a surprise to you that Catalog integrates seamlessly with ITSM: it’s easy to create work order and incident records. For a backend fulfilment system integration is key, and Catalog interacts with various systems using out-of-the-box connectors like Active Directory, Flexera or MS Office. For most other needs Catalog uses Integration Service, which makes it possible to interact with a whole range of other connectors and even allows you to write your own.

 

This should take care of the majority of your integration needs, so is there a need for the ability to integrate directly with remote servers? It’s a recent feature, added in 18.11, which makes it possible for Catalog to connect to external servers using an HTTP-based interface. Is there any benefit to this? I’d argue there is: while there’s more work involved in building your own interface, there is also greater flexibility.

 

I really like the solution for connecting different cloud-based applications, but it has its drawbacks: Integration Service is a cloud-based solution, so even if you want to connect your on-premise Catalog server to your local application, it requires you to do this via the cloud. Integration Service also requires a license, and the solution might not be right for you if you’re looking for a small integration or extension.

 

Direct integration with remote servers resolves this. It puts you in control of the integration, both in terms of design and development and in terms of hosting location, without any need for an intermediate connection to the cloud. It also suits small-scale extensions: if you want to add some specific functionality or integrate with a local application, this might be the solution for you.

 

True, in most cases the combination of out-of-the-box connectors plus Integration Service is enough. But for a backend fulfilment system, integration is paramount and we’ll make every effort to ensure you’re successful.

 

Does this mean that it’s going to be easy to set up? I can’t promise you that. The truth is that you have to build everything yourself: there’s no framework and there are no connectors. You build the server, you design the API, and all of it has to adhere to a strict standard. Daunting? I agree, but still doable. And I’m here to help.

 

Let’s first talk about the various components. I’m going to deviate slightly from the official documentation as I find it easier to explain it this way. Here’s how I see the integration:

DWP uses Catalog as its backend fulfilment system: users submit requests and Catalog handles the workflow process as part of the service. Catalog has the ability to integrate using out-of-the-box components, Integration Service and Remote Connectors. The Remote Connector is a web-based application which runs on a web server. The key part of this is the HTTP-based API; that’s the part Catalog interacts with. The interface adheres to a specific standard: certain requests will be made, and Catalog expects responses in a certain format. The API in turn uses the connector logic to do whatever is required: calculate something, return data, create requests, etc. It’s up to you how you want to do this. Notice I used a dashed line for the external system: the primary reason to build a connector would be to communicate with an external system, but this doesn’t necessarily have to be the case. If you want to return week numbers based on a date, that’s absolutely possible.

 

As far as Catalog is concerned it’s communicating with the API; it doesn’t really matter to Catalog what’s happening further upstream. So what we need to do first is define this interface. I’m going to show you how to build a full connector, but I will spread this over two articles; in this first article I’m only going to build the API part. I’m not actually building the connector logic or integrating with an external system, as that would just overcomplicate things and stand in the way of explaining the connector properly. We’ll leave that to the second article. This is what we’re going to build:

We have to do everything from scratch: to get this API working we need to write a web-based application which accepts certain HTTP requests and responds in a certain way. These are the calls we need to get the service registered:

 

GET /rcf/description

 

This request will be sent when you set up a remote connector in the Catalog configuration, it’s the first request that goes out. If you’re familiar with SOAP web services, then this is the equivalent of a WSDL: it describes the service, lists what is on offer, what it expects and what it will send back in return. Catalog will use this information to build the service. As with the WSDL, the format is specific and you need to make sure you follow the format exactly.

 

GET /rcf/health

 

This is the second call that goes out when registering the server, it validates the server. It should respond with a simple OK.

 

Those are all the calls we need to get everything registered. So before we go any further, let’s get these calls defined. I’m using the Spring framework for the web server: it’s easy enough to spin up and it offers the flexibility I’m looking for. I’m using Maven as my build tool of choice, but just to clarify, there are no requirements here; if you want to use something else, you can. My interface communicates via JSON, and to keep it simple and readable I decided to use the org.json library. There are of course different options.

 

If you decide to follow along, download and install Maven and set up an empty project with the appropriate Spring references (see below for references). Then run these commands to build and start everything:

 

mvn clean package
java -jar target/gs-rest-service-0.1.0.jar 

 

This starts a web server which I can access via http://myserver:8080/
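
For the Spring references mentioned above, the relevant pom.xml dependencies look roughly like this (a sketch: it assumes the Spring Boot parent POM, and the org.json version is only an example):

<dependencies>
  <!-- Embedded web server plus Spring MVC -->
  <dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
  </dependency>
  <!-- org.json, used to build and parse the JSON payloads -->
  <dependency>
    <groupId>org.json</groupId>
    <artifactId>json</artifactId>
    <version>20180813</version>
  </dependency>
</dependencies>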

 

Here are the files I am using:

  • src\main\java\rcf\RCFController.java: main controller, contains request definitions
  • src\main\java\rcf\Application.java: Spring Boot application class, used to start the application
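
For reference, Application.java is the standard Spring Boot entry point; a minimal sketch looks like this:

package rcf;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

// Starts the embedded web server and scans the rcf package for controllers
@SpringBootApplication
public class Application {

  public static void main(String[] args) {
    SpringApplication.run(Application.class, args);
  }
}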

 

In Spring HTTP requests are handled by a controller identified by the @RestController annotation. My controller handles the various requests for the RCF web application. Let’s start with the simplest call, /rcf/health:

 

@RequestMapping(value = "/rcf/health", method = RequestMethod.GET)
@ResponseBody
public String getHealth() {
  return "OK";
}

 

I specified that I accept a GET request, I define the URL and I set the response. In our case a simple ‘OK’. That’s enough for this to work.
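
A quick way to verify this is curl (assuming the server from earlier is running on myserver:8080):

curl http://myserver:8080/rcf/health

which should simply print OK.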

Let’s turn our attention to /rcf/description. This is an important call; here’s an overview of what we need:

 

{
  "connectors": [
    {
      "name": "Machine Connectivity",
      "version": "0.1",
      "type": "com.example.jb.connectivity",
      "path": "jbconnectivity",
      "capabilities": [],
      "activeConnectionInstances": [],
      "actions": [
        {
          "name": "pingMachine",
          "displayName": "pingMachine",
          "path": "pingMachine",
          "inputs": [
            {
              "name": "machineName",
              "type": "String",
              "required": true
            }
          ],
          "outputs": [
            {
              "name": "Result",
              "type": "String"
            }
          ]
        }
      ]
    }
  ]
}

 

We start with an element called connectors which is an array of the different connectors. Each connector translates to a category in the workflow palette.

 

There are a few properties you need to set; notice the path, which you’ll need later at run time. capabilities is a list of tags describing what the API offers; we’ll look into this later. activeConnectionInstances is a list of the available instances, like Dev and Prod, in case you use different environments. Next we’ve got the actions; this is an array, so each connector can have more than one action. This part is fairly self-explanatory: there’s the name, and you’ve got the inputs and the outputs. Here’s what this looks like on the workflow panel:

 

 

Like the health check, we need to define the request as part of the controller. Since we’re dealing with JSON I’m using org.json. Here’s the code:

 

@RequestMapping(value="/rcf/descriptor", method = RequestMethod.GET)
  public String descriptor() {

  JSONObject jsonDescription = new JSONObject();
  JSONArray jsonArrConnectors = new JSONArray();

  JSONObject jsonConnectionInstance = new JSONObject();
    jsonConnectionInstance.put("id", "jbconnectivity-1");
    jsonConnectionInstance.put("name", "Production");
  JSONArray jsonArrConnectionInstance = new JSONArray();
    jsonArrConnectionInstance.put(jsonConnectionInstance);

  JSONArray jsonArrCapabilities = new JSONArray();
    jsonArrCapabilities.put("com.bmc.dsm.catalog:datasetProvider");

  JSONObject jsonAction = new JSONObject();
    jsonAction.put("name", "pingMachine3");
    jsonAction.put("displayName", "pingMachine3");
    jsonAction.put("path", "pingMachine3");
  JSONArray jsonArrAction = new JSONArray();
  jsonArrAction.put(jsonAction);

  JSONObject jsonActionInput = new JSONObject();
    jsonActionInput.put("name", "machineName");
    jsonActionInput.put("type", "String");
    jsonActionInput.put("required", true);
  JSONArray jsonArrActionInput = new JSONArray();
    jsonArrActionInput.put(jsonActionInput);
  JSONObject jsonActionOutput = new JSONObject();
    jsonActionOutput.put("name", "Result");
    jsonActionOutput.put("type", "String");
  JSONArray jsonArrActionOutput = new JSONArray();
    jsonArrActionOutput.put(jsonActionOutput);

  jsonAction.put("inputs", jsonArrActionInput);
  jsonAction.put("outputs", jsonArrActionOutput);

  jsonAction.put("com.bmc.dsm.catalog", JSONObject.NULL);

  JSONObject jsonIndividualConnector = new JSONObject();
    jsonIndividualConnector.put("name", "Machine Connectivity 3");
    jsonIndividualConnector.put("version", "0.1");
    jsonIndividualConnector.put("type", "com.example.jb.connectivity");
    jsonIndividualConnector.put("path", "jbconnectivity");
    jsonIndividualConnector.put("capabilities", jsonArrCapabilities);
    jsonIndividualConnector.put("activeConnectionInstances", jsonArrConnectionInstance);
    jsonIndividualConnector.put("actions", jsonArrAction);
    
  jsonArrConnectors.put(jsonIndividualConnector);
  jsonDescription.put("connectors", jsonArrConnectors);

  return jsonDescription.toString();
}

 

I’m hardcoding a lot of the values, that’s a deliberate choice to make it readable. In the next blog post we’ll add the connector logic. For now, let’s have a look at some of the details of the API:

 

If we access the URL via /rcf/description, this is the JSON I get back:

 

{
  "connectors": [
    {
      "name": "Machine Connectivity",
      "version": "0.1",
      "type": "com.bmc.example.jbconnectivity",
      "path": "jbconnectivity",
      "capabilities": [
        "com.bmc.dsm.catalog:datasetProvider"
      ],
      "activeConnectionInstances": [
        {
          "id": "jbconnectivity-1",
          "name": "Production"
        }
      ],
      "actions": [
        {
          "name": "pingMachine",
          "displayName": "pingMachine",
          "path": "pingMachine",
          "inputs": [
            {
              "name": "machineName",
              "type": "String",
              "required": true
            }
          ],
          "outputs": [
            {
              "name": "Result",
              "type": "String"
            }
          ],
          "com.bmc.dsm.catalog": null
        }
      ]
    }
  ]
}

 

That’s enough for Catalog to identify the connector and add it to the workflow. Obviously, it won’t do anything as we haven’t defined any requests for the actions yet, but let’s add it anyway to see what it does:

The name is just for you to identify the server; notice that the URL uses the root. The username and password are used for Basic HTTP authentication; if you’re not using it, just enter some dummy values. If all goes well, the actions are added to the palette and I can use them in my workflow. You might have noticed the field Connection Instance Id, a required field which is added automatically. This has to be the ID of the active connection instance (not the path) as defined in the descriptor. It needs to be an exact match, or the requests will not be sent. In my example it’s jbconnectivity-1.

 

That takes care of the design, Catalog can read everything and we can start building our workflow.

 

 

But we’re not there yet: we haven’t defined anything that happens at run time. When Catalog processes the workflow it executes the actions we defined. This results in Catalog sending out requests to our API, and we need to make sure the API responds appropriately. Here are the calls:

 

POST /{connector_path}/com.bmc.dsm.catalog:checkHealth

 

This is called periodically on each connection to verify its availability. The server is supposed to run a check to see if everything is okay and reports back to Catalog. An Administrator can trigger the check as well.

 

POST /{connector_path}/{action_path}

 

If the activity is executed, this call will go out. The action_path matches the path you defined in the description JSON for the respective action.

 

Let's look at this in more detail. checkHealth is a POST request, so there’s an HTTP body with the payload:

 

POST /jbconnectivity/com.bmc.dsm.catalog:checkHealth 

{  
  "connectionInstanceId": "jbconnectivity-1",  
  "request": {}  
}  

 

Notice that this isn't part of /rcf anymore. The server is supposed to check the connectionInstanceId (you defined this in the description) and verify that everything is okay. Always respond with HTTP 200 and the following JSON:

 

{
  "connectionInstanceId": "jbconnectivity-1",
  "response": {
    "status": "CONNECTION_SUCCESS", //CONNECTION_FAILURE if there’s a problem.
    "message": null
  }
}

 

Here’s how I coded this in the controller. To keep it simple I’m always returning CONNECTION_SUCCESS; we’ll work this out in more detail later.

 

@RequestMapping(value = "/jbconnectivity/com.bmc.dsm.catalog:checkHealth", method = RequestMethod.POST)
@ResponseBody
public String checkHealth() {

  JSONObject jsonResponse = new JSONObject();
  jsonResponse.put("status", "CONNECTION_SUCCESS");
  jsonResponse.put("message", JSONObject.NULL);

  JSONObject jsonHealth = new JSONObject();
  jsonHealth.put("connectionInstanceId", "DEV");
  jsonHealth.put("response", jsonResponse);

  return jsonHealth.toString();
}

 

Which leaves us with the connector actions. This is what my ping action request looks like; it will be generated by Catalog:

 

POST /jbconnectivity/pingMachine

{
  "inputs": {
    "machineName": "clm-aus-12345"
  },
  "connectionInstanceId": "jbconnectivity-1"
}

 

Notice again that there's no /rcf prefix. Catalog populates the inputs with the values of the input fields set in the workflow. What I need to do is respond with the following JSON:

 

{
  "outputs": {
    "result": "OK"
  }
}

 

The exact format depends on what you defined in the descriptor. I just have one field defined here: a field of type String called result.

 

Here’s the code in the controller. Again, the hardcoded values are purely for illustrative purposes. We’ll make this more dynamic in the next article.

 

@RequestMapping(value = "/jbconnectivity/pingMachine", method = RequestMethod.POST)
@ResponseBody
public String checkPing() {

  JSONObject jsonInputs = new JSONObject();
  jsonInputs.put("result", "OK");

  JSONObject jsonPing = new JSONObject();
  jsonPing.put("outputs", jsonInputs);

  return jsonPing.toString();
}

 

That’s all you need to get this working. My examples are all fairly straightforward, but I hope you appreciate the flexibility and, more importantly, the possibilities the connector offers. It can be as simple as a TCP ping and as complicated as the front end for a SOAP integration with a custom payroll system. The principles remain the same.

 

What would you do with Remote Server Integration? It’s very versatile: if you define the API correctly you can get it to execute your Java code. It doesn’t necessarily have to be a connection to an external system; it could also be an extension of the functionality. All you need to define are inputs and outputs.

 

The API we’ve written so far is basic and mostly contains hardcoded values; we’d need to build the logic to get it to do anything. Let’s continue with the example we’ve been using so far: a network ping to determine if machines are alive or not. I’ve used this example before for an Innovation Suite action; let’s check how this would work for a Remote Server integration. But we need to leave this for the next article.

 

I am interested in your feedback. What do you think of the integration possibilities? Would you use this, does this article help you on your way? Maybe you have different or better use cases. I’d love to hear from you, leave a comment below.

 

Best Wishes,

Justin

 



Of course services are supposed to work correctly. After all, you designed them yourself and put a lot of thought into them. You did your homework and it shows: they look the way you want them to, and they create work orders to set up new PCs, for example, or alert teams on infrastructure changes, initiate onboarding processes, even book your holiday requests. You might even have the questions translated into various languages, from Welsh to Chinese. But something has gone amiss. The work order records are created, but they always show up with the wrong status. You check again: it’s definitely not you, everything looks right. What could it be? The whole service is down, and fixing it is time-critical. Tick tock.

 

Luckily for you, Catalog is pretty good at telling you what’s happening to those services your users are creating. I would maybe even go as far as to say that it’s really good at it. If your tool-of-choice is SRM you know that logs are difficult to avoid. The wrong status shows up in your request? Start with the combined SQL, API and Filter logs. Requests get stuck? Give me the combined SQL, API and Filter logs. The value for the Summary field not populated correctly? SQL, API and Filter logs will tell us why.

 

Is this any better in Catalog? As a matter of fact, it is. Logs have their place but it's not the place you start in Catalog. Not convinced? Let’s have a look. The core of your service is the workflow, this tells you how the questions provided by the end users are handled and, more importantly, how the backend process is fulfilled. This is where the work order records and incidents are created. Here’s my workflow:

 

[Workflow screenshot: Phaser bank alignment request]

 

Looks pretty good, right? If my phaser banks are not aligned properly, my service will make sure the right team gets engaged straight away. Let’s create a request for the phaser relay on deck 47A:

 

 


 

According to my workflow a few things should happen: a work order record should be created if the priority is deemed low or medium. If it’s high or critical a record should be submitted to the external senior command system for further evaluation. But there’s something going wrong here, when the work order is cancelled, the Catalog request does not reflect this and is still stuck in In Progress.

 


 

How can you work out what’s happening? You need to look at the flow and understand how the service requests are handled. But the first step doesn’t have to be logs, it’s handled differently in Catalog. First things first, let’s have a look at the report with all the recent requests:

 


 

Now go to the Actions menu and click on View Process. This will show you the workflow it’s currently processing. Here’s mine:

 

 

 


 

 

The activities with a dark grey border have been processed successfully; the activity with the green border is where we currently are. We can easily see that it informed the security maintenance team, then navigated the gateway to create the work order. This was also successful, but it’s still stuck at Wait for Completion.

This now allows me to see what values are passed between the activities. If I click on Create Work Order I can see the output:

 

{
  "instanceId": "AGGAA5V0FLH5PAP9WE46P8ZAJTDPZ3",
  "requestId": "000000000000611",
  "workOrderId": "WO0000000001016"
}

 

It’s creating the work order record correctly and looking at it I can see it’s already cancelled, so I need to look at the workflow in more detail. The Wait for Completion construction is basically a loop which pauses the workflow until it meets the condition. So let’s have a look at the variable that’s used for this. I can do this via the tab with the gear icon:

 


 

That should match my condition. Let’s go back to the original workflow and double check this:

 


 

So there’s our problem: my condition used the American spelling, Canceled (one L), whereas ITSM favours the British spelling, Cancelled (two Ls). The condition is never met, so we’re stuck in the loop.

 

I realise this is a fairly straightforward root cause, but I hope you appreciate it didn’t take us long to find it. Using the graphical overview of the workflow process we quickly established the flow and the point where it got stuck. From there we were able to work out the various variable values, which allowed us to concentrate on why the request wasn’t progressing.

 

But what if looking at the workflow isn’t enough? What if this doesn’t explain why it’s not processing? What if you need to go deeper? There’s always the option to go through the logs. Catalog’s main log file is bundle.log which is generated in the db folder. This will record the service requests from DWP with all the values but it won’t track the workflow process. So you can find the JSON code generated by the DWP request with all the values and the ID of the request but not what’s happening afterwards. It will however record any exception that might occur. We are planning to introduce some dedicated process logs in a future release which will allow us to track the workflow processes via logs. As soon as this is available I’ll dedicate another blog article to debugging your workflow processes via logs, so stay tuned!

 

But at least I hope this gives you some idea of how to get started when there’s a problem with the workflow processes. The visual approach allows you to quickly assess the point where it’s stuck before you drill down into more detail. So the next time your services don’t quite behave the way you expect them to, you know where to look. And if you’re stuck? Well, you just have to give us a shout.

 

Best wishes,

Justin


We now call it DWP Catalog, but you probably know it as Service Broker. Like MyIT it has been renamed. But don’t worry, it’s still the same product you know and love. Right, now that we’ve got that clarified, let’s have a look at one of the big changes: what are we to do with Integration Service?

 

As a seasoned DWP Catalog veteran I’m sure you’ve built your fair share of services. The meaty part of designing services is the workflow designer which offers a flexible way of describing fulfilment processes.  The graphic representation of the workflow allows us to construct and visualise rather complicated processes. Among the collection of activities, tasks and links there are of course the connectors.

 

Connectors make it possible to interact with external systems which opens DWP beyond just ITSM and gives us greater flexibility. That’s nothing new – the connectors have been part of the product since it was first introduced. But since 18.02 we’ve made it possible to use Integration Service. Question is, what are you going to do with it?

 

First things first: the existing built-in connectors won’t go away, they’ll stay right where they are and they will be maintained. All your existing workflows will continue to work and there’s nothing you need to change. But it’s now also possible to use Integration Service.

 

Confusing? Perhaps, so let me try to clarify: Integration Service is not part of DWP; it’s a separate cloud-based solution which allows for integration between different platforms and applications. The service connects different applications residing in cloud and on-premise environments. For example, you can connect ITSM to Remedyforce, Salesforce to Gmail, etc. These connections run on the platform (we call them flows); you don’t have to define them from an application perspective, meaning they’re not built into ITSM or Remedyforce, they reside on the platform. I could dedicate a whole article to Integration Service, but I’ll save that for another time. If you want to know more about what the platform has to offer, read my other blog.

 

Besides defining connections on the platform itself, it’s also possible to use Integration Service from different BMC applications. Innovation Suite is the best-known example of this where we use the Service to connect to external (on-premise) applications. The same principle is used for DWP Catalog: we can now use connectors running on Integration Service directly from the workflow designer.

 

Want to know how to do this? Let’s have a look! Since Integration Service is not part of DWP, you need to have an account for an Integration Service instance. DWP Advanced customers, both SaaS and on-premise customers on the latest version who are current with maintenance, have access to Integration Service.

 

Log into Integration Studio using the Integration Service URL. There are a few things we need to do here first. We use the concept of Sites: a Site provides the connection to a particular customer environment. Since Integration Service is cloud-based it needs a way to connect to systems and applications in local (on-premise) networks. This is what a Site provides; it’s the gateway to your network. Log into Integration Studio, go to Sites and set up a Production Site:

 


 

The Controller is a client which you install on a Linux server in your local network. This sets up the secure gateway to Integration Service. It allows outbound-only traffic, so customers do not need to open their network to inbound traffic: a modern cloud-based approach to integration. The Get Controller option will help you install it. Once that's done you need to configure the connectors. Integration Service offers a variety of connectors; to use one you need to configure it so it knows what Site to use and what the server details are. To see what's available, go to Catalog and click on Connectors:

 


 

These are the connectors which you can use. Let's have a look at the Remedy ITSM Connector: look it up in the list and check the details. You can see the Triggers and the Actions; since we're using the connector from DWP Catalog we can only use the Actions. The Triggers only work if you build a Flow directly on Integration Service, so you can ignore these for now. Before we can use the connector we need to configure it. Click on Configuration and then on Add Connection Configuration:

 


 

This is where you tell the connector how to connect to your instance: you select your Site and specify the AR Server and Port. The connector also needs to know what user account to use; you do this by adding an Account:

 


 

The credentials are validated at this point, so we know for sure everything works okay. And once that's all done, we're ready to go. That’s everything we need to do on the Integration Service side. Let’s check what’s required on the DWP side.

 

DWP Catalog needs to know where it can find Integration Service, that's done via a configuration option in the Application Settings menu:

 


 

You use the URL of the Integration Service instance; I'm using the developer instance for demonstration purposes. Once that is done you're ready to use the connectors in your workflow, so let's create a new Service and attach a new process. Notice the Palette on the left: there's a new activity called Connector. This links directly to the Integration Service connectors; you can use multiple Connector activities and link them to different Integration Service connectors.

 


 

Just to keep it simple, I'm creating a simple process which creates an incident. I add the Connector activity and tell it which Integration Service connector to use. The Configuration refers to the configuration we set up in Integration Service (Site, AR Server, username, etc.). Next I choose the Action and fill in the parameters.

 


 

And that's it! Save, publish and start using it. Obviously my example here is very basic, in a real-world scenario the process would look a lot more complex. But I hope this at least gives you an idea how to get started with this.

 

I know what you're thinking: that’s all very well, but why would you want to use Integration Service at all? Why not just stick to the existing connectors? The main problem with the in-process connectors is that there’s no SDK: you can’t develop your own connectors, so if you needed a connector for a new external application or specific functionality, that wasn’t possible prior to 18.02. But since we use an external platform we have far greater freedom: you are free to write your own connectors and call these from DWP Catalog. Besides connecting to external systems, we are also working on an SDK which will allow you to extend DWP Catalog’s integration capabilities; I'll write another blog article when that's available. With more people using Integration Service we’ll see more and more connectors, extending the capabilities even further.

 

Just a few things to be aware of: Integration Service is optional for DWP Advanced customers and it’s available for both on-premise customers and DWP Advanced Cloud. It's a cloud-based platform, so even if you’re using DWP on premise the API calls will flow through the cloud. In terms of security, we make sure all data in transit is encrypted using TLS and nothing is stored during transit; the connectors are designed to be non-invasive.

 

And that’s all I have for now. I’d encourage you to consider using Integration Service when connecting with external applications. If you plan to write your own connectors, make sure to read my other blog article. And if you get stuck using the existing connectors in DWP Catalog? Well, you just have to let us know and we’ll do our very best to help you out. You can use the Integration Service community page, the DWP community page or raise a support case.

 

We'll have a more detailed look at DWP Catalog and Integration Service in a future article, so stay tuned!

 

Until next time,

Justin

 

Eager to get started? Here's how:

 

  • For existing DWPA customers: Create a support ticket and the support team will provide access
  • For Professional Services: Register at developer.bmc.com and request access to a sandbox of Integration Service and Innovation Suite
  • For new customers without DWPA: Customers can get a trial instance after getting approval from Product Management
