
The new branding interface replaces the previous process and eliminates the procedure that required you to update and copy the CSS to apply your branding changes. Use a convenient interface to customize the logo, name, and colors to your requirements. The new responsive user interface makes your changes seamlessly available across desktop and mobile devices. Because the previous tenant.css is no longer applicable, you will need to apply your branding using the new branding interface.

[Image: Rebranding preview]

In addition to the above branding options, you can use the BMC Digital Workplace Admin Console to easily hide the following features from your end users:

  • Shopping Cart
  • Catalog Ratings and Reviews
  • Knowledge Article Feedback
  • Similar Knowledge Articles
  • Social Activities
  • Posts

For more information, see Rebranding BMC Digital Workplace - Documentation for BMC Digital Workplace Advanced 19.02. For a complete list of new and updated features that have been released, see 19.02.00 enhancements - Documentation for BMC Digital Workplace Advanced.


BMC Digital Workplace provides a new responsive user interface in the 19.02 release. The new UI is built as a Progressive Web App (PWA) and supports desktop and mobile devices.

  • All features and branding changes are seamlessly made available on desktop and mobile devices.
  • An iOS native app that supports push notifications and QR code scanning is available.

Client deployment flowchart

Use the following chart to see the available client deployment routes.

[Image: Client deployment options in 19.02]

Mobile device options

All features, such as push notifications and QR code scanning, continue to work in this release on the supported mobile clients. See below for a consolidated list of the options available on your mobile devices:

[Image: Mobile deployment options]

For a complete list of new and updated features that have been released, see 19.02.00 enhancements - Documentation for BMC Digital Workplace Advanced.


BMC Digital Workplace 19.02 is now available. This release delivers significant new functionality that customers have requested, including enhancements to the end-user experience and catalog capabilities.

 

All of the following enhancements are available for BMC Digital Workplace Advanced. Enhancements marked with an asterisk (*) are also available for BMC Digital Workplace Basic.

Revamped User Interface

  • Progressive Web Application (PWA)*: Dynamic "app-like" experience in your device browser or on your desktop.
  • Active and Past Events*: Quickly view current and past events via the My Activity page.
  • Bundle details: View in-depth bundle details.
  • Multiple approval view*: View all pending approvals simultaneously.
  • Search functionality*: Select previous search results and refine by filtering.
  • Optional Social functionality*: Open application preferences to view social activity.
  • Service health tab: Services now have their own dedicated tab.

Digital Workplace Admin enhancements

  • Rebrand the PWA User Interface*: Quickly rebrand the PWA user interface; changes are automatically reflected across desktop and mobile clients. The rebranding changes are visible on login. Note: the PWA on iOS cannot currently receive push notifications; download the "Digital Workplace" app from the App Store to enable this functionality.

    [Images: Mobile view, Desktop view]

  • Build a custom home page: Build your own custom page and modify layouts.
  • Hide functionality*: Hide additional functionality via the DWP Client Administration console.
  • Collaborator preferences: Create and load a default collaboration group.
  • Change Log Level*: Quickly change the log level in both the DWP Client* and Catalog.
  • Questionnaire Tags: Add internal tags for use in workflow.
  • Copy Catalog questionnaire: Optionally copy questionnaires with workflow.
  • Date restrictions for questionnaires: Choose the active duration.

Compatibility

 

For version details of Remedy ITSM, Remedy AR System, HR Case Management, Client Management, Atrium CMDB, and Cloud Lifecycle Management, please refer to our Compatibility Matrix.

Next Steps

 

For a comprehensive summary of all enhancements, please refer to our Digital Workplace Advanced and Basic documentation.


This article follows up on a previous article where we looked at setting up the API to integrate DWP Catalog with remote servers. We looked at the connector description and the various calls that need to be defined. If you followed along you hopefully ended up with a working interface.

 

But that’s all it is for now: an interface. A lot of the values we’re returning are hardcoded and it doesn’t do anything at the moment. What I want to do next is explain how to build a full integration. We’ll use the same interface, so if you haven’t already done so, make sure to read my first blog article.

 

What are we going to build? It can’t be too complicated, so what I propose is a simple ping activity. The basic idea is that you can invoke the action to check whether a server is online. It sends out a ping request and reports back whether the server can be reached. You could use this in your workflow to check if a server can be reached and, depending on the outcome, engage a specific team or escalate further. We’re going to output a Boolean, which will work nicely with Exclusive Gateways.

 

Just to remind ourselves, this is how the integration works:

 

 

What we are interested in is the connector logic. Our action will return a Boolean indicating if the machine is alive or not. Using the same overview, these are the relevant components:

 

 

Notice that I’m not calling an external server; I’m just executing code to do the ping, which is all Java. Let’s first write a simple Java class, just so we know what the code looks like and how it works. This is what I came up with:

 

package pingservicetest;

import java.io.IOException;
import java.net.InetAddress;
import java.net.Socket;


public class PingServiceTest {

  public static boolean isMachineAlive(String machineName, int machinePort) {
    try {
      // First attempt: an ICMP/TCP echo via InetAddress.isReachable
      InetAddress inet = InetAddress.getByName(machineName);
      if (inet.isReachable(500)) {
        return true;
      }
    } catch (IOException e) {
      // Unknown host or I/O problem: fall through to the socket check
    }

    if (machinePort == 0) {
      machinePort = 111;  // default port to probe if none was supplied
    }

    // Second attempt: try to open a TCP connection to the given port
    try (Socket s = new Socket(machineName, machinePort)) {
      return true;
    } catch (IOException ex) {
      return false;
    }
  }


  public static void main(String[] args) {
    System.out.println(isMachineAlive("clm-aus-01234", 0));
  }
}

 

I want to check if servers are available or not, so I need to execute some form of ping to confirm that they're alive. There's no native ping command in Java, but I can use InetAddress.isReachable. If I don’t get a response, I try an additional Socket connection. If I get a successful connection I return true; if not, I return false. I included the example at the end of this article, which you can run if you want to. It’s nothing fancy, but it’s a good example of the way you can extend Catalog’s functionality.

 

I don’t like to put everything in one big file, so to make it manageable (and mirror the original diagram) I’m splitting this into two classes:

 

  • RCFController handles the interface. Its primary role is to accept HTTP requests from Catalog and respond with the correct JSON code.
  • RCFLogic is new; this class contains the code which does the actual processing. In our case that’s the isMachineAlive method.

 

 

Let’s start with RCFLogic. We already wrote and tested our code so we just need to define the class:

 

package rcf;

import java.io.IOException;
import java.net.*;

public class RCFLogic {

  public static boolean isMachineAlive(String machineName, int machinePort) {
         …
  }
}

 

Nothing new there; it’s a typical Java class with no references to Spring or Catalog. All of this is separate, but it needs to be referenced properly: we need to know what the service accepts, what goes out, and what format this is in. The interface we used in the first article can stay largely unchanged, but I need to add the port as an input and return a Boolean instead of a String. Here’s the action part of the descriptor:

 

"name": "isMachineAlive",
"displayName": "isMachineAlive",
"path": "pingMachine",
"inputs": [
  {
    "name": "machineName",
    "type": "String",
    "required": true
  },
  {
    "name": "machinePort",
    "type": "Integer",
    "required": false
  }
],
"outputs": [
  {
    "name": "Result",
    "type": "Boolean"
  }
]

 

I accept machineName and machinePort and I return a Boolean called Result. I use the same org.json-based code as last time. Based on this, I know I'll get this HTTP request from Catalog:

 

POST http://server:8080/jbconnectivity/pingMachine HTTP/1.1

{
  "inputs": {
    "machinePort": null,
    "machineName": "clm-test-12345"
  },
  "connectionInstanceId": "jbconnectivity-1"
}

 

The request is generated by Catalog, but it bases this on the definition which I supplied and uses the values set during workflow creation. I write my code in the controller to handle this. I also know what the HTTP response should be:

 

HTTP/1.1 200

{
  "outputs":
  {
    "result":true
  }
}

 

With some help from org.json and Spring this is the code I came up with:

 

@RequestMapping(value = "/jbconnectivity/pingMachine", method = RequestMethod.POST)
@ResponseBody
public String checkPing(@RequestBody String payload) {
  JSONObject jsonPayload = new JSONObject(payload);
  String machineName = jsonPayload.getJSONObject("inputs").getString("machineName");
  int machinePort = 0;

  try {
    machinePort = jsonPayload.getJSONObject("inputs").getInt("machinePort");
  } catch (JSONException e) {}

  JSONObject jsonInputs = new JSONObject();
  jsonInputs.put("result", RCFLogic.isMachineAlive(machineName, machinePort));
  JSONObject jsonPing = new JSONObject();
  jsonPing.put("outputs", jsonInputs);


  return jsonPing.toString();
}

 

I first accept the payload from the POST request, which I convert into a JSON object (jsonPayload). I then extract the machine name and the port (the try/catch block is just to deal with the null value). Once I have all the values, I create the JSON I want to respond with (jsonPing and jsonInputs). The call to RCFLogic.isMachineAlive is where my own logic is invoked.
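As an aside, for a response this small you don't strictly need a JSON library; the same shape could be assembled by hand. A minimal sketch (the class and method names here are mine, not part of the attached example):

```java
public class ResponseDemo {

    // Builds the {"outputs":{"result":<boolean>}} body that Catalog
    // expects back from the action, matching the HTTP response example.
    static String pingResponse(boolean alive) {
        return "{\"outputs\":{\"result\":" + alive + "}}";
    }

    public static void main(String[] args) {
        System.out.println(pingResponse(true)); // → {"outputs":{"result":true}}
    }
}
```

Using org.json as in the controller above is still the safer choice once values can contain characters that need escaping.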

 

That’s all I’m going to do. Because I don’t connect to an external system, I don’t have to do any health checks, so I’m just going to keep the hardcoded values for checkHealth. Sure, there are things that could be improved; I’ve done some housekeeping in the actual example attached at the end of the article, but I hope you get the principle.

Okay, let’s give it a go! I’m keeping the workflow as simple as possible: no fancy conditions, just two questions and one activity.

 

 

Curious if it’s going to work? Let’s log into DWP and create a new service request:

 

 

Because my workflow doesn’t actually create any requests in a backend system, I’m just going to have a look at the Service Requests report on the Catalog side:

And lo and behold, it works! Catalog executed the action and sent a POST request to the API as per the definition. My code picked up the POST request and called the isMachineAlive method in the RCFLogic class. And the result was passed back to DWP.

 

I know, this is fairly basic, but I hope you agree that it works quite well. We’re separating Catalog from the external integration by using an API which adheres to strict standards. As long as I define the interface correctly, it’s up to me to respond to the incoming HTTP requests; it doesn’t really matter what you do in the backend to process them. It can be as simple as extending the functionality with a few custom actions that calculate a week number based on a date, or as complicated as integrating with SOAP-based web services. Keep in mind, though, that this doesn't allow you to interact with DWP directly: you can't write a service which pushes values to DWP, it all happens in the background. But it’s up to you, and I look forward to seeing what you will come up with. Any questions or feedback? Leave a comment below.

 

Best wishes,

Justin


It won’t come as a surprise to you that Catalog integrates seamlessly with ITSM. It’s easy to create work order and incident records. As a backend fulfilment system, integration is key and Catalog interacts with various systems using out-of-the-box connectors like Active Directory, Flexera or MS Office. For most other needs Catalog uses Integration Service which makes it possible to interact with a whole range of other connectors and even allows you to write your own.

 

This should take care of the majority of your integration needs, so is there really a need for the ability to integrate directly with remote servers? This is a recent feature, added in 18.11, which makes it possible for Catalog to connect to external servers using an HTTP-based interface. Is there any benefit to this? I’d argue there is: while there’s more work involved in building your own interface, there is also greater flexibility.

 

I really like Integration Service for connecting different cloud-based applications, but it has its drawbacks: it is a cloud-based solution, so even if you want to connect your on-premise Catalog server to your local application, you have to do this via the cloud. Integration Service also requires a license, and it might not be the right fit if you’re looking for a small integration or extension.

 

Direct integration with remote servers resolves this. It puts you in control of the integration, both in terms of design and development and in terms of hosting location, without any need for an intermediate connection to the cloud. It also suits small-scale extensions: if you want to add some specific functionality or integrate with a local application, this might be the solution for you.

 

True, in most cases the combination of out-of-the-box connectors plus Integration Service is enough. But for a backend fulfilment system, integration is paramount and we’ll make every effort to ensure you’re successful.

 

Does this mean that it’s going to be easy to set up? I can’t promise you that. The truth is that you have to build everything yourself: there’s no framework and there are no connectors. You build the server, you design the API, and all of this has to adhere to a strict standard. Daunting? I agree, but still doable. And I’m here to help.

 

Let’s first talk about the various components. I’m going to deviate slightly from the official documentation as I find it easier to explain it this way. Here’s how I see the integration:

DWP uses Catalog as its backend fulfilment system: users submit requests and Catalog handles the workflow process as part of the service. Catalog can integrate using out-of-the-box components, Integration Service, and Remote Connectors. The Remote Connector is a web-based application which runs on a web server. The key part of this is the HTTP-based API; that’s the part Catalog interacts with. The interface adheres to a specific standard: certain requests will be made, and Catalog expects responses in a certain format.

The API in turn uses the connector logic to do whatever is required: calculate something, return data, create requests, and so on. It’s up to you how you want to do this. Notice I used a dashed line for the external system: the primary reason to build a connector would be to communicate with an external system, but this doesn’t necessarily have to be the case. If you want to return week numbers based on a date, that’s absolutely possible.

 

As far as Catalog is concerned it’s communicating with the API; it doesn’t really matter to Catalog what’s happening further upstream. So what we need to do first is define this interface. I’m going to show you how to build a full connector, but I will spread this over two articles; in this first article I’m only going to build the API part. I’m not actually going to build the connector logic or integrate with an external system, as this would just overcomplicate things and stand in the way of explaining the connector properly. We’ll leave that to the second article. This is what we’re going to build:

We have to do everything from scratch. In order to get this API working we need to write a web-based application which accepts certain HTTP requests and responds in a certain way. These are the calls we need to get the service registered:

 

GET /rcf/description

 

This request is sent when you set up a remote connector in the Catalog configuration; it’s the first request that goes out. If you’re familiar with SOAP web services, this is the equivalent of a WSDL: it describes the service and lists what is on offer, what it expects, and what it will send back in return. Catalog uses this information to build the service. As with a WSDL, the format is specific and you need to make sure you follow it exactly.

 

GET /rcf/health

 

This is the second call that goes out when registering the server, it validates the server. It should respond with a simple OK.

 

Those are all the calls we need to get everything registered, so before we go any further let’s get them defined. I’m using the Spring framework for the web server: it’s easy enough to spin up and it offers the flexibility I am looking for. I am using Maven as my build tool of choice, but just to clarify, there are no requirements here; if you want to use something else, you can. My interface communicates via JSON, and to keep it simple and readable I decided to use the org.json library; there are of course other options.

 

If you decide to follow along, download and install Maven and set up an empty project with the appropriate Spring references (see below for references). Then run these commands to build and start everything:

 

mvn clean package
java -jar target/gs-rest-service-0.1.0.jar 

 

This starts a web server which I can access via http://myserver:8080/

 

Here are the files I am using:

  • src\main\java\rcf\RCFController.java: main controller, contains request definitions
  • src\main\java\rcf\Application.java: Boot application, used by Spring to start application

 

In Spring HTTP requests are handled by a controller identified by the @RestController annotation. My controller handles the various requests for the RCF web application. Let’s start with the simplest call, /rcf/health:

 

@RequestMapping(value = "/rcf/health", method = RequestMethod.GET)
@ResponseBody
public String getHealth() {
  return "OK";
}

 

I specify that I accept a GET request, I define the URL, and I set the response: in our case a simple ‘OK’. That’s enough for this to work.

Let’s turn our attention to /rcf/description. This is an important call, here’s the overview of what we need:

 

{
  "connectors": [
    {
      "name": "Machine Connectivity",
      "version": "0.1",
      "type": "com.example.jb.connectivity",
      "path": "jbconnectivity",
      "capabilities": [],
      "activeConnectionInstances": [],
      "actions": [
        {
          "name": "pingMachine",
          "displayName": "pingMachine",
          "path": "pingMachine",
          "inputs": [
            {
              "name": "machineName",
              "type": "String",
              "required": true
            }
          ],
          "outputs": [
            {
              "name": "Result",
              "type": "String"
            }
          ]
        }
      ]
    }
  ]
}

 

We start with an element called connectors which is an array of the different connectors. Each connector translates to a category in the workflow palette.

 

There are a few properties you need to set; notice the path, which you’ll need later at run-time. capabilities is a list of tags describing what the API offers; we’ll look into this later. activeConnectionInstances is a list of the available instances, like Dev and Prod; this is used in case you have different environments. Next we've got the actions; this is an array, so each connector can have more than one action. This part is fairly self-explanatory: there's the name, and you've got the inputs and the outputs. Here's what this looks like on the workflow panel:

 

 

Like the health check we need to define the request as part of the controller. Since we’re dealing with json code I am using org.json. Here’s the code:

 

@RequestMapping(value = "/rcf/description", method = RequestMethod.GET)
@ResponseBody
public String description() {

  JSONObject jsonDescription = new JSONObject();
  JSONArray jsonArrConnectors = new JSONArray();

  JSONObject jsonConnectionInstance = new JSONObject();
    jsonConnectionInstance.put("id", "jbconnectivity-1");
    jsonConnectionInstance.put("name", "Production");
  JSONArray jsonArrConnectionInstance = new JSONArray();
    jsonArrConnectionInstance.put(jsonConnectionInstance);

  JSONArray jsonArrCapabilities = new JSONArray();
    jsonArrCapabilities.put("com.bmc.dsm.catalog:datasetProvider");

  JSONObject jsonAction = new JSONObject();
    jsonAction.put("name", "pingMachine");
    jsonAction.put("displayName", "pingMachine");
    jsonAction.put("path", "pingMachine");
  JSONArray jsonArrAction = new JSONArray();
  jsonArrAction.put(jsonAction);

  JSONObject jsonActionInput = new JSONObject();
    jsonActionInput.put("name", "machineName");
    jsonActionInput.put("type", "String");
    jsonActionInput.put("required", true);
  JSONArray jsonArrActionInput = new JSONArray();
    jsonArrActionInput.put(jsonActionInput);
  JSONObject jsonActionOutput = new JSONObject();
    jsonActionOutput.put("name", "Result");
    jsonActionOutput.put("type", "String");
  JSONArray jsonArrActionOutput = new JSONArray();
    jsonArrActionOutput.put(jsonActionOutput);

  jsonAction.put("inputs", jsonArrActionInput);
  jsonAction.put("outputs", jsonArrActionOutput);

  jsonAction.put("com.bmc.dsm.catalog", JSONObject.NULL);

  JSONObject jsonIndividualConnector = new JSONObject();
    jsonIndividualConnector.put("name", "Machine Connectivity");
    jsonIndividualConnector.put("version", "0.1");
    jsonIndividualConnector.put("type", "com.example.jb.connectivity");
    jsonIndividualConnector.put("path", "jbconnectivity");
    jsonIndividualConnector.put("capabilities", jsonArrCapabilities);
    jsonIndividualConnector.put("activeConnectionInstances", jsonArrConnectionInstance);
    jsonIndividualConnector.put("actions", jsonArrAction);

  jsonArrConnectors.put(jsonIndividualConnector);
  jsonDescription.put("connectors", jsonArrConnectors);

  return jsonDescription.toString();
}

 

I’m hardcoding a lot of the values, that’s a deliberate choice to make it readable. In the next blog post we’ll add the connector logic. For now, let’s have a look at some of the details of the API:

 

If we access the URL via /rcf/description, this is the JSON code I get back:

 

{
  "connectors": [
    {
      "name": "Machine Connectivity",
      "version": "0.1",
      "type": "com.example.jb.connectivity",
      "path": "jbconnectivity",
      "capabilities": [
        "com.bmc.dsm.catalog:datasetProvider"
      ],
      "activeConnectionInstances": [
        {
          "id": "jbconnectivity-1",
          "name": "Production"
        }
      ],
      "actions": [
        {
          "name": "pingMachine",
          "displayName": "pingMachine",
          "path": "pingMachine",
          "inputs": [
            {
              "name": "machineName",
              "type": "String",
              "required": true
            }
          ],
          "outputs": [
            {
              "name": "Result",
              "type": "String"
            }
          ],
          "com.bmc.dsm.catalog": null
        }
      ]
    }
  ]
}

 

That’s enough for Catalog to identify the connector and add it to the workflow. Obviously, it won’t do anything as we haven’t defined any requests for the actions yet, but let’s add it anyway to see what it does:

The name is just for you to identify the server; notice that the URL uses the root. The username and password are used in case of Basic HTTP authentication; if you’re not using them, just enter some dummy values. If all goes well, the actions are added to the palette and I can use them in my workflow. You might have noticed the field Connection Instance Id, a required field which is added automatically. This has to contain the ID of the active connection instance (not the path) as defined in the descriptor; it needs to be an exact match, or the requests will not be sent. In my example it’s jbconnectivity-1.
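Catalog also includes the connectionInstanceId in every run-time payload, so it can help to validate it on the server side against the instances declared in the description. A minimal sketch, using the instance ID from this example (class and method names are mine):

```java
import java.util.Set;

public class InstanceIdCheck {

    // Instance IDs declared in the descriptor's activeConnectionInstances array
    private static final Set<String> KNOWN_INSTANCES = Set.of("jbconnectivity-1");

    // Returns true only for an ID that exactly matches a declared instance
    static boolean isKnownInstance(String connectionInstanceId) {
        return connectionInstanceId != null
                && KNOWN_INSTANCES.contains(connectionInstanceId);
    }

    public static void main(String[] args) {
        System.out.println(isKnownInstance("jbconnectivity-1")); // true
        System.out.println(isKnownInstance("jbconnectivity-2")); // false
    }
}
```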

 

That takes care of the design; Catalog can read everything and we can start building our workflow.

 

 

But we’re not there yet: we haven’t defined anything that happens at run-time. When Catalog processes the workflow it executes the actions we defined; this results in Catalog sending requests to our API, and we need to make sure the API responds appropriately. Here are the calls:

 

POST /{connector_path}/com.bmc.dsm.catalog:checkHealth

 

This is called periodically on each connection to verify its availability. The server is supposed to run a check to see if everything is okay and report back to Catalog. An administrator can trigger the check as well.

 

POST /{connector_path}/{action_path}

 

If the activity is executed, this call will go out. The action_path matches the path you defined in the description JSON for the respective action.
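Putting the two path segments together, the run-time URL for an action is simply the connector path followed by the action path, relative to the root URL registered in Catalog. A quick sketch (the helper name is mine):

```java
public class ActionUrlDemo {

    // Run-time action URLs are {connector_path}/{action_path} under the
    // registered root URL; note there is no /rcf prefix at run-time.
    static String actionUrl(String rootUrl, String connectorPath, String actionPath) {
        return rootUrl + "/" + connectorPath + "/" + actionPath;
    }

    public static void main(String[] args) {
        System.out.println(actionUrl("http://myserver:8080", "jbconnectivity", "pingMachine"));
        // → http://myserver:8080/jbconnectivity/pingMachine
    }
}
```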

 

Let's look at this in more detail. checkHealth is a POST request, so there’s an HTTP body with the payload:

 

POST /jbconnectivity/com.bmc.dsm.catalog:checkHealth 

{  
  "connectionInstanceId": "jbconnectivity-1",  
  "request": {}  
}  

 

Notice that this isn't part of /rcf anymore. The server is supposed to check the connectionInstanceId (you defined this in the description) and verify that everything is okay. Always respond with HTTP 200 and with the following JSON:

 

{
  "connectionInstanceId": "jbconnectivity-1",
  "response": {
    "status": "CONNECTION_SUCCESS",
    "message": null
  }
}

(Return CONNECTION_FAILURE as the status if there’s a problem.)

 

Here’s how I coded this in the controller. To keep it simple I’m always returning CONNECTION_SUCCESS; we’ll work this out in more detail later.

 

@RequestMapping(value = "/jbconnectivity/com.bmc.dsm.catalog:checkHealth", method = RequestMethod.POST)
@ResponseBody
public String checkHealth() {

  JSONObject jsonResponse = new JSONObject();
  jsonResponse.put("status", "CONNECTION_SUCCESS");
  jsonResponse.put("message", JSONObject.NULL);

  JSONObject jsonHealth = new JSONObject();
  jsonHealth.put("connectionInstanceId", "jbconnectivity-1");
  jsonHealth.put("response", jsonResponse);

  return jsonHealth.toString();
}

 

Which leaves us with the connector actions. This is what my Ping action request looks like. This will be generated by Catalog:

 

POST /jbconnectivity/pingMachine

{
  "inputs": {
    "machineName": "clm-aus-12345"
  },
  "connectionInstanceId": "jbconnectivity-1"
}

 

Notice again that there's no /rcf prefix. Catalog bases the values on the values of the input fields. What I need to do is respond with the following JSON:

 

{
  "outputs": {
    "result": "OK"
  }
}

 

The exact format depends on what you defined in the descriptor. I have just one field defined here: a field of type String called result.

 

Here’s the code in the controller. Again, the hardcoded values are purely for illustrative purposes; we’ll make this more dynamic in the next article.

 

@RequestMapping(value = "/jbconnectivity/pingMachine", method = RequestMethod.POST)
@ResponseBody
public String checkPing() {

  JSONObject jsonInputs = new JSONObject();
  jsonInputs.put("result", "OK");

  JSONObject jsonPing = new JSONObject();
  jsonPing.put("outputs", jsonInputs);

  return jsonPing.toString();
}

 

That’s all you need to get this working. My examples are all fairly straightforward, but I hope you appreciate the flexibility and, more importantly, the possibilities the connector offers. It can be as simple as a TCP ping or as complicated as the front end for a SOAP integration with a custom payroll system. The principles remain the same.

 

What would you do with Remote Server Integration? It’s very versatile: if you define the API correctly you can get it to execute your Java code. It doesn’t necessarily have to be a connection to an external system; it could also be an extension of the functionality. All you need to define are inputs and outputs.
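For instance, the week-number idea mentioned earlier needs no external system at all; the connector logic behind such an action could be as small as this (a hypothetical sketch using java.time; the class and method names are mine and not part of any attached example):

```java
import java.time.LocalDate;
import java.time.temporal.WeekFields;

public class WeekNumberDemo {

    // Hypothetical connector logic: map an ISO date string (yyyy-MM-dd)
    // to its ISO-8601 week number.
    static int weekNumber(String isoDate) {
        LocalDate date = LocalDate.parse(isoDate);
        return date.get(WeekFields.ISO.weekOfWeekBasedYear());
    }

    public static void main(String[] args) {
        System.out.println(weekNumber("2019-01-07")); // ISO week 2 of 2019
    }
}
```

Wired into a connector action, the date string would arrive as an input in the POST payload and the week number would go back in the outputs object, exactly like the ping example.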

 

The API we’ve written so far is basic and mostly contains hardcoded values; we’d need to build the logic to get it to do anything. Let’s continue with the example we’ve been using so far: a network ping to determine if machines are alive or not. I’ve used this example before for an Innovation Suite action; let’s check how this would work for a Remote Server integration. But we need to leave this for the next article.

 

I am interested in your feedback. What do you think of the integration possibilities? Would you use this, does this article help you on your way? Maybe you have different or better use cases. I’d love to hear from you, leave a comment below.

 

Best Wishes,

Justin

 



In this piece of Catalog workflow we have two possible outcomes. The first (long route) will create a Change Request and an Incident, and the second (short route) will create just a Work Order. As you can see, we control the flow by using a number of exclusive gateways; in this example the user's initial question response determines the actual path. From the Remedy ITSM side, the status of each fulfillment application is routed back by a combination of Remedy workflow and an integration JAR file, which is triggered by the remoteaction.bat file on the AR Server host.

The Catalog Remedy workflow and JAR file (along with the manager approval chain) are implemented by the Catalog ITSM integration patch, which is available on our EPD site.

We use "Receive Task" here to listen for the fulfillment status and continue the flow (if required).

If the status isn't being sent back to the DWP Client, check the remoteaction.log and the SB:ServiceRequestStub form for errors on the AR Server.

To begin with, we create a number of input variables which will be used to map question responses to fields within each application. Another question is added here for workflow direction.

 

As you can see, the flow will only proceed on the branch defined if the highlighted condition is met. If any other answer is provided it will choose the other path.

 

 

Now that we've completed our design, we can start mapping our question responses, in addition to manually setting all of the other required fields. For the purposes of this example I'm only mapping the summary; however, we can of course create separate questions to map against all of the other fields (just make sure to select the correct question type).

Don't forget to use the Service Broker context variable for common fields!

We also need to set our Process Correlation ID to associate the request with the relevant fulfillment application in ITSM.

 

Once our Change Request has completed, we need to add "Get Change Request by Identifiers" to collect its identity.

 

 

To keep the end user (DWP client) informed of what's going on, we use the CRQ ID from above to pass back to the request along with a custom message.

Remember, the status values need to match up to those defined in the enumeration table of our documentation (Setting the service request status).

Now that the status has been mapped to the request, we use "Receive Task" to stop the workflow and wait for a signal from the Change Request.

 

On the other side of the branch, our exclusive gateway will only allow an Incident to be created once the status (completed) has been received.

 

 

Once again, we need to collect information about our generated Incident for use in subsequent activities.

 

 

Just like the Change Request, here we're feeding back the status and another custom message along with the actual number of the generated incident.

 

 

We can leave the receive task parameter name as it is and use it for the same purpose as our Change Request.

 

Once the Incident has been "Resolved", we close the request in the Digital Workplace client and pass our custom message.

You could also use the actual Incident status reason for resolution from "Get Incident By Identifiers" here.

 

 

Finally, let's add our input questions and tie it all together. As you can see, I've added simple conditional questions to map the summary depending on the flow route.

 

 

Now let's submit the request, choosing the "Long route" to generate our Change Request and Incident!

 

 

As you can see, the fulfillment activity has been mapped every step of the way!

 

 

In summary, with just a few simple workflow actions we have provided a dynamic view of fulfillment application activity!

 

That's all for now, but I'll be back with more workflow blogs soon...

 

Happy New Year to all!

Share:|

December 2018 list of top viewed Digital Workplace knowledge articles; this list will be updated regularly.

 

 

Article Number | Title
000097555 | Configuring MyIT/SmartIT with an AR Server Group
000138820 | MyIT Integration with RSSO not working
000095505 | mongoDB Support for Smart IT & MyIT FAQ
000101032 | MyIT- SmartIT consoles not accessible gives HTTP 404 ERROR
000130901 | Changing AR Server Reference in SmartIT / MyIT Application
000101167 | MyIT Tenant Configuration
000139210 | How to upgrade MyIT 3.3.02 Digital Workplace Basic to Advanced?
000100010 | MyIT-SmartIT Social Service not starting in windows server 2012 and 2012R2
000084341 | MyIT/SmartIT iOS App Download - Certificate Error
000112467 | Entries in SHR:SHRCAI_SocialBridge with error "error": HTTP Error 403 : {"error":"MOBILITY_ERROR_SESSION_EXPIRED"} MyIT 3.1
000107846 | MyIT Troubleshooting
000117311 | CRSF and Digital Workplace
000050661 | MyIT Login error
000084415 | MyIT: How to Enable Notification in MyIT Universal Client
000117395 | Data archiving from SmartIT \ MyIT form - SHR:SHRCAI_SocialBridge
000072088 | How to Migrate/Restore MyIT Mongo data to another server
000101716 | Kerberos issue with MyIT & SSO configuration on mobile device
000100150 | MyIT admin Console not accessible after an upgrade
000076445 | MyIT Service restart
000131058 | How to configure 'On-Behalf-Of' rule in MyIT
000102570 | How to change the SMARTIT_BUSINESS & SMARTIT_SYSTEM users password
000122490 | Failed to log into MyIT right after application of patch due to error "Unable to fetch API key"
000101541 | Cannot access the MyIT admin console even with MyIT Admin permissions
000100928 | Can we configure Internal URL In MyIT How To in MyIT Admin Console
000138250 | MyIT deep link url is not redirecting to SR if the user is not logged into MyIT

Share:|

Of course services are supposed to work correctly. After all, you designed them yourself, put a lot of thought into them. You did your homework and it shows: they look the way you want them to and they create work orders to set up new PCs, for example, or alert teams on infrastructure changes, initiate onboarding processes, even book your holiday requests. You might even have the questions translated into various languages, from Welsh to Chinese. But something has gone amiss. The work order records are created, but they always show up with the wrong status. You check again, it's definitely not you, everything looks right. What could it be? The whole service is down, and fixing it is time-critical. Tick tock.

 

Luckily for you, Catalog is pretty good at telling you what’s happening to those services your users are creating. I would maybe even go as far as to say that it’s really good at it. If your tool-of-choice is SRM you know that logs are difficult to avoid. The wrong status shows up in your request? Start with the combined SQL, API and Filter logs. Requests get stuck? Give me the combined SQL, API and Filter logs. The value for the Summary field not populated correctly? SQL, API and Filter logs will tell us why.

 

Is this any better in Catalog? As a matter of fact, it is. Logs have their place but it's not the place you start in Catalog. Not convinced? Let’s have a look. The core of your service is the workflow, this tells you how the questions provided by the end users are handled and, more importantly, how the backend process is fulfilled. This is where the work order records and incidents are created. Here’s my workflow:

 

myit-sb_Phaser bank alignment request (3).png

 

Looks pretty good, right? If my phaser banks are not aligned properly, my service will make sure the right team gets engaged straight away. Let’s create a request for the phaser relay on deck 47A:

 

 

dwpc blog 2 s 2.PNG

 

According to my workflow a few things should happen: a work order record should be created if the priority is deemed low or medium; if it's high or critical, a record should be submitted to the external senior command system for further evaluation. But something is going wrong here: when the work order is cancelled, the Catalog request does not reflect this and remains stuck at In Progress.

 

dwpc blog 2 s 3.PNG

 

How can you work out what's happening? You need to look at the flow and understand how the service requests are handled. But the first step doesn't have to be logs; this is handled differently in Catalog. First things first, let's have a look at the report with all the recent requests:

 

dwpc blog 2 s 4.PNG

 

Now go to the Actions menu and click on View Process. This will show you the workflow it’s currently processing. Here’s mine:

 

 

 

download.png

 

 

The activities with a dark grey border have been processed successfully; the activity with the green border is where we currently are. We can easily see that it informed the security maintenance team, then navigated the gateway to create the work order. This was also successful, but it's still stuck at Wait for Completion.

This now allows me to see what values are passed between the activities. If I click on Create Work Order I can see the output:

 

{
  "instanceId": "AGGAA5V0FLH5PAP9WE46P8ZAJTDPZ3",
  "requestId": "000000000000611",
  "workOrderId": "WO0000000001016"
}

 

It’s creating the work order record correctly and looking at it I can see it’s already cancelled, so I need to look at the workflow in more detail. The Wait for Completion construction is basically a loop which pauses the workflow until it meets the condition. So let’s have a look at the variable that’s used for this. I can do this via the tab with the gear icon:

 

dwpc blog 2 s 6.PNG

 

That should match my condition. Let’s go back to the original workflow and double check this:

 

dwpc blog 2 s 7.PNG

 

So there’s our problem, my condition used the American spelling (one L) whereas ITSM favours British spelling (two Ls). It never meets this condition so we’re stuck in the loop.

 

I realise this is a fairly straightforward root cause but I hope you appreciate it didn’t take us long to find it. Using the graphical overview of the workflow process we quickly established the flow and the point where it got stuck. From there onwards we were able to work out what the various variable values were which allowed us to concentrate on the reason why it’s not progressing.

 

But what if looking at the workflow isn't enough? What if this doesn't explain why it's not processing? What if you need to go deeper? There's always the option to go through the logs. Catalog's main log file is bundle.log, which is generated in the db folder. This will record the service requests from DWP with all the values, but it won't track the workflow process. So you can find the JSON code generated by the DWP request with all the values and the ID of the request, but not what's happening afterwards. It will, however, record any exception that might occur. We are planning to introduce some dedicated process logs in a future release which will allow us to track the workflow processes via logs. As soon as this is available I'll dedicate another blog article to debugging your workflow processes via logs, so stay tuned!
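If you need to triage that file quickly, a small script can pull out the exception lines. The log line format shown below is hypothetical; only the bundle.log file name comes from the product:

```python
def find_exceptions(log_lines):
    """Return (line number, text) for every line that looks like an error.
    The log format here is an assumption for illustration; only the
    bundle.log file name is from the product."""
    return [(n, line.strip()) for n, line in enumerate(log_lines, start=1)
            if "Exception" in line or "ERROR" in line]

sample = [
    "2019-01-10 10:01:02 INFO  Service request 000000000000611 received",
    "2019-01-10 10:01:03 ERROR com.example.WorkflowException: field mismatch",
]
for lineno, text in find_exceptions(sample):
    print(lineno, text)
```

In practice you would read the real file with `open("bundle.log")` and feed its lines to the same function.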

 

But at least I hope this gives you some idea of how to get started when there's a problem with the workflow processes. The visual approach allows you to quickly assess the point where it's stuck before you drill down into more detail. So the next time your services don't quite behave the way you expect them to, you know where to look. And if you're stuck? Well, you just have to give us a shout.

 

Best wishes,

Justin

Share:|

Out of the box, the Remedy connector presents a number of common fields to map to a fulfillment application, but what if your Remedy ITSM application has been customized? Well, this is where the other Remedy connector actions come into play. In this blog I'll cover how we can pass question responses to custom fields within BMC Remedy Incident Management. I'll also describe passing Incident notes back to the Request in the Digital Workplace client. The same principle applies to all BMC Remedy ITSM applications.

 

The overall workflow is designed by using a combination of elements as illustrated below.

 

 

 

First of all, add your custom fields into the "HPD:Help Desk" & "HPD:IncidentInterface" forms.

 

 

Now add the same fields into the "HPD:IncidentInterface_Create" form and update the push fields action of the "HPD:HII:CreateIncident_100`!" filter. I'm using a text and a drop-down type question here.

 

Don't forget to flush the Mid Tier cache and update the fields with the correct permissions!

Now we're ready to begin building our workflow. In this example I'm going to create four text questions and one context-type question; the context variable is used to pass common field values. This saves you the trouble of creating unnecessary questions for mapping.

 

 

 

The following table describes each action.

 

Action | Purpose
Create Incident with Identifiers | Create an incident with identifying variables for use in subsequent activities
Get Incident by Identifiers | Collect information about the generated Incident
Build Input Set | Build custom fields
Set Entry | Update the Remedy form with custom fields
Receive Task | Wait for a response
Set Service Request Status | Change the Request status in the Digital Workplace client

 

After we've moved our actions to the canvas, we begin mapping possible field values while determining the best flow for our questionnaire. In this example I'm mapping the "Describe the Incident" question (input variable) to the "Summary" field, the "Impact" question to the Incident's Impact field, and two questions to my custom fields. The Correlation ID is used by the workflow to route values from an external system and match them within the Catalog.

 

For the workflow to understand the relevant Incident details, we map the values from the data collected with "Create Incident with Identifiers".

 

With "Build Input Set", my key values represent the actual custom field names from the "HPD:Help Desk" form.
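Conceptually, the mapping looks like the following sketch; the question texts and field names here are made up for illustration, so substitute your own custom field names:

```python
def build_input_set(responses):
    """Map question responses to Remedy field names. The dictionary keys
    must match the custom field names on the "HPD:Help Desk" form; the
    field names and questions below are hypothetical."""
    return {
        "Custom Field 1": responses.get("Which building are you in?", ""),
        "Custom Field 2": responses.get("Preferred contact method", ""),
    }

print(build_input_set({"Which building are you in?": "HQ",
                       "Preferred contact method": "Email"}))
```

If a key doesn't match a real field name exactly, the subsequent "Set Entry" has nothing to push the value into, so it's worth double-checking the spelling against the form.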

 

 

Now that we have our Incident identity and custom fields, we're ready to map these values with "Set Entry" to the "HPD:IncidentInterface" form.

 

At this point the workflow has created the Incident along with the responses for the "Summary", "Impact" and custom field values. We use "Receive Task" here to block the flow until our "Resolved" status has been returned from the Incident itself. The exclusive gateway allows us to control the flow and continue when "Work Status" = Resolved.

 

Now, let's collect the new values from the generated Incident by using "Get Incident by Identifiers" again.

 

Optionally, we can pass the end user's Incident Notes values to the status reason field and close the Request.

There are many other useful fields to choose from in "Get Incident by Identifiers", such as "Status Reason", which would map the user's status value before they resolved the Incident.

 

Now I'm ready to build my questionnaire for the above questions.

 

 

Now let's see the complete flow in action by submitting the Request!

 

 

Watching the transitions, we can see the status moving to "Closed" and the Incident Management user's notes update being passed back to the Request.

 

 

 

Stay tuned as I'll be updating this space with more examples soon...

Share:|

BMC Digital Workplace 18.11 is now available. This release introduces even more enhancements to end user experience as well as additional catalog capabilities.

 

All of the following enhancements are available for BMC Digital Workplace Advanced. Enhancements marked with an asterisk (*) are also available for BMC Digital Workplace Basic.

 

Dependent services

 

Define prerequisite services by mapping dependencies.

Add collaborators to Requests

 

Define Collaborators to view, edit and delete a Request.

 

 

Assign actions to class attributes

 

Add conditions to restrict the number of assets associated with a Service Action.

Questionnaire enhancements

 

Return additional fields based on fixed or dynamic responses to a question.

 

Use action triggers to display responses based on Dynamic lookup questions.

Reassign Approvals*

 

Send requests to other business users for approval.

Clear multiple notifications*

 

Remove all notifications from both the Digital Workplace Client and Catalog.

 

 

Connector Development

 

Develop custom (on-premise) connectors to transfer data to and from external systems via REST calls.   

 

Compatibility

 

 

For version details of Remedy ITSM, Remedy AR System, HR Case Management, Client Management, Atrium CMDB, and Cloud Lifecycle Management, please refer to our Compatibility Matrix.

Next Steps

 

For a comprehensive summary of all enhancements, please refer to our Digital Workplace Advanced and Basic documentation.

Share:|

BMC Digital Workplace 18.08 is now available! In this release we're continuing to focus on new features to improve the end-user experience.

 

All of the following enhancements are available for BMC Digital Workplace Advanced. Enhancements marked with an asterisk (*) are also available for BMC Digital Workplace Basic.

 

 

Multi-user requests

 

Quickly submit a request on behalf of multiple users.

 

 

 

Alternate Approver*

 

Select an alternate approver from the Digital Workplace Client.

 

 

Choose time/date interval or edit current setting.

 

 

The approver assignee will also receive an in-app notification.

 

 

My Stuff impersonation

 

Launch asset actions on behalf of someone else.

 

 

 

Search logic*

 

Configure search logic for services and knowledge articles

 

 

 

 

Quick requests

 

Bypass the Request profile page by enabling "Quick request".

 

 

 

Save for later

 

Save individual Requests for later submission.

Checkout flow

 

Change "On behalf of" or Quantity in the checkout flow.

 

 

 

Workflow Designer

 

Manipulate text using "String Utils" and return multiple records with "Get Entry by Queries".

 

 

 

Search Catalog requests by submitted answers

 

We can now also search catalog requests based on question response.

 

 

 

 

Innovation Suite views

 

Cross launch into an Innovation Suite application.

 

 

User Profile Synchronization*

 

Job title is transparently synchronized into your DWP Profile.

 

 

Sub-catalog for business units

 

Add a Sub-catalog and assign internal supplier or Administrator roles.

 

 

 

Broadcasts*

 

Broadcasts from ITSM or Smart IT are now also visible within Digital Workplace.

 

 

Branding*

Search filter names

 

Rebrand results filter tabs.

 

New filters are immediately visible on search.

 

 

Android

 

Customize the client without coding ability.

 

 

 

 

Compatibility

 

For version details of Remedy ITSM, Remedy AR System, HR Case Management, Client Management, Atrium CMDB, and Cloud Lifecycle Management, please refer to our Compatibility Matrix.

 

 

Next Steps

 

For a comprehensive summary of all enhancements, please refer to our Digital Workplace Advanced and Basic documentation.

Share:|

We are pleased to announce that BMC Digital Workplace 18.05 is generally available and packed with new features!

 

 

*All features are applicable to Digital Workplace Advanced with the exception of the three levels of browse categories.

 

Simplified Architecture

 

  • Digital Workplace and SmartIT are now separate installers.
  • MongoDB and node.js (MyITSocial) have been removed.
  • Social data is now stored on the Digital Workplace RDBMS (Oracle & SQL).

Digital Workplace 18.05 is a required upgrade before any later releases: if you would like to move to 18.08 when it is released, you must first upgrade to DWP 18.05. This upgrade cannot be skipped.

 

Save Multiple Carts

 

Save Carts for a later checkout.

Large Subcategory view

 

Subcategories now include a larger view.

Third Tier Category

 

Browse through three levels of Categories.

Remedy with Smart IT Integration

 

Comments on Catalog Requests will be reflected in their associated fulfillment applications and vice versa. Digital Workplace Requests are also visible within Smart IT.

Asset Management Integration

 

Leverage BMC Asset Management to view and act on Assets (CIs).

 

Owned Assets are visible under the "MyStuff" tab.

Add Classes to Asset Groups

 

Create Asset Groups and retrieve data from BMC Asset Management.

Create Asset Action questions using data from Asset Management

 

Data retrieved from BMC Asset Management can also be used within Asset Actions.

As soon as the Action is launched, questions are populated from the relevant form within BMC Asset Management!

Amazon Web Services Catalog

 

 

Import services from the AWS Service Catalog.

 

Submit as a Request in the Digital Workplace Client.

 

 

Chatbot Requests

 

Enable Chatbot for a Request in the Catalog.

Automatically create the Request in Digital Workplace via Chatbot.

Search for submitted answers

 

Retrieve a Service Request based on question response.

Remove services from My Stuff

 

Remove unwanted services from the My Stuff tab.

Export and import Translated Services

 

 

Quickly localize a service by exporting and re-importing again.

View and edit imported Workflow

 

View imported workflow from BMC Service Request Management and add other activities.

 

View Workflow Relationship

Quickly check what questionnaires and services are related to workflow.

Rebranding

 

We now provide an xarchive binary file which you can open in the rebranding tool to create an IPA mobile application. You must sign this file using the rebranding tool and your digital certificate from Apple.

Compatibility

 

For version details of Remedy ITSM, Remedy AR System, Atrium Orchestrator, HR Case Management, Client Management, Atrium CMDB, and Cloud Lifecycle Management, please refer to the Digital Workplace Basic and Advanced Compatibility Matrix.

 

 

Next Steps

 

For a comprehensive summary of all enhancements please refer to our documentation.

 

Please comment with your favorite feature!

Share:|

JVisualVM is a powerful tool that provides a visual interface to deep, detailed information about local and remote Java applications while they are running on a Java Virtual Machine (JVM).

JVisualVM is used to track memory leaks, analyze heap data, monitor the garbage collector, and profile CPU usage. It also helps to improve application performance and ensure that memory usage is optimized. With features like thread analysis and heap dump analysis, it is very handy for solving run-time problems.

JVisualVM is free; you don't need to pay separately for it.

In this blog, we are going to look at how JVisualVM can be used to monitor the performance of the DWP/Smart IT Tomcat.

 

Recommendations on Java heap space:

https://docs.bmc.com/docs/smartit20/smart-it-configuration-settings-749669506.html

 

To remotely monitor the health of Tomcat, add the following JVM parameters to Tomcat:

 

-Dcom.sun.management.jmxremote

-Dcom.sun.management.jmxremote.port=8086

-Dcom.sun.management.jmxremote.ssl=false

-Dcom.sun.management.jmxremote.authenticate=false

-Datsso.log.level=SEVERE

-Djava.awt.headless=true

 

You can add the above settings as follows:

 

Windows: C:\Program Files\Apache Software Foundation\Tomcat7.0\bin\SmartITMyITTomcat8w

 

 

Unix:

File Name: /opt/apache/tomcat8.5/bin/setenv.sh

CATALINA_OPTS="-Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.port=8086 -Dcom.sun.management.jmxremote.ssl=false -Dcom.sun.management.jmxremote.authenticate=false -Djava.rmi.server.hostname=DWP/SmartITServerName"

 

Make sure you have a JDK installed on one of your client workstations or on your laptop; JVisualVM ships with the JDK.

 

 

Optional:

Launch JVisualVM

Open Tools > Plugins

Install the VisualVM-MBeans plugin

 

To connect to your SmartIT/DWP tomcat instance:

Open JVisualVM

Right-click Remote and select Add Remote Host

 

Enter MyIT/SmartIT Tomcat Server details

 

 

 

 

Once you are connected, you can see the heap chart showing the maximum heap space. You can monitor this chart during peak load. If you see an issue such as users being unable to log in or the application performing very slowly, take a heap dump from the chart; you can share it with BMC to identify what is consuming the Tomcat heap at runtime. This heap dump is crucial for root cause analysis of performance or application login issues.

 

There is also a chart for GC, which helps narrow down memory leak issues. The last chart, Threads, will help you determine whether the application has maxed out its threads in the event of a performance issue. You can take a thread dump along with the heap dump to identify the busy threads.

 

 

Things to check:

 

Our recommendation is to set the minimum and maximum heap sizes to the same value.

 

Windows: You can set the initial memory and maximum memory pool by opening SmartITMyITTomcat8w in the C:\Program Files\Apache Software Foundation\Tomcat7.0\bin folder.

 

Unix: You can set this in setenv.sh as part of JAVA_OPTS, or simply append it to the CATALINA_OPTS line above where we defined the remote monitoring parameters.

 

-Xms6144m

 

-Xmx6144m

 

 

 

If you install the VisualVM-MBeans plugin, you can review the values set for Tomcat.

 

For example: the maxThreads parameter set to 500 threads for the HTTP connector.

 

 

Benefits

 

There are many important features that VisualVM supports, such as:

1. A visual interface for local and remote Java applications running on a JVM.
2. Monitoring of an application's memory usage and runtime behavior.
3. Monitoring of application threads.
4. Analysis of memory allocations across different applications.
5. Thread dumps - very handy in case of deadlocks and race conditions.
6. Heap dumps - very handy in analyzing heap memory allocation.

 

 

I hope you find this blog useful. Please comment with your suggestions to improve it; general feedback is appreciated as well.

 

See more content like this on BMC Remedy products.

Share:|

We now call it DWP Catalog, but you probably know it as Service Broker. Like MyIT it has been renamed. But don’t worry, it’s still the same product you know and love. Right, now that we’ve got that clarified, let’s have a look at one of the big changes: what are we to do with Integration Service?

 

As a seasoned DWP Catalog veteran, I'm sure you've built your fair share of services. The meaty part of designing services is the workflow designer, which offers a flexible way of describing fulfillment processes. The graphic representation of the workflow allows us to construct and visualise rather complicated processes. Among the collection of activities, tasks and links there are, of course, the connectors.

 

Connectors make it possible to interact with external systems which opens DWP beyond just ITSM and gives us greater flexibility. That’s nothing new – the connectors have been part of the product since it was first introduced. But since 18.02 we’ve made it possible to use Integration Service. Question is, what are you going to do with it?

 

First things first: the existing built-in connectors won’t go away, they’ll stay right where they are and they will be maintained. All your existing workflows will continue to work and there’s nothing you need to change. But it’s now also possible to use Integration Service.

 

Confusing? Perhaps, let me try to clarify this for you: Integration Service is not part of DWP, it’s a separate cloud-based solution which allows for integration between different platforms and applications. The service connects different applications, residing on cloud and on-premise environments. For example, you can connect ITSM to RemedyForce, SalesForce to Gmail, etc. These connections run on the platform (we call them flows), you don’t have to define them from an application perspective, meaning they’re not built into ITSM or Remedyforce, they reside on the platform. I could dedicate the whole article to Integration Service, but I’ll save that for another time. If you want to know more about what the platform has to offer read my other blog.

 

Besides defining connections on the platform itself, it’s also possible to use Integration Service from different BMC applications. Innovation Suite is the best-known example of this where we use the Service to connect to external (on-premise) applications. The same principle is used for DWP Catalog: we can now use connectors running on Integration Service directly from the workflow designer.

 

Want to know how to do this? Let's have a look! Since Integration Service is not part of DWP, you need to have an account for an Integration Service instance. With DWP Advanced, both SaaS customers and on-premise customers that have the latest version installed and are current with maintenance have access to Integration Service.

 

Log into Integration Studio using the Integration Service URL. There are a few things we need to do here first. We use the concept of Sites, which provide the connection to a particular customer environment. Since Integration Service is cloud based, it needs a way to connect to systems and applications in local (on-premise) networks. This is what a Site provides: it's the gateway to your network. Log into Integration Studio, go to Sites, and set up a Production Site:

 

dwp1.png

 

The Controller is a client which you install on a Linux server in your local network. This sets up the secure gateway to Integration Service. It allows outbound-only traffic, so customers do not need to open inbound ports. This is a modern cloud-based approach to integration. The Get Controller option will help you install this. Once that's done you need to configure the connectors. Integration Service offers a variety of connectors which you can use. To do this, you need to configure the connector so it knows what site to use and what the server details are. To see what's available, go to Catalog and click on Connectors:

 

dwp2.png

 

These are the connectors which you can use. Let's have a look at the Remedy ITSM connector. Look it up in the list and check the details. You can see the Triggers and the Actions; since we're using the connector from DWP Catalog, we can only use the Actions. The Triggers only work if you build a Flow directly on Integration Service, so you can ignore these for now. Before we can use it we need to configure it. Click on Configuration and then Add Connection Configuration:

 

dwp3.png

 

This is where you tell the connectors how to connect to your Instance. You select your Site and specify the AR Server and Port. The Connector also needs to know what user account to use, you do this by adding an Account:

 

dwp4.png

 

The credentials are validated at this point, so we know for sure everything works okay. And once that's all done, we're ready to go. That’s everything we need to do on the Integration Service side. Let’s check what’s required on the DWP side.

 

DWP Catalog needs to know where it can find Integration Service, that's done via a configuration option in the Application Settings menu:

 

dwp5.png

 

You use the URL of the Integration Service instance; I'm using the developer instance for demonstration purposes. Once that is done you're ready to use the connectors in your workflow, so let's create a new Service and attach a new process. Notice the Palette on the left: there's a new activity called Connector. This links directly to the Integration Service connectors; you can use multiple Connector activities here and link them to different Integration Service connectors.

 

dwp6.png

 

To keep it simple, I'm creating a basic process which creates an Incident. I add the Connector activity and tell it which Integration Service connector I want it to use. The Configuration refers to the configuration we set up in Integration Service (site, AR Server, username, and so on). Next I choose the Action and fill in the parameters.

 

dwp7.png

 

And that's it! Save, publish and start using it. Obviously my example here is very basic, in a real-world scenario the process would look a lot more complex. But I hope this at least gives you an idea how to get started with this.

 

I know what you're thinking: that's all very well, but why would you want to use Integration Service at all? Why not just stick to the existing connectors? The main problem with the in-process connectors is that there's no SDK, so you can't develop your own connectors. If you needed a connector for a new external application or needed specific functionality, that wasn't possible prior to 18.02. But since we use an external platform, we have far greater freedom: you are free to write your own connectors and call them from DWP Catalog. Besides connecting to external systems, we are also working on an SDK which will allow you to extend DWP Catalog's integration capabilities; I'll write another blog article on this when it's available. With more people using Integration Service, we'll see more and more connectors extending the capabilities even further.

 

Just a few things to be aware of: Integration Service is optional for DWP Advanced customers and it's available for both on-premise customers and DWP Advanced Cloud. It's a cloud-based platform, so even if you're using DWP on premise the API calls will flow through the cloud. In terms of security, we make sure all data transmission is encrypted using TLS and nothing is stored during transit - the connectors are designed to be non-invasive.

 

And that’s all I have for now. I’d encourage you to consider using Integration Service when connecting with external applications. If you plan to write your own connectors, make sure to read my other blog article. And if you get stuck using the existing connectors in DWP Catalog? Well, you just have to let us know and we’ll do our very best to help you out. You can use the Integration Service community page, the DWP community page or raise a support case.

 

We'll have a more detailed look at DWP Catalog and Integration Service in a future article, so stay tuned!

 

Until next time,

Justin

 

Eager to get started? Here's how:

 

  • For existing DWPA customers: Create a support ticket and the support team will provide access.
  • For Professional Services: Register at developer.bmc.com and request access to a sandbox of Integration Service and Innovation Suite.
  • For new customers without DWPA: You can get a trial instance after approval from Product Management.

BMC Digital Workplace 18.02 is now available. In this release, we're introducing new features that continue our focus on improving the end user experience, along with Integration Service support.

 

We no longer use the 3.x versioning format; 18.02 signifies February 2018.

*With the exception of the healthcheck, all of these features are available with BMC Digital Workplace Advanced only*

 

My Stuff Actions

  • Submit an Asset Request from the Catalog.
  • Launch a Request on closure, or from an existing Service that you are following.
  • Select the workflow to be associated with an Action.
  • Add or remove Service Actions.

 

BMC Chatbot Integration

 

Request a Service using Skype for Business, BMC Chatbot Client, text message, or Slack!

 

 

Integration Service Support

 

Create your own connector using BMC Integration Service, or choose from any of the available connectors.

 

 

In this example we're using the Jira Connector to create a new issue.
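The screenshot isn't reproduced here, but under the hood a Jira integration ultimately drives Jira's REST API. As a rough, hypothetical sketch (the endpoint and field names follow Jira's public REST v2 API; the connector's actual internals are not shown in this post), creating an issue boils down to POSTing a payload like this:

```python
import json

def build_jira_issue(project_key, summary, description, issue_type="Task"):
    """Build the JSON payload Jira's REST API expects for issue creation.

    POSTing this to /rest/api/2/issue on a Jira server creates the issue;
    a connector would handle authentication and the HTTP call for you.
    """
    return {
        "fields": {
            "project": {"key": project_key},
            "summary": summary,
            "description": description,
            "issuetype": {"name": issue_type},
        }
    }

payload = build_jira_issue("DWP", "Laptop request", "Requested via DWP Catalog")
print(json.dumps(payload, indent=2))
```

The connector spares you from writing this plumbing yourself; the sketch is only meant to show what "create a new issue" means at the API level.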

 

 

Multi-language

 

We can now also localize Catalog Services.

 

Activate localization by using the toggle button.

 

Choose the language and you are ready to localize!
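Conceptually, a localized service is the same service carrying per-language display strings, with a fallback to the default language. A minimal sketch of that idea (purely illustrative; this is not the Catalog's actual data model):

```python
# Hypothetical per-language strings for one Catalog service.
translations = {
    "en": {"name": "New Laptop", "description": "Request a new laptop."},
    "fr": {"name": "Nouvel ordinateur portable",
           "description": "Demander un nouvel ordinateur portable."},
}

def localized(service, lang, default_lang="en"):
    # Fall back to the default language when no translation exists.
    return service.get(lang, service[default_lang])

print(localized(translations, "fr")["name"])  # Nouvel ordinateur portable
print(localized(translations, "de")["name"])  # falls back to English: New Laptop
```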

 

 

Connector Instances

 

Configure multiple instances of a Connector.

 

In this example we're connected to three different ITSM systems!
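Screenshot aside, the idea is that one connector type can be configured several times, with each instance pointing at a different endpoint under its own credentials, and workflows then pick an instance by name. A hypothetical sketch (instance names, URLs, and fields are invented for illustration):

```python
# Hypothetical configuration: three instances of one ITSM connector,
# each pointing at a different backend system.
connector_instances = [
    {"name": "ITSM-Production", "url": "https://itsm-prod.example.com"},
    {"name": "ITSM-Staging",    "url": "https://itsm-stage.example.com"},
    {"name": "ITSM-EMEA",       "url": "https://itsm-emea.example.com"},
]

def instance_by_name(instances, name):
    # A workflow step would reference its target instance by name.
    return next(i for i in instances if i["name"] == name)

print(instance_by_name(connector_instances, "ITSM-EMEA")["url"])
```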

 

 

Enhanced Request Report

 

Learn more about your Service Requests.

 

 

Enhanced Health Check

 

Additional checks for notifications, SMTP, and log level.

 

 

Compatibility

 

For supported versions of Remedy ITSM, Remedy AR System, Atrium Orchestrator, HR Case Management, Client Management, Atrium CMDB, and Cloud Lifecycle Management, refer to the Digital Workplace Basic and Advanced Compatibility Matrix.

 

 

Next Steps

 

For a comprehensive summary of all enhancements, please refer to our documentation here. To learn how to build your own connectors using BMC Integration Service, please refer to the following blog post:

 

Integration Service: Getting started with writing your own connectors

Please comment with your favorite feature!
