
In this post, we will discuss BMC Atrium Integrator, which is now also referred to as ‘Data Imports’ with the introduction of the new CMDB user interface offered by the latest CMDB versions.  Below is a reference link to a blog from November 2019 that also partially discusses the ‘Data Imports’ module along with CMDB Reconciliation.


Almost every administrator of BMC CMDB uses Atrium Integrator / Pentaho Spoon for data import tasks, and some have by now gained expertise in creating complex jobs with those tools. Through this blog, we will focus only on the troubleshooting aspects of Atrium Integrator.  Creating a job or a transformation, whether basic or complex, isn’t covered in this write-up. The following sub-topics are covered:


[1] Atrium Integrator Overview

[2] Various Atrium Integrator Components

[3] Carte Server – Start and Stop

[4] Atrium Integrator Configuration Forms (UDM:xxxxxx)

[5] Common Error Scenarios

[6] Atrium Integrator Logging


[1] Atrium Integrator Overview (AI Overview)


Atrium Integrator is an integration engine that helps you transfer data from external datastores to CMDB classes and AR System forms. It can pull data from a variety of input sources such as flat files, complex XML, JDBC, ODBC, JMS, native databases, web services, and others using connectors, and it provides the ability to clean and transform data before transferring it to CMDB classes or AR forms.


Break-up of the Atrium Integrator components by installer:


  • BMC Remedy AR Server installation installs:

    1. Atrium Integrator Carte Server

    2. Atrium Integrator Spoon

    3. ARSystem Adapters

    4. AR PDI Plugins


  • Atrium Integrator Server installs:

    1. Atrium Integrator Console

    2. CMDB Adapters


  • Atrium Integrator Spoon Client Installs:

       Remote Atrium Integrator Spoon


  Atrium Integrator uses Pentaho, an ETL tool that enables you to extract, transform, and load data. When you run a job from the AI console, it runs on the AI Carte server. You can also run jobs from the Atrium Integrator Spoon client on client machines/desktops, or directly on the AR server where Spoon is installed.


[2] Various Atrium Integrator Components


Please refer to the following blog post to learn more about the Data Import features in the new CMDB UI console.

Helix Support: Using new CMDB UI - Class Manager & Atrium Integrator


2.1 Atrium Integrator Spoon Client:


Atrium Integrator Spoon is a client-side, user-installable graphical user interface application used to design transformations and jobs. The Atrium Integrator Spoon client installer is available on the BMC EPD site as one of the installable client program options of the standard BMC Remedy AR System installation program.


For complete documentation on creating transformations and jobs using the Atrium Integrator Spoon client, refer to:

Spoon User Guide - Pentaho Data Integration - Pentaho Wiki


2.2 Atrium Integrator Spoon:


Atrium Integrator Spoon is installed along with the AR server installation. BMC provides limited support for Spoon: install and use only the Pentaho and Atrium Integrator Spoon version that is packaged with the BMC Remedy AR System installer.

Though Pentaho Spoon is supported on multiple platforms, BMC supports Spoon only on Windows.


There is a selected set of steps that BMC owns/supports in an Atrium Integrator transformation - Documentation for BMC CMDB 19.08 - BMC Documentation


For information about additional steps that you can add to your transformation, see the Pentaho documentation.

Spoon User Guide - Pentaho Data Integration - Pentaho Wiki


We support selected vendor databases, which include IBM DB2, MS SQL Server, Oracle, Sybase, and MySQL.


[3] Carte Server – Start and Stop


The Carte server configuration entry is in the armonitor configuration file. The path (and extension) of this file varies based on the operating system of the AR server.


(Windows) ARSystemServerInstallDir\Conf\armonitor.cfg

(Linux / UNIX) /etc/arsystem/serverName/armonitor.conf

[Screenshot: armonitor entry for Atrium Integrator]


3.1 Starting Carte Server:  A restart/start of the BMC Remedy AR Server service will start the Carte server, as long as the Carte line is not commented out (with a # symbol) in the armonitor configuration file.


3.2 Stopping Carte Server:  It is important that the Carte server is not killed abruptly while an AI job is running.  To stop the Carte server temporarily or permanently on a particular AR server, edit the armonitor configuration file, comment out the line for the Carte server, and save the file.  Then kill the existing Carte server process.
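On Linux, this comment-out step can be scripted. A minimal sketch, assuming the default /etc/arsystem location (serverName is a placeholder; the Windows file can be edited by hand as usual):

```shell
# Sketch: comment out the Carte line in armonitor.conf on Linux.
# The path (and serverName) are assumptions; back up the file first.
CONF="${CONF:-/etc/arsystem/serverName/armonitor.conf}"
cp "$CONF" "$CONF.bak"
# Prefix the Carte launch line with '#' unless it is already commented.
sed -i '/org\.pentaho\.di\.www\.Carte/ s/^\([^#]\)/#\1/' "$CONF"
```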


On a Windows server, scan the Task Manager for the highlighted line below to identify the Carte server process. After selecting the process, use ‘End Process’ to kill the Carte server, just like killing any other Windows process.


[Screenshot: Atrium Integrator process in Windows Task Manager]


On a Linux box, the same task can be achieved by first identifying the process ID of the Carte server, using the following command:

  ps -ef | grep 'diserver'


  [Screenshot: Atrium Integrator process on Linux]

This is followed by the kill command to terminate the process:

  kill -9 <process ID>
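The two steps above can be combined into a small script. A sketch, assuming the Carte command line contains "diserver" as in the ps example (it tries a graceful stop before resorting to kill -9):

```shell
# Sketch: locate and stop the Carte process on Linux.
# The bracketed pattern [d]iserver keeps grep from matching itself.
CARTE_PID=$(ps -ef | grep '[d]iserver' | awk '{print $2}' | head -n 1)
if [ -n "$CARTE_PID" ]; then
    kill "$CARTE_PID"                     # graceful stop first
    sleep 5
    # Escalate to SIGKILL only if the process is still running.
    kill -0 "$CARTE_PID" 2>/dev/null && kill -9 "$CARTE_PID"
fi
```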


[4] Atrium Integrator Configuration Forms (UDM:xxxxxx)

Note: Check all of these UDM forms when an Atrium Integrator job is not running as expected.


The tables below maintain the metadata of the Atrium Integrator tool.  From a troubleshooting perspective, it is important to know what data is stored in them.


  • UDM:Config
  • UDM:RAppPassword
  • UDM:ExecutionInstance
  • UDM:PermissionInfo


  1. UDM:Config: This form contains entries for all AR servers in the server group, one of which is the default Carte server that runs the AI jobs.  A few important things to note:


  • The server port assigned to Atrium Integrator is 20000
  • Atrium Integrator can run in an AR server group environment; it picks the primary server from the AR System Server Group Operation Ranking form as the ‘Default’ server
  • The secondary server is the server ranked 2nd for the Atrium Integrator operation in the AR System Server Group Operation Ranking form



NOTE:  For Atrium Integrator to run in an AR server group, it must be configured in the AR System Server Group Operation Ranking form before it appears in UDM:Config.


Configure UDM in AR Server Group:


Before configuring this form for a server group environment, you must rank the Atrium Integrator servers using the AR System Server Group Operation Ranking form. If you assign rank 1 to a server, that server becomes the primary server and runs the jobs. If the primary server fails, the secondary (failover) server, the one assigned rank 2, runs the jobs. If you do not rank the servers in a server group environment, jobs run on whichever server receives the request first.

[Screenshot: Server Group Ranking form]


2. UDM:RAppPassword:


This form stores the Remedy Application (RApp) Service password for a specific AR System server. The AR System server installer populates this regular form during AR System server installation. The ARInput, AROutput, and CMDB steps provided by BMC use this form to make connections to the AR System server.


If this password is changed in the Remedy AR server configuration, or you restore or migrate the database from one server to another, make sure that you update this form with the correct server names and their corresponding RApp passwords to avoid issues while running AI/Spoon jobs.


If the Atrium Integrator servers are configured to run in an AR server group environment, ensure that this form contains all possible AR server entries, including short names and FQDNs. Remove the entries that aren’t needed or that hold incorrect AR server names.



Any incorrect information in this form leads to failures in the Load steps used in Atrium Integrator jobs.


3. UDM:ExecutionInstance:


This regular form allows multiple instances of the same transformation to run. For every instance, the Atrium Integrator engine provides an Object ID; the combination of Object ID and transformation/job name is used as a unique key.


This form has one very important field named “Atrium Integrator Engine Server Name”, which holds the AI server name. In a server group environment, this field shows the primary server name. This field should hold the correct AR server name (specifically after a database restore).


Note: This form cannot be used to create a new execution instance manually; you can only use it to view the created execution instances.


For some AI job run issues, we suggest deleting the existing UDM execution entry for the specific transformation/job and then triggering the job/transformation again.


4. UDM:PermissionInfo:


This form contains the list of repository objects such as transformations, jobs, database connections, directories, slave servers, etc. By default, all jobs and transformations are assigned Public permission. Users with access to this form can amend the repository objects in it.


During execution of a transformation/job, a query is performed on this form; if the user has access to the form, the execution succeeds. If there is no permission, you may get errors.


[5] Common Error Scenarios


5.1 Atrium Integrator Job Schedules:


As we all know, an Atrium Integrator job can be scheduled to run at a specific time or interval.  A few important things to know about the job scheduler:

[a] Atrium Integrator job schedules are managed by the AR server and are stored in the "AR System Job" form.

[b] The AR server runs an escalation named "ASJ:ScheduleJob" to trigger the job at the configured time.  BMC recommends running this escalation on a specific pool to avoid job schedule issues.

[c] Do not schedule a Reconciliation job and an Atrium Integrator job at the same time, because both jobs could query or update the same data.


There can be several issues with Atrium Integrator job schedules, such as scheduled jobs not running at the scheduled time, or created/existing schedules that cannot be modified. Typical errors include:


BMCATRIUM_NGIE000502: Failed to update job schedule.


BMCATRIUM_NGIE000501: Failed to create job schedule.


Resolution Approach:

--Verify the "AR System Job" form entry to check that a valid job schedule exists with status Active.

--Verify a few important field values such as ‘Schedule Start time’, ‘Next Collection Time’, ‘Type’, etc.

--Try to run the job manually from the AI console. If that works, check whether the specific job has a UDM:ExecutionInstance entry; if yes, delete it and see whether the job triggers at the scheduled time.

--Verify that the escalation "ASJ:ScheduleJob" is enabled and running on a specific pool number.


   To run the escalation on a specific pool number, refer to the following guidelines:

    Remedy AR System Server - How to assign a specific Pool to an Escalation


   Additionally you can visit:


--Our AI expert Gustavo del Gerbo has already covered a few important tips and hotfixes to apply in the following post on AI job scheduling and UDM jobs getting stuck: DMT Schedule Jobs not working and/or failing inconsistently. Randomly schedules do not run. Memory leaks and high memory usage of AI. The post covers issues such as:


  1. Authentication error when it tries to create a record in the UDM:CartePending form.
  2. Unable to create webresult from XML. Error reading information from XML string: Premature end of file.
  3. 401 authentication error when it tries to publish the job to the Carte server.
  4. When you enable the Carte server in debug mode, you get an HTTP 400 error when the job is published to the Carte server, and the arjavaplugin log shows a socket reset connection error.
  5. Error when the DMT job console tries to query the UDM:ExecutionStatus form; authentication error for AR_ESCALATOR or Remedy Application Service.
  6. Issue where the first run of the scheduled job is successful and the second run gets stuck.
  7. SW00515492 - The AI Carte server has memory leaks, and the Pentaho ARDB plug-in also has memory leaks.
  8. SW00515494 - Pentaho Spoon job received java.util.ConcurrentModificationException.


Below is a list of good-to-have, cumulative Atrium Integrator hotfixes for different versions:


  • All versions: 8.1 all SPs, 9.0 all SPs, 9.1GA, 9.1 SP1 up to 9.1 SP2:

    - AI_9.1.00_29NOV2016_SW00518269_ALL


  • 9.1 GA (no service pack) or 9.1 SP1 version:

    - AI_9.1.00_12SEPT2016_SW00515492_SW00515494_ALL

    - AI_9.1.00_30DEC2016_SW00522054_ALL

    - FD_91_2016DEC14_SW00522122_ALL

    - Download and replace the kettle-engine.jar file from this blog to supersede the file from the hotfix package (the file from the package has some incompatibilities).


  • 9.1SP2:

    - AI_9.1.02_25JULY2017_SW00522054_All

    - AI_9.1.02_04MAY2018_SW00539413_All for Security (Version disclosure and other hacks)


  • 9.1SP3: For version 9.1.03 make sure to apply Patch 1 first and then apply the below Hot fixes:


    - AI_9.1.03.001_04JAN2018_SW00540959_ALL

    - AI_9.1.03.001_30JULY2018_SW00549235_ALL

    - AI_9.1.03.001_30Aug2018_SW00550547_All


  • 9.1SP4:

    - AI_9.1.04_18JAN2018_SW00543845 (use the attached file) for performance of the AI console (Flex old UI).

    - AI_9.1.04_29JUNE_2018_SW00548512_All for performance of Spoon.


  • 9.0 (all SP levels):

    -  AI_9.0.01_26OCT2016_SW00515994_SW00516220_SW00516447_ALL


  • 8.1 SP2:

     - AI_8.1.02_05OCT2016_SW00516445_SW00515997_ALL


5.2 Out Of Memory Errors when running Atrium Integrator jobs


Sometimes we see an out-of-memory error in the arcarte logs:


UnexpectedError: java.lang.OutOfMemoryError: GC overhead limit exceeded


Typical symptoms:

-- AI jobs take a long time to run/finish

-- The Carte server, or at times the AR server, crashes.


Resolution Approach:


--Enable -XX:+UseConcMarkSweepGC -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath="c:\temp\MyDump.hprof" for both Spoon and Carte.

--For Spoon, add these options in Spoon.bat (ARSERVERHOME\diserver\data-integration\Spoon.bat)


--You can also increase the Java heap size for Atrium Integrator and Spoon.

If you get out-of-memory errors while running a job from the AI console, increase the Java heap size (-Xmx) in the armonitor configuration file, for example:


"%BMC_JAVA_HOME%\java.exe" "-Xmx1024m" "-Djava.ext.dirs=C:\Program Files\Java\jre1.8.0_191\lib\ext;C:\Program Files\Java\jre1.8.0_191\lib;C:\Program Files\BMC Software\ARSystem\diserver\data-integration;C:\Program Files\BMC Software\ARSystem\diserver\data-integration\lib" "-Dorg.mortbay.util.URI.charset=UTF-8" "-DKETTLE_HOME=C:\Program Files\BMC Software\ARSystem\diserver" "-DKETTLE_REPOSITORY=" "-DKETTLE_USER=" "-DKETTLE_PASSWORD=" "-DKETTLE_PLUGIN_PACKAGES=" "-DKETTLE_LOG_SIZE_LIMIT=" "-DKETTLE_MAX_LOG_SIZE_IN_LINES=5000" "-DKETTLE_DISABLE_CONSOLE_LOGGING=Y" "-DKETTLE_COMPATIBILITY_MERGE_ROWS_USE_REFERENCE_STREAM_WHEN_IDENTICAL=Y" "-DKETTLE_LENIENT_STRING_TO_NUMBER_CONVERSION=Y" org.pentaho.di.www.Carte carteservername 20000 -i "C:\Program Files\BMC Software\ARSystem"
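Only the -Xmx value in that line needs to change. For example, to give the Carte server a 2 GB heap, edit the same armonitor entry (the rest of the line stays exactly as installed):

```
"%BMC_JAVA_HOME%\java.exe" "-Xmx2048m" ... org.pentaho.di.www.Carte carteservername 20000 ...
```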



When running a job from Spoon, increase the Java heap size for Spoon:



if "%PENTAHO_DI_JAVA_OPTIONS%"=="" set PENTAHO_DI_JAVA_OPTIONS="-Xms1024m" "-Xmx2048m" "-XX:MaxPermSize=256m"



5.3 How to troubleshoot UDM / Atrium Integrator job related issues.


5.4 Atrium integrator Spoon and DataBase connectivity:

AI/Pentaho/Spoon/Carte MSSQL Connectivity. A mystery of many forms.


5.5 Atrium Integrator Performance best practices:

Best practices that can improve the performance of Atrium Integrator job


[6] Atrium Integrator Logging:


6.1 How to set DEBUG logs for Carte Server:


1. Locate the Carte logging configuration file in the following directory (on Linux, it is in the equivalent directory under the install root):

C:\Program Files\BMC Software\ARSystem\diserver\data-integration\pwd\



2. Edit this file to change root logging level to Debug and save.  Here is the top part of the file where the change is made:

#Root logger log level


# Package logging level


6.2 Atrium Integrator adapter log files


In Windows systems, log files reside in <AR_system_installation_directory>/ARserver/db. In Unix systems, log files reside in <AR_system_installation_directory>/db. Carte server log files include:

  • arcarte.log
  • arcarte-stdout-<timestamp>.log
  • arcarte-stderr-<timestamp>.log
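When chasing the out-of-memory errors from section 5.2, a quick grep across these files shows which log captured the failure. A sketch, where LOGDIR is an assumed install path; substitute the db directory noted above for your system:

```shell
# Sketch: list which Carte log files contain OutOfMemoryError entries.
# LOGDIR is an assumption; point it at your AR System db directory.
LOGDIR="${LOGDIR:-/opt/bmc/ARSystem/db}"
grep -il "OutOfMemoryError" "$LOGDIR"/arcarte*.log 2>/dev/null
```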


The ARDBC plug-in log file is arjavaplugin.log. All ARDBC plug-in messages are recorded in this file.


By default, the log level is warn. If you want to log info or debug messages:

  1. Open <AR_system_installation_directory>/pluginsvr/log4j_pluginsvr.xml
  2. Search for logger com.bmc.arsys.pdi.
  3. Change the log level to info or debug.
  4. Restart the AR System server.


<logger name="com.bmc.arsys.pdi">
<level value="info" />
</logger>
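Steps 1-3 above can also be done from the command line. A sketch using sed, where the install path is an assumption (back up log4j_pluginsvr.xml first; the AR System server restart in step 4 is still needed):

```shell
# Sketch: switch the com.bmc.arsys.pdi logger to debug without a text editor.
# XML path is an assumption; adjust it to your AR System install directory.
XML="${XML:-/opt/bmc/ARSystem/pluginsvr/log4j_pluginsvr.xml}"
cp "$XML" "$XML.bak"
# Rewrite the <level> value inside the com.bmc.arsys.pdi logger block only.
sed -i '/com\.bmc\.arsys\.pdi/,/<\/logger>/ s/<level value="[^"]*"/<level value="debug"/' "$XML"
```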


6.3 Spoon Logging:


While running a job/transformation from Spoon, you can set the logging level to one of the available options (Nothing, Error, Minimal, Basic, Detailed, Debug, or Row Level).



Thank you for reading this blog!