8 Replies Latest reply on Dec 3, 2019 1:39 PM by Phillip Brockhaus

    Requirement to improve performance


      Hi all,

       

I am working on a customised Remedy application in Remedy 7.01. I have a form where, if you update a specified record, a filter pushes the same update to another form. The problem is that the second form often has a large number of matches (more than 1,000 records to update), and we sometimes get errors like:

"(ARERR 298) Too many filters processed during this operation." In certain cases the update takes more than an hour.
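Before changing the design, it can help to measure how many destination records a single update actually touches. A quick sketch, with hypothetical form and field names (AR System typically exposes each form as a database view named after the form, with underscores in place of spaces):

```sql
-- Hypothetical example: count how many destination-form records match
-- the key your Push Fields action uses. Replace "Second_Form" and
-- "Ticket_ID" with your actual form view and match field.
SELECT COUNT(*)
FROM   Second_Form
WHERE  Ticket_ID = 'INC000000001234';
```

If the count is routinely in the thousands, each matched record multiplies the filters fired on the destination form, which is what pushes the operation past the Maximum Filters limit.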

       

       

Existing workflow: a Push Fields filter action updates the second form.

       

Our suggested solution: use Direct SQL instead of Push Fields in filters.
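A minimal sketch of what such a Direct SQL filter action might look like, with hypothetical schema and field IDs (AR System stores each form's data in a T&lt;schemaId&gt; base table whose columns are named C&lt;fieldId&gt;; in a real filter you would substitute $field$ references rather than literals):

```sql
-- Hypothetical schema ID 1234 and field IDs; adjust to your forms.
UPDATE T1234                           -- destination form's base table
SET    C536870913 = 'New Value'        -- custom field being updated
WHERE  C536870914 = 'INC000000001234'; -- match field used by the Push Fields action
```

Keep in mind the caveat raised in this thread: a Direct SQL update bypasses AR Server processing, so no filters fire on the destination form and core fields such as Modified Date are not maintained unless you set them yourself.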

       

       

The problem: the AR Server will not process these updates, so they are not tracked (no workflow fires and no audit trail is kept).

       

       

Please let me know if there are any other solutions.

       

      Regards

      Sudeep

        • 1. Re: Requirement to improve performance
          Danny Kellett

          Hi,

           

          It comes down to one question: Do you need workflow to trigger on the second form when updating?

           

If so, then you have to use Push Fields and increase the "Maximum Filters for an Operation" setting. If you don't, then you could use Direct SQL.

           

          Kind regards

          Danny

          • 2. Re: Requirement to improve performance
            Bjorn Calabuig

            Hi Sudeep,

A probable reason for this error is that a lot of workflow fires "on modify" on your destination form.

             

In this scenario, I'd suggest two options:

             

            - Increase the number of filters the server can process:

            Remedy Administrator->Server Information->Advanced->Maximum Filters for an Operation

             

            - Identify all the workflow that fires "on modify" on your destination form:

            In my opinion, this is more feasible.

             

            I'd do the following:

            - Turn on server side filter logging just when modifying something on your source form

            - Save the changes on source form

            - Turn off logging

- Analyze the log file, searching for all the workflow that fires "on modify" on the destination form

             

            When done, create some workflow (sample):

            a) Add a zTmpfield on your source and destination forms

            b) When the modification happens on source form, mark zTmpfield to "Yes"

             

            c) Modify the filter that pushes modifications from source to destination form, including zTmpfield = "Yes"

             

            d) Review the log you analyzed to check that there's no filter "on modify" on destination form with Execution Order = 0

e) If there is such workflow, try to increase its Execution Order a little

             

            f) Create a filter that fires "on modify" on destination form, with Execution Order = 0 and Run If something like zTmpfield = "Yes"

            f.1) this filter

            -> first action: Call Guide

            -> call a guide that contains the minimum workflow necessary to do your updating (you may have identified the really important filters you need when analyzing the log)

            f.2) this filter

            -> second action: Goto 1000

             

We did something like this on ITSM 6.x: a customer used to "mark as duplicate" thousands of Incidents.

            What happened when solving the master incident? A server crash...

             

So we decided to improve this by increasing the number of filters (we didn't experience any improvement) and by designing the workflow described above (in tests, 3,000 duplicates moved to Solved in less than 40 seconds).

             

            Hope this helps,

            Björn.

            • 3. Re: Requirement to improve performance
              Carl Wilson

I agree with both suggestions, although if you are going to push a large number of records, Direct SQL would be preferable for performance, as it will occur in one transaction.

Increasing the Max Filters will still rely on nested filter execution to complete the transactions, causing server degradation.

              • 4. Re: Requirement to improve performance
                Bjorn Calabuig

                Hi Sudeep,

Did you find the answers to your questions useful?

                 

                Kind Regards,

                Björn.

                • 5. Re: Requirement to improve performance

                  Hi Sudeep,

                   

If the child tickets attached to the master ticket are not in Resolved status, and you then try to change the master ticket's status to Resolved, you will face this problem.

 

So first change the status of the child tickets to Resolved, and then the master ticket.

                   

                   

                  Regards,

                  Bhoomika

                  • 6. Re: Requirement to improve performance
                    LJ LongWing

                    Sudeep,

If you are planning on being at RUG this fall, I would suggest you look into the breakout session I am hosting, as it addresses this type of subject and some alternative ways to work around the scenario, especially if you need all of the workflow to fire on all of the records, which would make the existing suggestions not feasible.

                    • 7. Re: Requirement to improve performance
                      LJ LongWing

                      Abdul Moid Mohammed,

                      The topic I presented that year at RUG ended up turning into this plugin

                      Transaction Isolation – A Programming Legacy

It's not a tool with broad appeal, but if you need it, not having it is extremely painful.

                      • 8. Re: Requirement to improve performance
                        Phillip Brockhaus

Wow - a 7-year wait to reply, LJ!

                         

                        There is an odd way to do this 100% in Remedy with built-in functionality. But, it's weird.

                         

1. Make a new display-only field named Filter Trigger Keyword, with a unique field ID (the same field ID on every form), on all forms where you want workflow to fire in a new transaction instead of the transaction that initiates it.

                         

                        2. Make a new form that contains the following fields: Target Form Name, Target Request ID, Filter Trigger Keyword

                         

                        3. Create a new filter on the new form, On Submit - Delete the submitted record

                         

                        4. Create a second new filter on the new form, On Delete - Push fields - Push the Filter Trigger Keyword to the destination form where request ID matches

                         

                        5. Modify your filter / Make a new filter that runs on the destination form where Filter Trigger Keyword = whatever value you are passing in.

                         

                        The delete action spawns a new transaction.

                        (If desired for debug / troubleshooting, you can make a copy of the delete form for archive purposes and push to it as a second push fields action in step 4.)

                        Of course, now that Remedy isn't babysitting endless loops for you anymore, you'll have to make sure to be a little careful when using this.
