Why not make use of the APIs to retrieve that data into a dashboard? We do that and it works fairly well.
Hmm, I think our concern is that it would be a great deal of API hits. Because they want it in near real time and we have millions of executions, how do we accommodate both the time requirement and the volume with the API?
I think for some customers, this would work, though. The ones that need info on 90 jobs, not 9000, for instance.
We have such a requirement. Please provide more details on how to achieve this using the API.
Why don't you use Self Service to see the job status?
@Robert: could you please give us some tips on the API? I am also looking forward to it.
I've faced this kind of request more than once. One sure thing is that you can forget about BIM, Self Service, or any other add-on to provide you with this kind of service.
I'll second Robert's advice and encourage you to go the API way for your "real time" requests. Why? First, because more than anything the only thing you'll be doing is SELECT statements, and unless you plan on doing some very fancy stuff, the requests should not take that much time. I also encourage you not to use Oracle Data Guard to run your queries on the failover database unless you're ready to accept losing raw performance depending on the Data Guard mode you choose (it can become quite significant, trust me).
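To make the "API way" concrete, here's a minimal polling sketch. The `fetch_statuses` function is a placeholder you'd replace with a real call to your scheduler's status API (check your version's API docs for the actual endpoint); the point is the diffing logic, which forwards only jobs whose status changed since the last poll, so per-cycle downstream work is proportional to what actually changed rather than to the total number of executions:

```python
import time

# Placeholder -- replace with a real call to your scheduler's status API
# (e.g. via a requests.Session so the HTTP connection is reused).
def fetch_statuses():
    # Expected shape: {job_id: status_string}, e.g. {"JOB_A": "Ended OK"}
    raise NotImplementedError("wire this to your scheduler's API")

def diff_statuses(previous, current):
    """Return only the jobs whose status changed since the last poll."""
    return {job: status
            for job, status in current.items()
            if previous.get(job) != status}

def poll_loop(fetch, publish, interval=30):
    """Poll the API every `interval` seconds and push only the deltas."""
    previous = {}
    while True:
        current = fetch()
        changes = diff_statuses(previous, current)
        if changes:
            publish(changes)  # push deltas to the dashboard, not full snapshots
        previous = current
        time.sleep(interval)
```

With millions of executions a day, pushing deltas instead of full snapshots is usually the difference between a quiet poller and one that hammers the dashboard backend on every cycle.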
Laurie Kenley wrote:
The solution we've found is to have a database trigger that writes out to another table whenever a job status updates. Then we let the customers mine that data. This is frowned upon by BMC, so don't do this if you don't have a super duper Oracle wizard who can make it work. There is a minor performance hit involved but it's minuscule compared to the risk of letting users into CMR_AJF.
Using triggers to duplicate data in another table will overcomplicate everything and provide no benefit, unless you copy the data to a table in another database via a dblink, for example (and even that still has to be proven). As for your security concerns, I don't see what kind of risks you are expecting. As long as you don't let your "reporting hungry" users access the database directly (that would be a major risk), there's no potential issue.
Think of it like a reporting "service" and build it according to your customers' needs as a project, providing reporting models (in most situations this fulfills users' general requests) which can be triggered in many different ways. What matters most is that you always control the request flow.
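One simple way to "control the request flow" in such a reporting service is a shared TTL cache in front of the API call, so that N dashboard users reading at once still cost at most one upstream request per time window. This is a hedged sketch, not any vendor's API; `fetch` stands for whatever function actually hits the scheduler:

```python
import time

class CachedStatusService:
    """Serve many readers from one upstream request per TTL window.

    `fetch` is whatever function actually calls the scheduler API.
    However many users refresh the dashboard, the upstream is hit at
    most once per `ttl` seconds.
    """

    def __init__(self, fetch, ttl=30, clock=time.monotonic):
        self._fetch = fetch
        self._ttl = ttl
        self._clock = clock      # injectable clock makes this testable
        self._cached = None
        self._fetched_at = None

    def get(self):
        now = self._clock()
        if self._fetched_at is None or now - self._fetched_at >= self._ttl:
            self._cached = self._fetch()   # refresh once per window
            self._fetched_at = now
        return self._cached
```

Tune `ttl` to whatever your users accept as "near real time" (30 seconds is usually plenty for a status board), and the API hit count stops depending on how many people are watching.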
I've dealt with your situation many times without any problem at all. I understand your skepticism regarding the load it could generate on your database, but honestly, if you have over 1 million executions daily, that means you easily have over 100 million requests per day on the CMR_AJF table itself (if not far more, depending on whether you have 100,000 jobs running 10 times a day or 1,000 jobs running 1,000 times a day).
Thanks for all the advice, folks! I'm definitely going the API way.