Wednesday, June 29, 2011

Old Value - New Value

Auditing value changes in key fields is a common requirement. One can use the Audit Trail or resort to activities for maintaining a log. We faced the same requirement: capturing the Old Value - New Value pair in an activity whenever the status of a Service Request changes.

There are multiple scriptless solutions available for this, but the one that excited us was the introduction of the "GetOldFieldValue" method of buscomp. As per Bookshelf:

"This method can be called by using a script in the PreWriteRecord event to retrieve an old field value if needed. This method takes an input parameter, which must be a valid field name, and returns a string containing the old field value."

Primarily this method is intended for EBCs, but it can be used effectively on any buscomp. We created a workflow that reads the old and new values and then creates an activity.


1 - Fetch Old Value

Business Service Name: SIS OM PMT Service
Business Service Method: Invoke BC Method

Input Argument
BC Name: Service Request
MethodName: GetOldFieldValue
Param 0: Status

Output Argument
Type: Output Argument
Output Argument: Return Property Name
Property Name: OldStatus (process property created to store the old value)

2 - Fetch New Value

Business Service Name: Workflow Utilities
Business Service Method: Echo

Input Argument

Output Argument
Type: Business Component
Business Component Name: Service Request
Business Component Field: Status
Property Name: NewStatus

3 - Create Activity

Business Service Name: Inbound E-mail Database Operations
Business Service Method: InsertRecord

Input Argument
BusComp: Action
BusObj: Service Request
Field: SR Id: Object Id
Field: Description: "Status has been changed from " + [&OldStatus] + " to " + [&NewStatus]
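To see the mechanics the workflow relies on, here is a minimal sketch of the idea, not Siebel eScript: a toy buscomp that keeps the committed value alongside the pending write, the way GetOldFieldValue can surface the pre-write value during PreWriteRecord. Class and method names here are illustrative, not the Siebel API.

```python
class ToyBusComp:
    """Toy stand-in for a buscomp: committed values vs. pending writes."""

    def __init__(self, fields):
        self._committed = dict(fields)   # values as stored in the DB
        self._pending = {}               # uncommitted SetFieldValue calls

    def set_field_value(self, name, value):
        self._pending[name] = value

    def get_field_value(self, name):
        # A normal read sees the pending (new) value if one exists.
        return self._pending.get(name, self._committed[name])

    def get_old_field_value(self, name):
        # Like GetOldFieldValue: the value before the pending write.
        return self._committed[name]

    def write_record(self):
        # PreWriteRecord is the last chance to compare old vs. new.
        audit = None
        for name, new in self._pending.items():
            old = self.get_old_field_value(name)
            if old != new:
                audit = '%s has been changed from %s to %s' % (name, old, new)
        self._committed.update(self._pending)
        self._pending.clear()
        return audit


sr = ToyBusComp({'Status': 'Open'})
sr.set_field_value('Status', 'Closed')
print(sr.write_record())  # Status has been changed from Open to Closed
```

The point of the sketch is the window of time in which both values are visible: once the write commits, the old value is gone, which is why the workflow must fire at PreWriteRecord.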

Echo still remains one of the most mysterious yet powerful methods. This workflow can be invoked via a runtime event on the Service Request buscomp.
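The runtime event wiring can be sketched as below (Administration - Runtime Events); the workflow name in the context is illustrative:

```
Object Type: BusComp
Object Name: Service Request
Event: PreWriteRecord

Action (type BusService):
Business Service Name: Workflow Process Manager
Business Service Method: RunProcess
Business Service Context: "ProcessName", "SR Status Audit"
```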

Happy Auditing!!

Tuesday, June 21, 2011

50 Up!!

“Statistics are like a bikini. What they reveal is suggestive, but what they conceal is vital.” - Aaron Levenstein

As I write my 50th post, it feels good to be a quantum of the blogosphere. Here we will discuss one of my favorite topics: "Workflows". A lot has been suggested about activating multiple workflows, but the vital thing in this post is importing/exporting multiple workflows from one repository to another, then deploying and activating them in one go.

I have posted two workflows here. One is for exporting multiple workflows, and the other is to import and activate the exported workflows. One can download them from the links below.

1 - Multiple Workflow Export

Usage:

Multiple Workflow Export -

Import this workflow into your Tools environment. Modify the following input process properties as per your need.

ExportFolder: The directory where the exported XML files should be kept. For example, "D:\Workflow_Export"
Repository: Name of the repository from which workflows should be exported. For example, "Siebel Repository"
SearchSpec: Search criteria identifying which workflows should be exported. For example, [Process Name] LIKE 'SR*' AND [Status] = 'Completed'

Once these parameters are modified, simulate the workflow and check the export directory. It will house the XML export files of all workflows matching the searchspec. After simulation, check the "NumOffWF" property, which gives the count of workflows exported. The only glitch is that it doesn't list the names of workflows that failed during the export operation.
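Since the workflow doesn't name failed exports, one quick sanity check is to compare NumOffWF against the XML files actually written to the export folder. A small sketch (the path is whatever you set in ExportFolder):

```python
from pathlib import Path

def count_exported(folder):
    """Count workflow XML export files in the given folder."""
    return len(list(Path(folder).glob('*.xml')))

# If this differs from NumOffWF, some exports failed silently.
# count_exported(r'D:\Workflow_Export')
```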

Multiple Workflow Deploy and Activate -

Import this workflow into the Tools environment where you want to import the extracted files. Modify the following input process properties as per your need.

ImportFolder: The directory where the input files are housed. For example, "D:\Workflow_Export"
Repository: Name of the repository where the workflows should be created. For example, "Siebel Repository"
ProjectName: Name of the project with which all workflows will be associated. For example, "Service Request Workflow"
SearchSpec: This searchspec filters the workflows we want to deploy. For example, "[Process Name] LIKE 'SR*' AND [Status] = 'In Progress'"
ActivateSearchSpec: This searchspec is used to activate workflows. For example, [Process Name] LIKE 'SR *' AND [Status] = 'Completed'

Once these parameters are modified, simulate the workflow. The "NumOffWF" property after each step gives the number of workflows processed. The only issue is that it doesn't list the names of workflows that failed during the operation.

If you are looking only for multiple workflow activation with comprehensive error logging, one can visit Vinay's World. Once again, thanks to all for your support and comments.

Happy Deployment!!



Wednesday, June 15, 2011

AM for life

It can be a real pain for users to sit and assign requests manually when loads of requests are bombarding the system at regular intervals. For them, Assignment Manager (AM) comes as a boon. One can leverage AM features to ease the assignment overhead in either batch mode or dynamic mode. For those who still want to assign requests manually, an interactive mode is also available.

In this post I am going to primarily discuss dynamic assignment, the comparison methods available, and the components involved. Dynamic assignment assigns records automatically as users and server programs create or modify them. It makes use of database triggers to drive the assignment of requests.

Well-written rules help us win half the battle of assignment. The most interesting part of defining rules is the choice of comparison method, which defines how attributes are compared and candidates are evaluated. There are five methods available.

1 - Compare to Object

This comparison method is used to filter the records for assignment. It checks whether the record's attribute matches the value specified in the rule criteria.

Example: If any service request is created with a status of "Duplicate", it should always be assigned to a particular employee.


2 - Compare to Person/Compare to Organization

This comparison filters the potential Employee/Position/Organization. It checks whether the person's or organization's skills match the value specified in the rule criteria.

Example: If any Service Request is created with High priority, it should be assigned to an employee with a skill of type "Industry" and value "Services".



3 - Compare Object to Person/Compare Object to Organization

This comparison directly compares the record's attribute with the skills of the potential Employee/Position/Organization. No Value is defined in the criteria values; the record's own attribute value is matched against the candidate's skill.

Example: If a service request is created for a specific product, it should be assigned to an employee with a skill of type "Product" and a value matching the product associated with the SR.


There are more advanced configuration options available, involving scoring mechanisms and workload distribution, for more complex assignments.

Primary components involved in the Dynamic Assignment are:

1 - Generate Trigger
2 - Workflow Monitor Agent
3 - Assignment Manager

In order to run dynamic assignment, we should create an Assignment Policy for the object we need to assign. This policy should belong to a policy group. Create assignment rules with criteria and potential candidates, then release the rules.


Once the rules are released, run the "Generate Trigger" job. This will recreate the triggers used during assignment. Now run a WorkMon task for the assignment policy group and ensure the Assignment Manager component is up and running. Two key points to debug dynamic assignment are:

1 - S_ESCL_REQ table - Entries in this table indicate that the triggers were generated properly and are firing on record changes

2 - WorkMon Log - The WorkMon log comprehensively lists all the steps taken during the assignment of a request, including rule evaluation and candidate assignment.
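The trigger and monitor steps above can also be kicked off from the srvrmgr command line; the policy group name here is illustrative:

```
srvrmgr> start task for comp GenTrig with EXEC=TRUE
srvrmgr> start task for comp WorkMon with GroupName='SR Assignment Group'
```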

Alex has already given a terrific H.U.G. to AM. If used effectively, AM can really ease assignment tasks. We will discuss the other assignment modes in future posts.

Happy Assignment!!

Wednesday, June 8, 2011

JMS Messaging Configuration

"If everything else fails, try Java". Going by that notion, it was only natural for Siebel to support Java business services. Being inherently generous, Siebel also extends its support to sending and receiving messages via JMS servers. I found a document which beautifully explains configuring JMS messaging using AQ between Siebel and Oracle SOA Suite. It can be downloaded from the link below:


All said and done, the Siebel Bookshelf remains the mother of all documentation, but for a quick refresher or basic orientation this guide is worth a look.

Thursday, June 2, 2011

S_SRM_TASK_HIST Table

One of my admin friends once told me that the problem with troubleshooting is that trouble shoots back. In my ongoing quest to control multiple login/logout sessions in a directory-authentication environment, I landed in a situation where we needed a list of the naughty users working in multiple sessions. Instead of shooting back at my admin friend, I thought let's-hit-Google, and really found something worth sharing.

Issue: We set a flag on login and unset it on logout in order to determine availability for assignment purposes. In a multiple login/logout scenario this approach fails. We have to maintain the exact status of an employee for assignments to be correct.

Solution: This can be accomplished in two ways. The first option is a batch script which executes a command to list the sessions currently logged in on each application server.

list active sessions for comp PSCcObjMgr_enu show SV_NAME, CC_ALIAS, OM_LOGIN

The above command lists the active sessions. Based on them, we can update the flag (Available/Unavailable) in the DB for those logins using a stored procedure.

But being a KISS (Keep It Short and Simple) follower, an alternate and effective solution is to query the DB for active sessions and update the flag as required. The question is how to fetch active sessions from the DB itself. The "S_SRM_TASK_HIST" table comes as a life saver here. It lists all the sessions on the object manager along with start and end timestamps, so we can easily determine which sessions are active and which are dead.

select SRVR_START_TS, SRVR_END_TS, SRVR_COMP_NAME, SRVR_USER_NAME
from siebel.S_SRM_TASK_HIST
where SRVR_COMP_NAME in ('PSCcObjMgr_enu')
order by SRVR_USER_NAME

S_SRM_TASK_HIST stores task information for all the object managers. By default it retains history for 24 hours, but this is configurable.
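To isolate the multiple-session offenders, the idea in SQL terms is to group the object-manager tasks by user and count those without an end timestamp. Below is a runnable sketch against an in-memory SQLite copy of the relevant columns; the sample data is made up, and the assumption that an open session has a NULL SRVR_END_TS is ours, worth verifying in your environment:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute("""
    create table S_SRM_TASK_HIST (
        SRVR_USER_NAME text,
        SRVR_COMP_NAME text,
        SRVR_START_TS  text,
        SRVR_END_TS    text   -- assumed NULL while the session is active
    )""")
conn.executemany(
    'insert into S_SRM_TASK_HIST values (?, ?, ?, ?)',
    [('JDOE',   'PSCcObjMgr_enu', '2011-06-02 09:00', None),
     ('JDOE',   'PSCcObjMgr_enu', '2011-06-02 09:05', None),
     ('ASMITH', 'PSCcObjMgr_enu', '2011-06-02 08:00', '2011-06-02 08:30'),
     ('ASMITH', 'PSCcObjMgr_enu', '2011-06-02 09:10', None)])

# Users with more than one active session on the object manager.
rows = conn.execute("""
    select SRVR_USER_NAME, count(*)
    from S_SRM_TASK_HIST
    where SRVR_COMP_NAME = 'PSCcObjMgr_enu'
      and SRVR_END_TS is null
    group by SRVR_USER_NAME
    having count(*) > 1
""").fetchall()
print(rows)  # [('JDOE', 2)]
```

The same grouping query, pointed at siebel.S_SRM_TASK_HIST, would feed the flag update directly instead of going through srvrmgr output.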

There are not many ways to control multiple logins in Siebel, and each has its shortcomings. More ideas to control multiple logins are welcome here.

Happy Crunching!!