
Identifying if verbose is enabled in a workflow

It's the last day of 2013, so let's wind it up with a small topic. We usually enable verbose mode for a transformation in a workflow to check the actual data values being passed through that transformation. This is useful for debugging, but in the final code it is advisable not to use verbose mode, because the session log size grows considerably and consumes a lot of space. The simplest way to identify or cross-check whether verbose mode is enabled for any transformation is to export the workflow XML from Repository Manager and search for the keyword "verbose". If it has been used anywhere, we can easily identify for which transformation and in which session it was enabled. Hope this article helps you in understanding the above topic.
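As a quick illustration, here is a minimal Python sketch that scans an exported workflow XML for the keyword "verbose"; the file name wf_example.XML is only a placeholder, and the matching lines will point you to the transformation and session attributes where verbose tracing is set.

import re

def find_verbose_settings(xml_path):
    """Return (line_number, line_text) pairs that mention 'verbose'."""
    hits = []
    with open(xml_path, encoding="utf-8") as f:
        for num, line in enumerate(f, start=1):
            if re.search(r"verbose", line, re.IGNORECASE):
                hits.append((num, line.strip()))
    return hits

if __name__ == "__main__":
    for num, text in find_verbose_settings("wf_example.XML"):
        print(f"line {num}: {text}")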

Comparing Objects in Informatica

We might face a scenario where there is a difference between the PRODUCTION and SIT versions of the code, between any two environments, or between different folders in the same environment. In such cases we go for comparison of objects: we can compare mappings, sessions and workflows. In Designer, the "Compare" option is available under the "Mappings" menu. In Workflow Manager, the "Compare" option under the "Tasks & Workflows" menu compares tasks and workflows respectively. However, the easiest and probably the best practice is to use Repository Manager: the "Compare" option is available under its "Edit" menu. The advantage of using Repository Manager is that it compares all the objects in one go, i.e. workflow, session and mapping, reducing the effort of checking the mapping and session individually. Once we select the folder and the corresponding workflow, we can click Compare for checking out ...

Columns to Rows Conversion - Simple Scenario

Scenario: We have a source having 10 records with 3 columns and a target with 1 column. We have to design a mapping to load each column of each record into the target as a separate row.

Example:

Source:
Name    Father_Name    Mother_Name
RAGHU   RAM            RATHNA

Target:
Name
RAGHU
RAM
RATHNA

Solution: Create an Expression transformation exp_<Name> and pull the required ports from the Source Qualifier into it. Make 3 instances of the target and connect port1 from the Expression to target_1, port2 from the Expression to target_2 and port3 from the Expression to target_3.
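To make the data flow concrete, here is a minimal Python sketch of the same unpivot idea; the row values are the ones from the example above, and the inner loop plays the role of the three target instances fed from the Expression ports.

# Columns-to-rows (unpivot) sketch; the source rows are illustrative only.
source_rows = [
    {"Name": "RAGHU", "Father_Name": "RAM", "Mother_Name": "RATHNA"},
]

target_rows = []
for row in source_rows:
    # Each of the three source columns becomes its own target row,
    # mirroring the three target instances fed from the Expression ports.
    for column in ("Name", "Father_Name", "Mother_Name"):
        target_rows.append({"Name": row[column]})

print(target_rows)
# [{'Name': 'RAGHU'}, {'Name': 'RAM'}, {'Name': 'RATHNA'}]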

'Update as Insert' and 'Update Else Insert'

The naming convention here clearly describes the row type conversion (Insert/Update) applied to the records loaded into the target. We can see these options in the session properties under the Mapping tab.

'Update as Insert' - the Integration Service treats records that come in with an update row type as inserts.
'Update else Insert' - the Integration Service updates the record if it exists in the target, else inserts it into the target.

In simpler terms:
Insert --> flags the rows as inserts.
Update as Update --> flags the rows as updates.
Update as Insert --> flags updates as inserts.
Update else Insert --> checks for the row in the target; inserts if there is no record and updates if the record exists.

We always select the Insert property, and we can select only one of the three update properties; if you check one of them, the other properties are unchecked automatically.
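As a rough illustration of the 'Update else Insert' behaviour, here is a small Python sketch; the key column EMP_ID and the sample rows are assumptions made purely for this example, with a dict standing in for the target table.

# 'Update else Insert' idea: update the row if the key exists, else insert it.
target = {101: {"EMP_ID": 101, "NAME": "RAGHU"}}

incoming_updates = [
    {"EMP_ID": 101, "NAME": "RAGHU K"},   # exists in target -> update
    {"EMP_ID": 102, "NAME": "RAM"},       # not in target    -> insert
]

for row in incoming_updates:
    if row["EMP_ID"] in target:
        target[row["EMP_ID"]].update(row)   # update the existing record
    else:
        target[row["EMP_ID"]] = row         # insert the new record

print(target)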

Target Load Type - Normal or Bulk in Session Properties

We can see the Target load type (Normal or Bulk) property in the session under the Mapping tab, and we go for Bulk to improve session performance when loading a large amount of data. For a Bulk load the database's bulk load utility is used (SQL*Loader in the case of Oracle targets); it does not create database logs (redo log and undo log) and writes directly to the data files, so the transaction cannot be rolled back because there are no database logs. However, Bulk loading is very fast compared to Normal loading. If the target has a primary key, primary index or any other constraints, you can't use Bulk mode. We can see this property in the snapshot below.
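As a conceptual analogy only (PowerCenter bulk mode actually hands the load to the database's bulk utility), the Python/sqlite3 sketch below contrasts row-by-row inserts with a single batched call to show why pushing rows in bulk is faster; the table name tgt and the generated rows are assumptions for the demo.

import sqlite3

rows = [(i, f"NAME_{i}") for i in range(1, 1001)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tgt (id INTEGER, name TEXT)")

# "Normal"-style load: one INSERT statement issued per row.
for row in rows[:500]:
    conn.execute("INSERT INTO tgt VALUES (?, ?)", row)
conn.commit()

# "Bulk"-style load: the remaining rows handed to the database in one batch.
conn.executemany("INSERT INTO tgt VALUES (?, ?)", rows[500:])
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM tgt").fetchone()[0])  # prints 1000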

Loading records into multiple targets from single source

Scenario: Let's say we have one source with 20 records and 3 targets X, Y and Z. We have to load the 1st record into X, the 2nd record into Y, the 3rd record into Z, then the 4th record into X, the 5th into Y, the 6th into Z, and so on up to the last record.

Solution: We can implement this using Sequence Generator, Expression and Router transformations. Pull all the required fields from the Source Qualifier into an Expression transformation (apply business logic if required), create one extra port (let's say REC_NUM) and link the "Next Value" port of the Sequence Generator to REC_NUM in the Expression. From the Expression, pull the required fields, including REC_NUM, into a Router transformation and create 3 groups in the Router, as we have 3 targets (m groups for m targets). Use the conditions below and connect each group to its respective target.

Group 1 -- MOD(REC_NUM,3)=1 -- Connect to X
Group 2 -- MOD(REC_NUM,3)=2 -- Connect to Y
Group 3 -- MOD(REC_NUM,3)=0 -- Connect to Z
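The same round-robin distribution can be sketched in a few lines of Python; the record values are illustrative, and the enumerate counter starting at 1 plays the role of the Sequence Generator's Next Value.

# MOD-based routing of 20 records across 3 targets, as in the Router groups above.
records = [f"record_{i}" for i in range(1, 21)]   # 20 source records

targets = {"X": [], "Y": [], "Z": []}

for rec_num, record in enumerate(records, start=1):
    remainder = rec_num % 3           # same as MOD(REC_NUM, 3) in the Router
    if remainder == 1:
        targets["X"].append(record)   # Group 1
    elif remainder == 2:
        targets["Y"].append(record)   # Group 2
    else:
        targets["Z"].append(record)   # Group 3 (remainder 0)

for name, rows in targets.items():
    print(name, rows)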

Performance tuning with Re-cache in Lookup transformation

Suppose you have a requirement where you have to look up the same table in more than one mapping, and the corresponding sessions are in the same workflow. To improve the performance of that workflow, you can check the re-cache option in the mapping where the lookup is executed for the first time. If you check the re-cache option, the lookup builds its cache in the mapping where it is used first, and the other dependent mappings reuse the cache created by that first mapping. With this reuse of the lookup cache, performance improves. To do this, we should set dependencies between the sessions in the workflow and check re-cache in the first session. I will explain with an example. I have the below mappings:

1) m_PRESTGtoXREF_mbr_xref
2) m_PRESTGtoXREF_grp_contc_xref
3) m_PRESTGtoRARI_sbgrp_contc

And I have a lookup on table HCR3R_GRP_XREF and a...
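To make the build-once, reuse-later idea concrete, here is a rough Python sketch; the lookup key GRP_ID, the sample rows and the session names prefixed with s_ are assumptions for the illustration, and a plain dict stands in for the lookup cache.

# First session rebuilds the shared lookup cache; later sessions reuse it.
lookup_cache = None

def run_session(name, lookup_rows, rebuild_cache=False):
    """Simulate a session that needs the HCR3R_GRP_XREF lookup."""
    global lookup_cache
    if rebuild_cache or lookup_cache is None:
        # First session: build (re-cache) the lookup from the source table.
        lookup_cache = {row["GRP_ID"]: row for row in lookup_rows}
        print(f"{name}: built lookup cache with {len(lookup_cache)} rows")
    else:
        # Later sessions: reuse the cache built by the first session.
        print(f"{name}: reused existing lookup cache")

table_rows = [{"GRP_ID": 1, "GRP_NAME": "A"}, {"GRP_ID": 2, "GRP_NAME": "B"}]

run_session("s_m_PRESTGtoXREF_mbr_xref", table_rows, rebuild_cache=True)
run_session("s_m_PRESTGtoXREF_grp_contc_xref", table_rows)
run_session("s_m_PRESTGtoRARI_sbgrp_contc", table_rows)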