Thursday, February 24, 2011

Informatica Dynamic Lookup Cache

A lookup cache does not change once built. But what if the data in the underlying lookup table changes after the cache is created? Is there a way to keep the cache up-to-date even as the underlying table changes?

Dynamic Lookup Cache

Let's think about this scenario. You are loading your target table through a mapping. Inside the mapping you have a Lookup, and in that Lookup you are actually looking up the same target table you are loading. You may ask me, "So? What's the big deal? We all do it quite often...". And yes, you are right.
There is no "big deal" because Informatica (generally) caches the lookup table at the very beginning of the mapping, so whatever records get inserted into the target table through the mapping have no effect on the lookup cache. The lookup will still hold the previously cached data, even while the underlying target table is changing.

But what if you want your lookup cache to be updated as and when the target table changes? What if you want your lookup cache to always show the exact snapshot of the data in your target table at that point in time? Clearly this requirement will not be fulfilled if you use a static cache. You will need a dynamic cache to handle this.


But why on earth would someone need a dynamic cache?

To understand this, let's next understand a static cache scenario.


STATIC CACHE SCENARIO

Let's suppose you run a retail business and maintain all your customer information in a customer master table (an RDBMS table). Every night, all the customers from your customer master table are loaded into a Customer Dimension table in your data warehouse. Your source customer table is a transaction system table, probably in third normal form, and does not store history. Meaning, if a customer changes his address, the old address is overwritten with the new address. But your data warehouse table stores the history (maybe in the form of SCD Type-II). There is a mapping that loads your data warehouse table from the source table. Typically you do a lookup on the target (static cache) and check every incoming customer record to determine whether the customer already exists in the target. If the customer does not already exist in the target, you conclude the customer is new and INSERT the record, whereas if the customer already exists, you may want to update the target record with this new record (if the record has changed). This is illustrated below. You don't need a dynamic lookup cache for this.


Static Cache


DYNAMIC LOOKUP CACHE SCENARIO

Notice that in the previous example I mentioned that your source table is an RDBMS table. This ensures that your source table does not have any duplicate records. What if you had a flat file as the source, with many duplicate records? Would the scenario be the same? No; see the illustration below.


Image: A scenario illustrating the use of a dynamic lookup cache

Here are some more examples of when you may consider using a dynamic lookup:

  • Updating a master customer table with both new and updated customer information as shown above
  • Loading data into a slowly changing dimension table and a fact table at the same time. Remember, you typically look up the dimension while loading the fact, so you load the dimension table before the fact table. But using a dynamic lookup, you can load both simultaneously.
  • Loading data from a file with many duplicate records, and eliminating duplicates in the target by updating the duplicate rows, i.e. keeping the most recent row or the initial row
  • Loading the same data from multiple sources using a single mapping. Just consider the previous retail business example. If you have more than one shop and Linda has visited two of your shops for the first time, the customer record for Linda will come in twice during the same load.

So, how does a dynamic lookup work?

When the Integration Service reads a row from the source, it updates the lookup cache by performing one of the following actions:

Inserts the row into the cache: If the incoming row is not in the cache, the Integration Service inserts the row in the cache based on input ports or generated Sequence-ID. The Integration Service flags the row as insert.

Updates the row in the cache: If the row exists in the cache, the Integration Service updates the row in the cache based on the input ports. The Integration Service flags the row as update.

Makes no change to the cache: The Integration Service flags the row as unchanged when:

  • the row exists in the cache and the lookup is configured to insert new rows only, or
  • the row is not in the cache and the lookup is configured to update existing rows only, or
  • the row is in the cache, but based on the lookup condition, nothing changes.

Notice that the Integration Service actually flags the rows based on the above three conditions. This is a great thing, because if you know the flag you can reroute the row to achieve different logic. This flag port is called "NewLookupRow", and using it the rows can be routed for insert, for update, or to do nothing. You just need a Router or Filter transformation followed by an Update Strategy.

Oh, I forgot to tell you the actual values you can expect in the NewLookupRow port:

  • 0 – Integration Service does not update or insert the row in the cache.
  • 1 – Integration Service inserts the row into the cache.
  • 2 – Integration Service updates the row in the cache.

When the Integration Service reads a row, it changes the lookup cache depending on the results of the lookup query and the Lookup transformation properties you define. It assigns the value 0, 1, or 2 to the NewLookupRow port to indicate if it inserts or updates the row in the cache, or makes no change.
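The flagging behavior above can be sketched in plain Python (this is an illustration of the logic, not Informatica code; the class and key names are mine, and the 0/1/2 constants mirror the NewLookupRow values described above):

```python
# Minimal sketch of how a dynamic lookup cache flags each source row.
NO_CHANGE, INSERT, UPDATE = 0, 1, 2   # NewLookupRow values

class DynamicLookupCache:
    def __init__(self):
        self.cache = {}  # lookup key -> cached row data

    def process(self, key, row):
        """Return the NewLookupRow flag for one incoming source row."""
        if key not in self.cache:
            self.cache[key] = row      # row not in cache: insert it
            return INSERT              # NewLookupRow = 1
        if self.cache[key] != row:
            self.cache[key] = row      # row exists but differs: update it
            return UPDATE              # NewLookupRow = 2
        return NO_CHANGE               # NewLookupRow = 0

cache = DynamicLookupCache()
print(cache.process("CUST001", {"name": "Linda", "city": "Boston"}))  # 1 (insert)
print(cache.process("CUST001", {"name": "Linda", "city": "Boston"}))  # 0 (no change)
print(cache.process("CUST001", {"name": "Linda", "city": "Austin"}))  # 2 (update)
```

A downstream Router transformation would then send rows with flag 1 to an insert group and flag 2 to an update group, as described next.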

Example of Dynamic Lookup Implementation

OK, I have designed a mapping to show a dynamic lookup implementation, and I have taken a full screenshot of the mapping. Since the screenshot is slightly big, I link it below.



And here are the screenshots of the Lookup. The Lookup ports screenshot first,



And here is the Dynamic Lookup Properties tab



If you check the mapping screenshot, you will see I have used a Router to reroute the INSERT group and the UPDATE group. The Router screenshot is also given below. New records are routed to the INSERT group and existing records are routed to the UPDATE group.

Router Transformation
Router Transformation Groups Tab


About the Sequence-ID

While using a dynamic lookup cache, we must associate each lookup/output port with an input/output port or a sequence ID. The Integration Service uses the data in the associated port to insert or update rows in the lookup cache. The Designer associates the input/output ports with the lookup/output ports used in the lookup condition.

When we select Sequence-ID in the Associated Port column, the Integration Service generates a sequence ID for each row it inserts into the lookup cache.
When the Integration Service creates the dynamic lookup cache, it tracks the range of values in the cache associated with any port using a sequence ID. When inserting a new row of data into the cache, it generates a key for the port by incrementing the greatest existing sequence ID value by one.

When the Integration Service reaches the maximum number for a generated sequence ID, it starts over at one and increments each sequence ID by one until it reaches the smallest existing value minus one. If the Integration Service runs out of unique sequence ID numbers, the session fails.
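The generation and wrap-around behavior can be approximated with a small sketch (a rough model of the rules described above, not the actual Integration Service algorithm; the function name is mine):

```python
# Hypothetical sketch of Sequence-ID generation for a dynamic cache:
# a new row gets (greatest existing ID + 1); past the maximum, numbering
# wraps to 1 and reuses gaps; with no unique IDs left, the session fails.
def next_sequence_id(existing_ids, max_id):
    if not existing_ids:
        return 1
    candidate = max(existing_ids) + 1
    if candidate > max_id:               # reached the maximum: start over at 1
        candidate = 1
    while candidate in existing_ids:     # skip IDs already in use
        candidate += 1
        if candidate > max_id:
            raise RuntimeError("No unique sequence IDs left: session fails")
    return candidate

print(next_sequence_id({1, 2, 3}, max_id=10))  # 4 (greatest + 1)
print(next_sequence_id({2, 3}, max_id=3))      # 1 (wrapped around)
```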


About the Dynamic Lookup Output Port

The lookup/output port value depends on whether we choose to output old or new values when the Integration Service updates a row:

  • Output old values on update: The Integration Service outputs the value that existed in the cache before it updated the row.
  • Output new values on update: The Integration Service outputs the updated value that it writes to the cache. The lookup/output port value matches the input/output port value.

Note: We can configure whether to output old or new values using the Output Old Value On Update transformation property.


Handling NULL in dynamic LookUp

If the input value is NULL and we select the Ignore Null Inputs for Update property for the associated input port, the input value does not equal the lookup value or the value out of the input/output port. When you select the Ignore Null property, the lookup cache and the target table might become unsynchronized if you pass null values to the target, so you must verify that you do not pass null values to the target.

When you update a dynamic lookup cache and target table, the source data might contain some null values. The Integration Service can handle the null values in the following ways:

  • Insert null values: The Integration Service uses the null values from the source and updates the lookup cache and target table using all values from the source.

  • Ignore Null Inputs for Update: The Integration Service ignores the null values in the source and updates the lookup cache and target table using only the non-null values from the source.

If we know the source data contains null values, and we do not want the Integration Service to update the lookup cache or target with null values, then we need to check the Ignore Null property for the corresponding lookup/output port.

When we choose to ignore NULLs, we must verify that we output the same values to the target that the Integration Service writes to the lookup cache. To keep the lookup cache and the target table from becoming unsynchronized, we configure the mapping based on the value we want the Integration Service to output from the lookup/output ports when it updates a row in the cache:

  • New values: Connect only lookup/output ports from the Lookup transformation to the target.
  • Old values: Add an Expression transformation after the Lookup transformation and before the Filter or Router transformation. Add output ports in the Expression transformation for each port in the target table and create expressions to ensure that we do not output null input values to the target.
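The effect of ignoring NULLs on an update can be sketched as follows (an illustration only; `update_cached_row` is a hypothetical helper, and Python's `None` stands in for NULL):

```python
# Sketch of the "Ignore Null Inputs for Update" behavior: when updating a
# cached row, null (None) source values are skipped, so they never
# overwrite existing non-null data in the cache.
def update_cached_row(cached, incoming, ignore_nulls=True):
    updated = dict(cached)
    for port, value in incoming.items():
        if ignore_nulls and value is None:
            continue                     # keep the existing cached value
        updated[port] = value
    return updated

old = {"name": "Linda", "city": "Boston", "phone": "555-0100"}
new = {"name": "Linda", "city": "Austin", "phone": None}
print(update_cached_row(old, new))
# {'name': 'Linda', 'city': 'Austin', 'phone': '555-0100'}
```

With `ignore_nulls=False` the NULL phone would overwrite the cached value, which is exactly the desynchronization risk the property is meant to avoid.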

When we run a session that uses a dynamic lookup cache, the Integration Service compares the values in all lookup ports with the values in their associated input ports by default.

It compares the values to determine whether or not to update the row in the lookup cache. When a value in an input port differs from the value in the lookup port, the Integration Service updates the row in the cache.


But what if we don't want to compare all ports? We can choose the ports we want the Integration Service to ignore when it compares ports. The Designer only enables this property for lookup/output ports when the port is not used in the lookup condition. We can improve performance by ignoring some ports during comparison.


We might want to do this when the source data includes a column that indicates whether or not the row contains data we need to update. Select the Ignore in Comparison property for all lookup ports except the port that indicates whether or not to update the row in the cache and target table.

Note: We must configure the Lookup transformation to compare at least one port; the Integration Service fails the session if we ignore all ports.
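The comparison logic with ignored ports can be sketched like this (illustrative only; the port and row names are mine, not from the mapping above):

```python
# Sketch of "Ignore in Comparison": only the ports not flagged as ignored
# are compared to decide whether the cached row needs an update.
def needs_update(cached, incoming, ignore_ports=()):
    compare_ports = [p for p in incoming if p not in ignore_ports]
    if not compare_ports:
        # mirrors the session failing when all ports are ignored
        raise ValueError("At least one port must be compared")
    return any(cached.get(p) != incoming[p] for p in compare_ports)

cached   = {"name": "Linda", "city": "Boston", "load_ts": "2011-02-23"}
incoming = {"name": "Linda", "city": "Boston", "load_ts": "2011-02-24"}
print(needs_update(cached, incoming, ignore_ports=("load_ts",)))  # False
print(needs_update(cached, incoming))                             # True
```

Ignoring a frequently changing audit column like `load_ts` avoids needless cache updates and improves performance, which is the point of the property.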
