Blogs | Hexaware

Latest Post
Fact transactions that arrive earlier than their dimension (master) records are not bad data; such fact records need to be handled in our ETL process as a special case. Situations where facts come in before dimensions occur quite commonly, as when a customer opens a bank account and his transactions start flowing into the data warehouse immediately, but the customer id creation process in the Customer Reconciliation System gets delayed, so the customer data reaches the data warehouse only a few days later. How we handle this scenario differs based on the business…
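A common way to handle such late-arriving facts is the "inferred member" technique: insert a placeholder dimension row so the fact can load immediately, then enrich that row when the master record finally arrives. The sketch below illustrates the idea in Python with in-memory dicts; all table and column names are invented for illustration, not taken from the post.

```python
# Sketch of the "inferred member" technique for facts that arrive
# before their dimension rows. dim_customer maps natural key -> row.

def load_fact(fact, dim_customer, next_surrogate_key):
    """Return the surrogate key for the fact's customer, creating an
    inferred placeholder dimension row if the customer is unknown."""
    nk = fact["customer_id"]  # natural key from the source system
    if nk not in dim_customer:
        # Customer master has not arrived yet: insert a stub row and
        # flag it so a later dimension load can overwrite it.
        dim_customer[nk] = {
            "surrogate_key": next_surrogate_key(),
            "name": "UNKNOWN",
            "inferred": True,
        }
    return dim_customer[nk]["surrogate_key"]

def load_dimension(master, dim_customer, next_surrogate_key):
    """Apply a (possibly late) customer master record, keeping the
    surrogate key already handed out to any placeholder row."""
    nk = master["customer_id"]
    row = dim_customer.get(nk)
    if row is not None and row["inferred"]:
        row.update(name=master["name"], inferred=False)  # enrich stub
    elif row is None:
        dim_customer[nk] = {
            "surrogate_key": next_surrogate_key(),
            "name": master["name"],
            "inferred": False,
        }
```

Because the placeholder keeps its surrogate key, facts loaded early never need to be re-keyed once the dimension catches up.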
Posted by Muneeswara C Pandian
Comments (0)
June 29th, 2007
The story is a bit different in a non-production environment. Depending on your organization's change control policies, developers might have Data Mover access in non-production. In this case, we might want to prevent the OPRID from exploiting Data Mover access to perform undesired DDL on the database. To tackle this problem, you can create a DDL trigger as shown below. This will ensure that no DDL operations are performed from Data Mover.

CREATE OR REPLACE TRIGGER DATAMOVER_PREVENT_DDL
BEFORE CREATE OR ALTER OR DROP OR GRANT OR RENAME OR REVOKE ON SCHEMA
DECLARE
  VAR_DDLEVENT VARCHAR2(25);
  VAR_OBJ_NAME VARCHAR2(128);
  V_AUDIT_OPRID VARCHAR2(32);
BEGIN
  DBMS_APPLICATION_INFO.READ_CLIENT_INFO(V_AUDIT_OPRID);…
Posted by Nitin Pai
Comments (0)
June 25th, 2007
Circa 2015, 8 years from now: the CEO of a multinational organization enters the corner office overlooking the busy city down below. On flicking a switch near the seat, the wall in front is illuminated with a colorful dashboard, known in CEO circles then as the Rainbow Chart. The Rainbow Chart is the CEO's lifeline, as it gives a snapshot of the current business position (the left portion) and also figures/colors that serve as a premonition of the company's future (the right portion). The current state/left portion of the dashboard, on closer examination, reveals 4 sub-parts. On the…
Posted by Karthikeyan Sankaran
Comments (0)
June 25th, 2007
Business Intelligence (BI) is well & truly at the crossroads, and so are BI practitioners like me. On one hand there is tremendous improvement in BI tools & techniques almost on a daily basis, but on the other hand there is still a big expectation gap among business users about Business Intelligence's usage/value in driving core business decisions. This ensures that every BI practitioner develops a 'split' personality, a la Jekyll and Hyde: one moment fascinated by the awesome power of databases, smart techniques in data integration tools etc., and the very next moment getting into trouble with a business…
Posted by Karthikeyan Sankaran
Comments (8)
June 25th, 2007
Continuing from my previous post, "Perils of DataMover Access - Part 2a", let's look at a scenario where the security was modified to enable Data Mover access. The above results show that oprid NPAI 'Added' the security to enable Data Mover access. It is possible that the OPRIDs who bypassed controls to modify Data Mover access are smart enough to also delete the rows in the audit table using Data Mover. This might go undetected unless you have some additional monitoring in place. Ideally, you might want to create a trigger to fire on any INSERT into sensitive audit records. The…
Posted by Nitin Pai
Comments (0)
June 18th, 2007
In a Data Integration environment that has multiple OLTP systems for the same business functionality, one scenario that occurs quite commonly is these systems providing files of different formats with the same subject content. Different OLTP systems with the same functionality may arise in an organization when, for example, a bank has its core banking systems running on different products due to an acquisition or merger, or simply runs the same application in multiple instances with country-specific customizations. For example, data about the same subject, such as 'loan payment details', would be received on a monthly basis from different…
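One common way to absorb files of different formats with the same subject content is a per-source parser that maps each incoming layout onto one canonical record, so everything downstream sees a single structure. The sketch below shows the idea; the two file layouts and field names are invented for illustration.

```python
import csv
import io

# Two source systems deliver 'loan payment details' in different layouts
# (both layouts are invented examples). Each parser maps its format onto
# one canonical record: loan_id, payment_date, amount.

def parse_system_a(text):
    """System A sends comma-separated files with a header row."""
    for row in csv.DictReader(io.StringIO(text)):
        yield {"loan_id": row["LoanNo"],
               "payment_date": row["PayDate"],
               "amount": float(row["Amt"])}

def parse_system_b(text):
    """System B sends fixed-width files: cols 0-8 id, 8-18 date, 18+ amount."""
    for line in text.splitlines():
        yield {"loan_id": line[0:8].strip(),
               "payment_date": line[8:18].strip(),
               "amount": float(line[18:].strip())}

PARSERS = {"system_a": parse_system_a, "system_b": parse_system_b}

def normalize(source, text):
    """Dispatch to the right parser; downstream ETL sees one layout."""
    return list(PARSERS[source](text))
```

Adding a new source system then means adding one parser and one dispatch entry, with no change to the downstream load logic.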
Posted by Muneeswara C Pandian
Comments (0)
June 14th, 2007
The Chief Data Officer (CDO), the protagonist introduced before on this blog, has the unenviable task of understanding the data that is within the organization's boundaries. Having categorized the data into 6 MECE sets (read the post dated May 29 on this blog), the data reconnaissance team starts its mission with the first step, 'Profiling'. Data Profiling at the most fundamental level involves understanding: 1) How is the data defined? 2) What is the range of values that the data element can take? 3) How is the data element related to others? 4) What is the frequency…
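Several of these profiling questions can be answered mechanically. Below is a minimal column-profiling sketch in Python over a list of dict records; the sample column and record names are invented, and this is an illustration of the idea rather than a profiling tool.

```python
from collections import Counter

def profile(records, column):
    """Profile one column of a record set: inferred type(s) (how is the
    data defined?), min/max (range of values), distinct/frequency counts,
    and how often the value is missing."""
    values = [r[column] for r in records if r.get(column) is not None]
    freq = Counter(values)
    return {
        "types": sorted({type(v).__name__ for v in values}),
        "min": min(values),
        "max": max(values),
        "distinct": len(freq),
        "frequency": dict(freq),
        "null_count": sum(1 for r in records if r.get(column) is None),
    }
```

Relationships between elements (question 3) need cross-column checks, e.g. counting how often a candidate foreign key actually matches a key in the related set, which follows the same counting pattern.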
Posted by Karthikeyan Sankaran
Comments (0)
June 11th, 2007
In response to Charles' comments (shown below), I thought it would be nice to respond with a post that provides detailed instructions. "In your example for PSAUTHITEM, does the trigger get stored as part of the tools? In other words, if I run a security export from this instance to another one, will the trigger get migrated too, or should I run the create trigger SQL in the target system?" Here are the steps. 1. When you generate the trigger SQL (step 6 in my previous post), modify the SQL (as shown in step 7 in the previous post) and click on…
Posted by Nitin Pai
Comments (0)
June 7th, 2007
Perils of DataMover Access - Part 2a
PeopleSoft provides trigger-based auditing functionality as an alternative to the record-based auditing that PeopleSoft Application Designer provides. Perform the following steps to set up trigger-based auditing for PSAUTHITEM.
1. Create a custom table to store the audit data for PSAUTHITEM, and build the record in the database.
2. Navigate to PeopleTools --> Utilities --> Audit --> Update Database Level Auditing.
3. Add a New Value and select Record Name PSAUTHITEM.
4. Select the record AUDIT_AUTHITEM (created in step 1) as the Audit Record.
5. Check all the audit options.
6. Click on Generate…
Posted by Nitin Pai
Comments (2)
June 5th, 2007
ETL represents three basic steps: Extraction of data from a source system, Transformation of the extracted data, and Loading of the transformed data into a target environment. In general, 'ETL' represented more of a batch process, gathering data from either flat files or a relational structure. When ETL systems started supporting data from wider sources such as XML, industry-standard formats like SWIFT, unstructured data, and real-time feeds like message queues, 'ETL' evolved into 'Data Integration'. That is why all ETL product vendors are now called Data Integrators. Now let us see how Data Integration or ETL…
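As a toy sketch of the three steps, here they are as composable Python functions; the source rows, field names, and cleansing rules are invented for illustration, with plain lists standing in for real source and target systems.

```python
# Minimal illustration of Extract, Transform, Load as three functions.

def extract(source_rows):
    """Extract: pull raw rows from the source system."""
    return list(source_rows)

def transform(rows):
    """Transform: cleanse and reshape the extracted data."""
    return [
        {"customer": r["cust"].strip().upper(), "amount": r["amt"]}
        for r in rows
        if r["amt"] > 0  # drop invalid records
    ]

def load(rows, target):
    """Load: write the transformed rows into the target environment."""
    target.extend(rows)
    return target

warehouse = []
load(transform(extract([
    {"cust": " alice ", "amt": 10.25},
    {"cust": "bob", "amt": -5},  # rejected by the transform step
])), warehouse)
```

The same pipeline shape carries over to Data Integration tools: only the extract and load adapters change (XML, message queues, SWIFT messages), while the transform stage stays a pure function of the data.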
Posted by Muneeswara C Pandian
Comments (0)
June 1st, 2007