
Sr. ETL Developer Resume Profile


Highlights:-

  • Over 9 years of IT experience in Data Warehousing/ETL Requirements Gathering, Analysis, Design, Development, Integration, Implementation and Testing using Informatica PowerCenter/PowerExchange versions 7.1 through 9.5.1 HF4 for the Health Care, Insurance, Banking IT and Wireless industries, using different methodologies.
  • Experienced in migrating code from repository to repository; wrote Technical/Functional Mapping specification documents for each Mapping, along with unit test cases, for future development.
  • Proficient in designing the automation of Workflows and configuring/scheduling Workflows for load frequencies. Skilled in developing, testing, tuning and debugging Mappings and Sessions.
  • Experienced in profiling, analyzing, standardizing, cleansing, integrating, scorecarding, merging/matching and managing reference data from various source systems using the Informatica Data Quality (IDQ) toolkits.
  • Worked with Address Doctor and different matching algorithms in IDQ, e.g. Bigram, Jaro/Edit distance and Reverse algorithms.
  • Proficient in interacting with business users. Pioneered different load strategies from heterogeneous source systems to targets. Successfully implemented SCD Type 1/Type 2 loads to capture data changes and maintain data history. Monitored the system during data loading.
  • Experienced in identifying data-load bottlenecks and tuning them for better performance.
  • Extensive experience creating logical/physical data models for relational (OLTP) systems and dimensional (OLAP) star-schema models with Fact and Dimension tables using CA ERwin.
  • Experienced in Informatica architecture/administration: installing and configuring Repository and Web Services Hub services, and configuring Domain/Gateway services (authentication, configuration, service management, etc.) using the Administration Console and Repository Manager tools.
  • Excellent at writing Stored Procedures, Triggers, Indexes and Functions in PL/SQL and SQL scripts. Developed various reports and dashboards using MicroStrategy reporting tools.
  • Experienced in Oracle 9i database administration: Flashback and RMAN data recovery, database design, enterprise-level backup/recovery procedures, performance tuning, table partitioning, database architecture, monitoring, database migration and SQL Developer.
  • Excellent communication and interpersonal skills, with strong analytical ability to solve problems.
  • Team player, motivated and dynamic with excellent oral and written communication skills.

Technologies:-

ETL/Data Model Tools: Informatica PowerCenter/PowerExchange 7.1 to 9.5.1 HF4, SSIS, SSRS, CA ERwin

Database: Oracle 8i to 11g, MS SQL Server 2005/2008 R2, DB2, Teradata, MySQL, Hadoop

Reporting Tools: OBIEE, MicroStrategy

GUI Tools: IDQ 9.5.1/8.6, RMAN, TOAD 9.5, SQL*Plus, SQL*Loader, XML Publisher, IIR, Web Services (WSDL), SOAP, IDE, B2B, MDM, SAP, PuTTY, WinSCP, COBOL, BTEQ

Programming Languages: SQL, PL/SQL, Java, C, C++, C#, T-SQL, XML, Unix Shell Scripting

Operating Systems: Windows 95/98/ME/NT/XP/Vista/7/8, Unix, Mac

Professional Experience:-

Sr. ETL Developer.

Confidential

Confidential, the largest healthcare organization in Los Angeles, is implementing a new EDW over the old EDW in multiple phases to acquire more accurate cost information. Previously the EDW was largely based on legacy applications; cost ratios were updated annually and then calculated for each visit summary during the DW build cycle. But there were exceptions to the ratios based on specific CDMs and/or new CDMs, which might not be discovered in a timely fashion, leading to inaccurate costing analysis.

Responsibilities:-

  • Participated in daily/weekly meetings, monitored the teams' work progress and proposed ETL strategies.
  • Based on requirements, designed and coded new Mappings, validated/debugged existing Mappings, tested Workflows and Sessions, and identified better technical solutions for old and new Mappings for Source/Target compatibility. Identified bottlenecks in old Mappings and tuned them for better performance.
  • Migrated code across the Dev, Test and Prod environments and wrote Team Based Development technical documents to smooth project handover. Prepared ETL technical Mapping documents along with test cases for each Mapping for future development and to maintain the SDLC.
  • Worked on various complex SCD Type 1/Type 2 Mappings in different layers to maintain data history. Used Mapping/Session variables and parameters, parameter files, and reusable Transformations/Mapplets to maintain the life-cycle development, and fixed others' Mappings.
  • For each Mapping, prepared effective Unit, Integration and System test cases for various stages to capture data discrepancies/inaccuracies and ensure accurate data loading.
  • Worked on Informatica PowerCenter 9.5.1HF4 Tools- Repository Manager, Informatica Designer, Workflow Manager/Monitor and carefully monitored the system during data loading.
  • Designed the automation of Sessions and Workflows, scheduled the Workflows, created Worklets and tasks (command, email, assignment, control, event wait/raise, conditional flows, etc.) and configured them according to business logic requirements to load data from different Sources to Targets.
  • Created pre-/post-session UNIX scripts to merge flat files, create and delete temporary files, and rename files to reflect the file generation date.
  • Used the Debugger to validate Mappings and gather troubleshooting information about the data and error conditions. Involved in fixing invalid Mappings. Wrote various Functions, Triggers and Stored Procedures to drop and re-create indexes and to handle complex calculations (a minimal PL/SQL sketch of the index-maintenance pattern follows this list).
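
The index-maintenance routines mentioned above generally follow a simple pattern: disable the indexes before a bulk load and rebuild them afterward. The following is a minimal Oracle PL/SQL sketch of that pattern; the procedure and parameter names are hypothetical, not the project's actual code, and it would typically be invoked from a pre-/post-session Stored Procedure call.

```sql
-- Hypothetical helper: mark a table's indexes UNUSABLE before a bulk load,
-- then rebuild them after the load completes.
CREATE OR REPLACE PROCEDURE manage_indexes (
    p_table_name IN VARCHAR2,
    p_action     IN VARCHAR2   -- 'DISABLE' before the load, 'REBUILD' after it
) AS
BEGIN
    FOR idx IN (SELECT index_name
                  FROM user_indexes
                 WHERE table_name = UPPER(p_table_name)) LOOP
        IF UPPER(p_action) = 'DISABLE' THEN
            EXECUTE IMMEDIATE 'ALTER INDEX ' || idx.index_name || ' UNUSABLE';
        ELSE
            EXECUTE IMMEDIATE 'ALTER INDEX ' || idx.index_name || ' REBUILD';
        END IF;
    END LOOP;
END manage_indexes;
/
```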

Environment:- Informatica PowerCenter 9.5.1 HF4, Oracle 11g/10g, SQL, PL/SQL, TOAD 9.5, PuTTY, WinSCP, UNIX Shell Scripting, MS SQL Server 2005, SharePoint, HIPAA, IIR, Web Services (WSDL), MySQL.

Sr. ETL / IDQ Developer.

Confidential

Confidential, the largest healthcare company, maintains data for more than 4,000 hospitals and provides IT support. The company is currently moving from the ICD-9 to the ICD-10 standard and consolidating the EDW onto one platform from various disparate EHR platforms in order to prepare for the mandatory CMS ICD-10 remediation. The project includes the development and testing of data integration into the new Enterprise Data Warehouse (EDW).

Responsibilities:-

  • Coordinated daily team meetings and technical code review meetings, interacted with business users on technical solutions, and proposed an ETL strategy based on Agile methodologies.
  • Worked on Informatica PowerCenter 9.1.0-9.5.1HF2 Tools- Repository Manager, Informatica Designer, Workflow Manager/Monitor, Informatica Data Quality IDQ Developer/Analyst Toolkits.
  • Validated and debugged old Mappings, tested Workflows and Sessions, and identified better technical solutions for old and new Mappings for Source/Target compatibility due to version changes. Identified bottlenecks in old/new Mappings and tuned them for better performance.
  • Developed various complex SCD Type 1/Type 2 Mappings to maintain data history in different layers. Used Mapping/Session variables and parameters, parameter files, and reusable Transformations/Mapplets to maintain the life-cycle development. Designed and coded major change requests per new requirements.
  • Worked on the automation of Sessions and Workflows, scheduled the Workflows, created Worklets and tasks (command, email, assignment, control, event wait/raise, conditional flows, etc.) and configured them according to business logic requirements to load data from different Sources to Targets.
  • Worked on Team Based Development for migrating code from Development to Test and Test to Production environments, and wrote Team Based Development technical documents to smooth project handover. Prepared ETL technical Mapping documents with test cases for each Mapping for future development.
  • Prepared effective Unit, Integration and System test cases of Mappings for various stages to capture the data discrepancies/inaccuracies to ensure the successful execution of accurate data loading.
  • Created pre-/post-session UNIX scripts, Functions, Triggers and Stored Procedures to drop and re-create indexes and to handle complex calculations on data.
  • Worked with Informatica Data Quality (IDQ) 9.5.1 Developer/Analyst tools to remove noise from the data using transformations such as Standardizer, Match/Merge, Case Converter, Consolidation, Parser, Labeler, Address Validator, Key Generator, Lookup and Decision.
  • Created Reference/Master data for profiling using IDQ Analyst tools. Used the Address Doctor Geo-coding table to validate the address and performed exception handling, reporting and monitoring the data.
  • Built Physical Data Objects and developed various mappings, mapplets and rules using Informatica Data Quality (IDQ) to profile, validate and cleanse the data based on requirements. Identified and eliminated duplicate datasets and performed column, primary-key and foreign-key profiling using IDQ 9.5.1 for MDM (the SQL sketch after this list shows the kind of duplicate and orphan-key checks involved).
  • Responsible for managing data coming from different sources (Hadoop, CSV/flat files, MS SQL Server, Oracle, etc.) and for loading and transforming large sets of structured, semi-structured and unstructured data.
  • Worked with Web Services (WSDL) to extract data from different web links and created Mappings to load the data into target tables. Used the WSDL link to present the data in MS Word and MS Excel through the add-in menu.
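
The duplicate elimination and primary-/foreign-key profiling described above ultimately come down to checks like the ones below. This is a minimal SQL sketch with hypothetical table and column names (member_stg, claim_stg, dim_member), shown only to illustrate the kind of rules implemented in IDQ.

```sql
-- Duplicate check: business keys that appear more than once in a hypothetical staging table.
SELECT member_id, COUNT(*) AS dup_count
  FROM member_stg
 GROUP BY member_id
HAVING COUNT(*) > 1;

-- Foreign-key profiling: staged claims whose member_id has no match in the member dimension.
SELECT c.claim_id, c.member_id
  FROM claim_stg c
  LEFT JOIN dim_member m ON m.member_id = c.member_id
 WHERE m.member_id IS NULL;
```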

Environments:-

Informatica PowerCenter 9.1.0-9.5.1 HF2, Oracle 11g, PL/SQL, Toad 9.5, Dynamic SQL, UNIX Shell Scripting, SharePoint, HIPAA, Embarcadero, IIR, Web Services (WSDL), MDM, MySQL.

Sr. ETL / IDQ Informatica Developer.

Confidential

Confidential, the country's largest healthcare company, planned to build an EDW in multiple phases to maintain its historical data. This is a challenge for BCBS because the data comes from different source systems and refreshes with a two-to-three-week delay, owing to the large volume of data, the processing time, and the large processes running behind the scenes. The company planned to improve the EDW and get the most recent (real-time) data. All data coming from the different systems needs to be integrated, transformed and migrated to the target database.

Responsibilities:-

  • Participated in team meetings and proposed ETL strategy based on Agile Methodology.
  • Worked on Informatica PowerCenter 8.6.1-9.1.0.HF1 Tools- Repository Manager, Informatica Designer, Workflow Manager/ Monitor, Informatica Data Quality IDQ Developer and Analyst Toolkits.
  • Based on Subject Areas, provided concrete solutions for complex/critical Mappings and created various complex Mappings in different layers. Successfully implemented SCD Type 1/Type 2 insert, capture-data-changes and delete operations to maintain data history. Created Mapping/Session variables and parameters, parameter files, and Mapplets/reusable Transformations for reuse during life-cycle development.
  • Created batches based on Subject Areas for different layers to run Workflows/Worklets and Sessions, scheduled the Workflows for load frequencies and configured them to load data.
  • Involved in debugging invalid Mappings. Tested Mappings, Sessions and Workflows to identify bottlenecks and tuned them for better performance. Built unit test queries to verify data accuracy (see the sketch after this list).
  • Migrated code from Development to Test and Test to Production. Created effective Unit, System and Integration tests of data in different layers to capture data discrepancies/inaccuracies and ensure accurate data loading. Created technical documentation for each Mapping for future development.
  • Designed and coded change requests per new requirements. Created pre-/post-session UNIX scripts and Stored Procedures to drop and re-create indexes and to handle complex calculations.
  • Worked with Informatica Data Quality (IDQ) 9.5.1 Developer/Analyst tools to remove noise from the data using transformations such as Standardizer, Match/Merge, Case Converter, Consolidation, Parser, Labeler, Address Validator, Key Generator, Lookup and Decision.
  • Created Reference/Master data for profiling using IDQ Analyst tools. Used the Address Doctor Geo-coding table to validate the address and performed exception handling, reporting and monitoring the data.
  • Built Physical Data Objects and developed various mappings, mapplets and rules using Informatica Data Quality (IDQ) to profile, validate and cleanse the data based on requirements. Identified and eliminated duplicate datasets and performed column, primary-key and foreign-key profiling using IDQ 9.5.1 for MDM.
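
As an illustration of the unit test queries mentioned above, the following is a minimal sketch against hypothetical staging and target tables (claim_stg, fact_claim, dim_member); it is representative of the kind of checks used, not the project's actual tests.

```sql
-- Row-count reconciliation between the staging layer and the target fact table.
SELECT (SELECT COUNT(*) FROM claim_stg)  AS stage_rows,
       (SELECT COUNT(*) FROM fact_claim) AS target_rows
  FROM dual;

-- SCD Type 2 sanity check: every business key should have exactly one current row.
SELECT member_id, COUNT(*) AS current_rows
  FROM dim_member
 WHERE current_flag = 'Y'
 GROUP BY member_id
HAVING COUNT(*) <> 1;
```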

Environments:-

Informatica PowerCenter 9.1.0/8.6.1, Teradata, Oracle 11g, PL/SQL, UNIX, Toad 9.5, Dynamic SQL, Shell Scripting, Web Services (WSDL), SQL Navigator, ILM, IDQ 9.5.1, IIR, BTEQ, MDM, Hadoop.

Sr. ETL Developer.

Confidential

Confidential Inc., the world's largest entertainment company, took on an Enterprise Data Warehouse (EDW) project to maintain its large volume of data. The project's aim was mainly to take the operational data coming from different source systems, design the data model, and clean, transform and load the data into an integrated repository database according to business requirements in order to develop an analytical reporting system.

Responsibilities:-

Worked on data integration from different sources using Informatica PowerCenter/PowerExchange 8.6.1/9.1 tools (Repository Manager, Informatica Designer, Workflow Manager/Monitor) and upgraded the system from 8.6.1 to 9.1. Participated in team meetings and proposed ETL strategies.

  • Developed various complex Mappings and successfully implemented SCD Type 1/Type 2 to keep the data history changes (a minimal SQL sketch of the Type 2 pattern follows this list). Designed and coded change requests per new requirements.
  • Migrated code from repository to repository. Used the Debugger to validate Mappings and gather troubleshooting information about the data and error conditions. Involved in fixing invalid Mappings. Tested the Mappings, Sessions, Workflows and Worklets. Wrote test queries to check whether data was loading into the dimension and fact tables properly.
  • Created and reviewed the logical and physical data models for the fact and dimension tables according to business requirements for the EDW. Created DDL scripts to implement data model changes. Created ERwin reports in HTML and RTF format depending on the requirements, published the data model in Model Mart, and coordinated with DBAs to apply the data model changes.
  • Created effective Unit, System and Integration Test cases for various stages of ETL to capture the data discrepancies and inaccuracies to ensure the successful execution of accurate data loading.
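
For reference, the SCD Type 2 loads noted above follow the usual expire-then-insert pattern. The two-statement SQL sketch below is a hypothetical illustration (dim_customer, customer_stg and the tracked columns are made up); in the project the equivalent logic was built as Informatica mappings rather than hand-written SQL.

```sql
-- Step 1: close out the current dimension row when a tracked attribute has changed.
UPDATE dim_customer d
   SET d.current_flag = 'N',
       d.effective_end_date = TRUNC(SYSDATE) - 1
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM customer_stg s
                WHERE s.customer_id = d.customer_id
                  AND (s.address <> d.address OR s.status <> d.status));

-- Step 2: insert a new current version for changed keys and for brand-new keys.
INSERT INTO dim_customer (customer_key, customer_id, address, status,
                          effective_start_date, effective_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.status,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM customer_stg s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');
```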

Wrote Functions and Stored Procedures to drop and re-create indexes and to handle complex calculations as needed. Tested and maintained data integrity among the various Sources and Targets. Worked on performance tuning of the data loads by identifying bottlenecks in Sources, Targets, Mappings, Transformations, Sessions, the Database and the Network, then fixing them. Involved in providing Informatica technical support to the team members as well as the business.

Environments:-

Informatica PowerCenter/PowerExchange 8.6.1/9.1, Teradata, Oracle 10g, MS SQL Server 2008, PL/SQL, ERwin 8.2, Toad 9.5, PuTTY, WinSCP, UNIX, SOAP, SAP, COBOL, Web Services (WSDL), BTEQ, MySQL.

ETL Developer/ Informatica Data Modeler.

Confidential

Confidential Ltd., the country's largest mobile company, provides services to almost 50 million people. The company planned multiple project phases to maintain call-record history. The purpose of this project was to design DWH services to keep customers' call-record history and information pertaining to customer demographics, transaction behavior, invalid-SIM analysis and customer satisfaction.

Responsibilities:-

  • Used Informatica 8.6.1 to Extract, Transform and Load data from Oracle, MS-SQL Server 2000, CSV and flat files to Oracle Server and Migrated the Informatica from 7.1 to 8.6.1.
  • Created several complex Mappings, Mapping parameters/variables, Mapplets and parameter files, and successfully implemented SCD Type 1/Type 2 loads to maintain the history of the data.
  • Migrated the codes from Repository to Repository. Created and Monitored Workflows/Sessions using Informatica Workflow Manager/Monitor to load data from different Sources to Target.
  • Tested all applications to ensure smooth data flow. Scheduled and ran the extraction and load processes.
  • Maintained Metadata, naming and Warehouse standards for future application development.
  • Involved in writing Functions, Triggers, and Stored Procedures in PL/SQL as per the requirements.
  • Extensively worked on Performance Tuning to tune the data load by identifying the Bottlenecks in Sources, Targets, Mappings, Transformations, Sessions, Database, and Network.
  • Worked on data conversion, integration and load verification. Performed Unit, System and Integration testing of Mappings in various stages of ETL to capture data discrepancies/inaccuracies and ensure accurate data loading, and used the Debugger to troubleshoot logical errors.
  • Created and reviewed the logical and physical models for the fact and dimension tables from the source systems according to business requirements for the EDW. Created DDL scripts to implement data modeling changes (a brief star-schema DDL sketch follows this list). Created ERwin reports in HTML and RTF format depending on the requirement, published the data model in Model Mart, and coordinated with DBAs to apply the data model changes.
  • Provided Informatica technical support to the team members, as well as the business.
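
To illustrate the fact/dimension DDL work referenced above, here is a minimal, hypothetical star-schema fragment; the table and column names are illustrative only, not the project's actual model.

```sql
-- Hypothetical dimension table for subscribers.
CREATE TABLE dim_subscriber (
    subscriber_key   NUMBER        PRIMARY KEY,
    subscriber_id    VARCHAR2(20)  NOT NULL,
    plan_type        VARCHAR2(30),
    effective_start  DATE          NOT NULL,
    effective_end    DATE,
    current_flag     CHAR(1)       DEFAULT 'Y'
);

-- Hypothetical fact table of call records, keyed to the dimension.
CREATE TABLE fact_call (
    call_id          NUMBER        PRIMARY KEY,
    subscriber_key   NUMBER        NOT NULL REFERENCES dim_subscriber (subscriber_key),
    call_date        DATE          NOT NULL,
    duration_sec     NUMBER(10),
    charge_amount    NUMBER(12,2)
);
```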

Environments:-

Informatica PowerCenter 8.6.1, Oracle 9i, MS SQL Server 2005, PL/SQL, UNIX, ERwin 7.2, Toad 9.5, MS-DOS, SIF, SQL Navigator, ILM, PuTTY, WinSCP, Unix Shell Scripting.

ETL Developer.

Confidential

Confidential Ltd. is the country's largest healthcare company, providing a steady flow of innovative services and medicines that improve the health and well-being of people around the country. The project focused on gathering data on customers, customer insurance, products and orders, and loading that data into the EDW.

Responsibilities:-

  • Participated in team meetings to design, develop and implement the project, and proposed ETL strategies.
  • Developed several complex Mappings and successfully implemented SCD Type 1/Type 2 to keep history changes. Worked with Variables/Parameters, Parameter Files, Workflows and Sessions and configured them for smooth transfer of data. Extensively worked on identifying bottlenecks and resolving them to accelerate data-load performance. Performed tuning of SQL queries for speedy data extraction, and troubleshot long-running Sessions and fixed the issues.
  • Involved in the development of Stored Procedures, Functions, Views, Materialized Views and Triggers to drop and re-create indexes and to process business data according to requirements (a materialized-view sketch follows this list).
  • Worked with the testing team to resolve bugs related to ETL Mappings; created effective test cases (Unit, Coding and Integration testing) for various stages of ETL to capture data discrepancies and inaccuracies and ensure accurate data loading.
  • Designed and developed ad hoc queries, analytics and dashboards through MicroStrategy solutions.
  • Reported to the senior MicroStrategy consultant about weekly and monthly project status.
  • Created objects in the repository by merging and importing them into MicroStrategy repositories.
  • Created the weekly project status reports, tracking the progress of tasks according to schedule and reporting any risks and contingency plan to management and business users.
  • Involved in meetings with production team for issues related to deployment, maintenance, future enhancements, backup and crisis management of DW.
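
As an illustration of the materialized views mentioned above (used to pre-aggregate data behind the MicroStrategy dashboards), here is a minimal hypothetical sketch; the table, column and view names are illustrative only.

```sql
-- Pre-aggregated daily order summary, refreshed after each nightly load,
-- so dashboard queries read a small summary rather than the detail fact table.
CREATE MATERIALIZED VIEW mv_daily_order_summary
BUILD IMMEDIATE
REFRESH COMPLETE ON DEMAND
AS
SELECT order_date,
       product_id,
       COUNT(*)          AS order_count,
       SUM(order_amount) AS total_amount
  FROM fact_order
 GROUP BY order_date, product_id;

-- A post-load step (e.g. a post-session stored procedure call) would then refresh it:
-- EXEC DBMS_MVIEW.REFRESH('MV_DAILY_ORDER_SUMMARY', 'C');
```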

Environment:- Informatica PowerCenter 7.1, ERwin 4.0, CDC, Windows 2000, Oracle 9i, PL/SQL, TOAD, Jaspersoft, MicroStrategy, PuTTY, WinSCP, UNIX Shell Scripting.

ETL Developer (Informatica).

Confidential

This data warehouse application was built for the Credit Cards Division and was aimed at building a data mart for sales analysis, called the Sales Performance data mart. The project involved providing online credit and liability information to customers and management. It aimed at facilitating faster credit processing for the customer and also helped in assessing and containing credit risk for the bank. The project interfaces with a number of subsystems, which provide it real-time information to give a global picture of a customer. This ETL initiative gets data feeds from various source systems across the globe.

Responsibilities:-

  • Designed and developed ETL processes and a job-control mechanism based on business rules using Informatica PowerCenter 7.1. Re-engineered existing Mappings to support new/changing business requirements.
  • Migrated the codes from Repository to Repository. Worked extensively on complex standard, non-reusable Mappings and successfully implemented SCD Type1/Type 2 to keep the history changes.
  • Used Workflow Monitor to monitor the jobs, reviewed Sessions/workflow logs that were generated for each Sessions to resolve issues, used Informatica debugger to identify issues in Mapping execution.
  • Monitored production jobs on a daily basis, worked on issues relating to job failures and restarted failed jobs after correcting the errors. Performed administration tasks such as managing users and privileges, migrations, starting/stopping pmrep/pmserver, and backing up and restoring the Repository service.
  • Used Mapping, Sessions Variables/Parameters, and Parameter Files to support change data capture and automate workflow execution process to provide 24x7 available data processing.
  • Involved in writing UNIX shell scripts for pre-/post-session commands; wrote shell scripts to kick off workflows, unschedule workflows and get workflow status.
  • Tuned SQL Statements, Mappings, Sources, Targets, Transformations, Sessions, Database, Network for the bottlenecks, used Informatica parallelism options to speed up data loading to target.
  • Developed PL/SQL procedures to process business logic in the database and used them through the Stored Procedure Transformation. Created various Functions, Triggers and Views to meet business needs (a brief PL/SQL sketch follows).
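
As an example of the kind of routine called through a Stored Procedure Transformation, here is a minimal hypothetical PL/SQL function; the name, parameters and the credit calculation are illustrative only, not the bank's actual rules.

```sql
-- Hypothetical: returns an available-credit figure for a card account,
-- suitable for calling from an Informatica Stored Procedure Transformation.
CREATE OR REPLACE FUNCTION get_available_credit (
    p_credit_limit    IN NUMBER,
    p_current_balance IN NUMBER,
    p_pending_charges IN NUMBER DEFAULT 0
) RETURN NUMBER
IS
    v_available NUMBER;
BEGIN
    v_available := p_credit_limit - NVL(p_current_balance, 0) - NVL(p_pending_charges, 0);
    RETURN GREATEST(v_available, 0);   -- never report negative available credit
END get_available_credit;
/
```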

Environment:- Informatica PowerCenter 7.1, Oracle 9i, SQL Server 2000, Flat Files, CSV files, PL/SQL, UNIX Shell Scripting, TOAD, PuTTY, WinSCP.

Database Administrator (Oracle).

Confidential

Confidential IT World operates across multiple platforms. To run the business smoothly, the company decided to keep its databases running without interruption. The purpose of this project was to maintain database integrity, provide 24/7 support and keep the data backed up.

Responsibility:-

  • Created several databases and documented migration processes, monitoring and load scripts. Altered, modified and created tables, views and sequences. Created Oracle instances with appropriate initialization parameters as requested. Took RMAN backups of the data to tape and disk.
  • Created users and assigned them privileges and roles as requested by application developers. Controlled the Database User Maintenance, Access Control, Set up Oracle Data Guard, Tape Management Procedure Creation, and Schema Management ensuring Database Security.
  • Exported data from old systems and imported it into new databases using expdp and impdp. Customized export/import for large-scale data transfer. Created and maintained the Recovery Catalog; created materialized views and indexes, rebuilt indexes and partitioned tables to tune the databases (a partitioning sketch follows this list).
  • Modified tablespaces and object storage as needed. Managed archive/redo logs, created control files and log files, recovered data, and identified bottlenecks for performance-tuning activities.
  • Developed system program to automate database backup, monitoring and other DBA functions.
  • Automated routine DBA monitoring tasks with shell/SQL scripts and was responsible for proactive management. Cloned databases with the help of export/import and RMAN.
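
To illustrate the table partitioning used for tuning above, here is a minimal hypothetical range-partitioning example; the table, columns and partition bounds are illustrative only.

```sql
-- Hypothetical range-partitioned transaction table with a local index,
-- so individual periods can be loaded, scanned and archived independently.
CREATE TABLE sales_txn (
    txn_id    NUMBER,
    txn_date  DATE NOT NULL,
    amount    NUMBER(12,2)
)
PARTITION BY RANGE (txn_date) (
    PARTITION p_2010_q1 VALUES LESS THAN (TO_DATE('2010-04-01', 'YYYY-MM-DD')),
    PARTITION p_2010_q2 VALUES LESS THAN (TO_DATE('2010-07-01', 'YYYY-MM-DD')),
    PARTITION p_max     VALUES LESS THAN (MAXVALUE)
);

CREATE INDEX ix_sales_txn_date ON sales_txn (txn_date) LOCAL;
```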

Environments:-

Oracle 10g, Data-Guard, Application Server, Tuning, Flashback, RMAN, OEM, Linux, Data Warehousing, Partitioning etc.
