Sr. Informatica Developer Resume
San Ramon, CA
SUMMARY
- Nearly 10 years of development experience on Data Warehouse & Business Intelligence platforms using Informatica Power Center, Informatica Cloud, Salesforce.com, Informatica Data Quality, Oracle, Teradata, SQL Server, UNIX/Linux, shell scripting, Business Objects, Cognos Reports, Pentaho Data Integration (Kettle), and Python.
- Worked in the Healthcare, Health Insurance, Banking/Financial, and Manufacturing domains.
- Expert in data warehouse concepts and methodologies such as the Ralph Kimball and Bill Inmon models.
- Extensive experience with OLTP, OLAP, dimensional modeling (Star schema, Snowflake schema), Operational Data Stores, conceptual database design, B2B exchange, web services, and SOA architecture.
- Certified in Informatica Mapping Designer, Administration & Architecture, Informatica Data Quality, and Oracle SQL.
- Used most of the transformations in Informatica Power Center and integrated with heterogeneous systems such as flat files, COBOL files, XML, RDBMS, and CSV files.
- Worked on IDQ for address cleansing using Address Doctor and for source data profiling to generate scorecards.
- Used Informatica Cloud for data synchronization and data replication between Salesforce.com and Oracle.
- Good experience with Salesforce.com CRM: creation and validation of objects, upsert, and delete operations.
- Proficient with the Salesforce Data Loader utility for data scrubs such as insert and update; used Force.com Explorer to view data in the backend.
- Expert in UNIX shell scripting.
- Exposure to Informatica MDM, Netezza (trained), MySQL, NoSQL, BODI, ODI, OBIEE, Big Data, Data Science, Hive, Hadoop, MapReduce, and Spark.
- Hands-on experience with scheduling tools such as Autosys, Tivoli, and Tidal.
- Currently undergoing external training in Python scripting and Tableau.
TECHNICAL SKILLS
ETL: Informatica Power Center 9.x/8.x/7.x, Informatica Data Quality (IDQ) 9.x/8.x, Pentaho Data Integration (Kettle), Informatica Power Exchange 9.x, exposure to ODI, BODI
Cloud: Salesforce.com, Informatica Cloud integration with Salesforce.com
Reporting: Business Objects XIR2, Cognos 8.4, Jaspersoft iReports, Exposure to OBIEE
Programming: SQL, UNIX Scripting, Python(Beginner)
Databases: Oracle 12c/11g/10g/9i, Teradata V2R5, SQL Server 2008 R2, Netezza (trained), exposure to MySQL, NoSQL
Tools/Utilities: Salesforce Data Loader, SQL Developer, Toad, Tivoli, Autosys, Rapid SQL, JIRA, Jenkins, HP QC, HP ALM, Force Explorer
SCM Tools: VSS, StarTeam, Tortoise SVN, Tortoise CVS
PROFESSIONAL EXPERIENCE
Confidential, San Ramon, CA
Sr. Informatica Developer
Tools: Informatica Power Center 10.1/9.6.1, Salesforce.com, Informatica Developer 9.6 (IDQ), Oracle 12c/11g, flat files, RDBMS, Unix Shell Scripting, SVN, HP QC
Responsibilities:
- Developed Informatica mappings to build a data warehouse (dimensional model) from various sources like Oracle, flat files, and XML files.
- Generated different kinds of targets such as flat files and Oracle tables.
- Integrated with Salesforce.com using Informatica Power Center to pull accounts and contacts data.
- Created IDQ mappings and profiles for source data (KDE) profiling; produced scorecards, value frequencies, and charts to surface null values, malformed date formats, and similar issues.
- Generated scorecards and trend charts.
- Developed a reusable mapplet to generate PK/FK for the tables in any database.
- Extensively used mapping variables and mapping parameters to implement complex business logic.
- Designed the dimensional model and the data load process using SCD Type II.
- Designed and developed complex ETL mappings using Connected/Unconnected Lookup, Normalizer, and Stored Procedure transformations.
- Proficient in using Source Analyzer, Warehouse Designer, Transformation Designer, Mapping Designer, and Mapplet Designer.
- Used the Debugger to debug critical mappings by setting breakpoints, and troubleshot issues by checking session and workflow logs.
- Developed UNIX shell scripts to create parameter files, rename files, compress files, and schedule periodic load processes.
- Worked closely in setting up the environment for various file transfer activities between the systems using SFTP as the file transfer protocol.
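The parameter-file scripting described above could look roughly like this minimal sketch (all paths, the folder/workflow/session names, and parameter names are hypothetical placeholders, not the actual project values):

```shell
#!/bin/sh
# Minimal sketch of a parameter-file generation script; all paths and the
# folder/workflow/session/parameter names below are hypothetical examples.
RUN_DATE=$(date +%Y%m%d)
PARAM_DIR=/tmp/infa_params            # in practice, e.g. $PMRootDir/Parms
PARAM_FILE="$PARAM_DIR/wkf_daily_load.parm"

mkdir -p "$PARAM_DIR"

# Informatica parameter files use [folder.WF:workflow.ST:session] headers.
cat > "$PARAM_FILE" <<EOF
[Global]
\$\$RUN_DATE=$RUN_DATE
[FOLDER_EDW.WF:wkf_daily_load.ST:s_m_load_dim]
\$\$SRC_FILE=/data/in/customers_$RUN_DATE.csv
EOF

echo "Wrote $PARAM_FILE"
```

In a real deployment a scheduler such as Autosys or Tivoli would run this just before starting the workflow with pmcmd, so each run picks up a freshly dated parameter file.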
Confidential, San Francisco, CA
ETL Lead (Sr. Informatica Developer)
Tools: Informatica Power Center 9.6, IDQ 8.6, Informatica Cloud, Salesforce.com, Oracle 10g, Netezza, Flat files, XML, Unix Shell Scripting, JIRA, SVN, HP QC
Responsibilities:
- Developed a number of Informatica mappings to consume data from different sources like Oracle, flat files, and COBOL files, and generated different kinds of targets such as flat files, Oracle tables, and XML.
- Created an automated utility to pick the Federal Error file from SFTP server and load into the RDBMS tables with error code and description at Subscriber level.
- Created Informatica web service mappings and configured/communicated with external web services.
- Developed ETL routines to load the EDW using Informatica Power Center; created mappings involving Lookup, Aggregator, Rank, and Expression transformations, mapplets, connected and unconnected stored procedures, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Routers.
- Integrated with Salesforce.com using Informatica Cloud, replicated accounts and contacts data to the existing Oracle database, and processed this data for EDW loads.
- Designed data load process using SCD Type II for the quarterly membership (Member Eligibility) reporting purposes.
- Trained in Netezza and loaded data into multiple tables; proficient in Netezza SQL and architecture.
- Used IDQ with Address Doctor reference data to cleanse health insurance member address information for Claims, COB, and EOB processing.
- Wrote multiple shell scripts for file handling and archiving.
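The file-handling and archiving scripts mentioned above might follow a pattern like this minimal sketch (the directory and file names are illustrative assumptions):

```shell
#!/bin/sh
# Illustrative file-archiving sketch: move processed files into a dated
# archive directory and gzip them. Directory/file names are hypothetical.
IN_DIR=/tmp/etl_in
ARCH_DIR="/tmp/etl_archive/$(date +%Y%m%d)"

mkdir -p "$IN_DIR" "$ARCH_DIR"
touch "$IN_DIR/member_feed.dat"       # stand-in for a processed source file

for f in "$IN_DIR"/*.dat; do
    [ -e "$f" ] || continue           # skip when the glob matches nothing
    mv "$f" "$ARCH_DIR/"
    gzip -f "$ARCH_DIR/$(basename "$f")"
done
```

Keeping one archive directory per run date makes it straightforward to purge old archives with a single `find ... -mtime` cleanup job.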
Confidential, Pleasanton, CA
Sr. Informatica Developer
Tools: Informatica Power Center 8.6.1, Power exchange 9.5.1, Oracle 11g, Flat files, XML, Unix Shell Scripting.
Responsibilities:
- Extracted data from different sources like Oracle, flat files, and XMLs, and generated data in Oracle tables and XMLs.
- Used Power Exchange for real-time change data capture (CDC), sending member/claim updates to customer analytics.
- Extensively used mapping variables and mapping parameters to implement complex business logic.
- Worked closely in setting up the environment for various file transfer activities between the systems using SFTP as the file transfer protocol.
- Followed Agile methodology on the project.
Confidential, Minneapolis, MN
PL/SQL Developer
Tools: Oracle 10g, PL/SQL, Rapid SQL, Facets 4.71 application, Unix
Responsibilities:
- Gained proficient knowledge of the Facets membership and claims modules.
- Worked on PL/SQL development and upgrades.
- Upgraded the existing inventories to the latest Facets version 4.71 to support 5010 and ICD-10.
- Served as release manager for the team, monitoring production releases of packages/patches of the inventories.
Confidential, San Francisco, CA
Informatica Developer
Tools: Informatica Power Center 8.1.1, Oracle 9i, SQL Server 2008 R2, Flat files, SQL developer
Responsibilities:
- Involved in Facets implementation activities such as the Facets upgrade.
- Generated the required feeds from the analytical warehouse and updated ETL logic to handle changes in source and target layouts.
- Added new fields to source tables/files and populated them down to the DW environment.
- Analyzed the existing ETL code and identified performance bottlenecks.
- Designed and implemented the required code changes to ensure ETL jobs run within their SLA windows.
- Extracted data from multiple sources such as Oracle, and Flat Files and loaded the transformed data into targets in Oracle, Flat Files.
- Improved database access performance by tuning DB access methods: creating partitions, using SQL hints, and applying proper indexes.
- Used Informatica mapping variables and parameters to automate runs.
- Created a UNIX shell script to FTP flat files from different ordering systems to the ETL server.
- Developed Test Cases according to Business and Technical Requirements and prepared SQL scripts to test data.
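A scripted FTP transfer like the one described above can be sketched as follows (host, directories, and file names are placeholders; the sketch builds and prints the FTP command batch rather than connecting to a real server):

```shell
#!/bin/sh
# Illustrative sketch of scripted FTP transfer of flat files to an ETL
# server. Host, user, and paths are placeholders; in practice credentials
# would come from a secured config file, never hard-coded in the script.
FTP_HOST=etl-server.example.com       # hypothetical target host
LOCAL_DIR=/tmp/orders_out
mkdir -p "$LOCAL_DIR"
touch "$LOCAL_DIR/orders_20240101.csv"   # stand-in for an extract file

# Build the FTP command batch (printed here instead of connecting).
BATCH=/tmp/ftp_batch.txt
{
  echo "cd /data/landing"
  for f in "$LOCAL_DIR"/*.csv; do
    echo "put $f"
  done
  echo "bye"
} > "$BATCH"
cat "$BATCH"
# The real transfer would then be: ftp -n "$FTP_HOST" < "$BATCH"
# (with login commands prepended, or sftp -b for the SFTP variant).
```

Driving the client from a generated batch file keeps the transfer list dynamic: whatever extract files land in the output directory that day are picked up automatically.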
Confidential, Rochester, NY
DWBI Developer
Tools: Informatica 8.6.1, Cognos 8.4, Oracle 10g
Responsibilities:
- Involved in analysis and interacted with the client/onsite team on queries arising during analysis.
- Provided conceptual and detailed design of the interface (the source-to-target solution, applying the transformations).
- Designed and developed Informatica mappings per the business specifications and generated various file and RDBMS outputs.
- Developed Cognos reports per the report specifications and unit tested them against those specifications.
- Part of the project used SQL Server as the database, where I also gained good working knowledge of SQL Server.
Confidential
DWBI Developer
Tools: J2EE Web application, Pentaho servers, Kettle, Jaspersoft iReport, Schema workbench, SQL developer
Responsibilities:
- Understood the requirements and BRD provided by the business team.
- Involved in analysis and interacted with the onsite team on queries arising during analysis.
- Transformed data from one system to another using Kettle.
- Developed more than 25 reports using Pentaho and Jaspersoft iReport.
- Tested and validated data in the database via SQL Developer.
- Worked with BODI and Teradata for the same client on a different project.