Sr. ETL Informatica Developer Resume
Boston, MA
SUMMARY
- Over 5 years of IT experience in analysis, design, development and implementation of data warehouses, data marts and Decision Support Systems (DSS) using Informatica PowerCenter with Oracle, MS SQL Server, DB2 and Teradata databases.
- Over one year of experience with Salesforce CRM.
- Experience in SFDC development using Apex classes and triggers, Visualforce, Force.com IDE, SOQL and SOSL.
- Extensive experience in data migration and integration using Data Loader.
- Strong data processing experience in designing and implementing Data Warehouse and Data Mart applications, mainly transformation processes, using the ETL tool Informatica PowerCenter and UNIX shell scripting.
- Excellent proficiency in data extraction, transformation and loading, database modeling, and data warehousing tools and technologies such as Ab Initio, ODI and Erwin.
- Proficient in dimensional data modeling, relational star and snowflake schemas, fact and dimension tables, physical and logical data modeling, and Ralph Kimball methodologies.
- Proficient in working with Informatica PowerCenter and MDM 9.5 (Master Data Management) methodology. Hands-on experience identifying the most critical information within an organization and creating a single source of truth to power business processes.
- Strong database experience using Oracle, SQL Server, Teradata, SAP HANA, SQL, PL/SQL, SQL*Loader, stored procedures, TOAD, MS Access and UNIX Korn shell scripting.
- Extensive work in ETL processes consisting of data sourcing, mapping, transformation, conversion and loading.
- Proficiency in data warehousing techniques for data cleansing and Slowly Changing Dimension types 1, 2 and 3 (a SQL sketch of a Type 2 load follows this list).
- Expertise in SQL/PL/SQL, developing and executing stored procedures, functions and triggers, and tuning queries while extracting and loading data.
- Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad and MultiLoad to export and load data to/from different source systems, including flat files.
- Actively monitored Informatica job runs.
- Experience in Performance Tuning and Debugging of existing ETL processes.
- Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
- Experience in UNIX shell scripting and configuring cron jobs for Informatica job scheduling and backup of repositories and folders.
- Extensive experience and knowledge of the project lifecycle, including requirements gathering, development and execution.
- Coordinated offshore efforts and day-to-day activities.
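The following is a minimal SQL sketch of the Slowly Changing Dimension Type 2 load pattern referenced above. It assumes an Oracle target; the table, column and sequence names (customer_dim, customer_stg, customer_dim_seq) are illustrative only, not taken from any specific project.

    -- Step 1: expire the current dimension row when a tracked attribute changes.
    UPDATE customer_dim d
       SET d.effective_end_date = TRUNC(SYSDATE) - 1,
           d.current_flag       = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND (s.customer_name <> d.customer_name
                           OR s.address <> d.address));

    -- Step 2: insert a new current version for changed and brand-new customers.
    INSERT INTO customer_dim
        (customer_key, customer_id, customer_name, address,
         effective_start_date, effective_end_date, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.customer_name, s.address,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');

A Type 1 load would instead overwrite the changed attributes in place, keeping no history of the prior values.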
TECHNICAL SKILLS
ETL Tools: Informatica PowerCenter 9.x/8.x/7.x, DataStage 8.5, PowerExchange for Datacom (CDC), Informatica Data Quality (IDQ), Informatica MDM Hub 9.1/9.7, SSIS
DBMS: Oracle 11g/10g/9i/8i, Netezza, SQL Server 2012, Greenplum, AWS Redshift
Data Modeling: Dimensional Modeling
Software Tools: TOAD, SQL Developer, SQL*Plus, MS Office, MS Visio, IBM Rational ClearQuest, FileZilla, SQL*Loader, ER/Studio 10.0, pgAdmin 1.22, enterprise scheduler
Operating System: Windows, UNIX, and Linux.
Languages: SQL, PL/SQL, Python 3.4.2
Big Data Technologies: Sqoop, Hive, Spark, Hadoop
Reporting Tools: Cognos 8, OBIEE 11g
PROFESSIONAL EXPERIENCE
Confidential - Boston, MA
Sr. ETL Informatica Developer
Responsibilities:
- Gathered user requirements and designed source-to-target data load specifications based on business rules.
- Used Informatica PowerCenter 9.5 for extraction, transformation and loading (ETL) of data into the data mart.
- Designed and developed ETL mappings to extract data from flat files, MS Excel and Oracle and load it into the target database.
- Developed several complex mappings using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets and parameter files in the Mapping Designer of Informatica PowerCenter.
- Extensively used ETL processes to load data from various source systems such as SQL Server, flat files and XML files into the target system, applying business logic in transformation mappings to insert and update records during the load.
- Involved in designing, developing and deploying reports in the MS SQL Server environment using SSRS 2008 and SSIS in Business Intelligence Development Studio (BIDS).
- Used ETL (SSIS) to develop jobs for extracting, cleaning, transforming and loading data into data warehouse.
- Involved in the Migration of Databases from SQL Server 2005 to SQL Server 2008.
- Prepared the complete data mapping for all the migrated jobs using SSIS.
- Designed SSIS packages to transfer data from flat files to SQL Server using Business Intelligence Development Studio.
- Extensively used SSIS transformations such as Lookup, Derived Column, Data Conversion, Aggregate and Conditional Split, and control-flow tasks such as Execute SQL Task, Script Task and Send Mail Task.
- Developed PL/SQL triggers and master tables for automatic creation of primary keys (see the PL/SQL sketch after this list).
- Performed SQL, PL/SQL and application tuning using tools such as EXPLAIN PLAN, SQL*TRACE, TKPROF and AUTOTRACE.
- Developed advanced PL/SQL packages, procedures, triggers, functions, indexes and collections to implement business logic using SQL Navigator. Generated server-side PL/SQL scripts for data manipulation and validation, and materialized views for remote instances.
- Implemented SCD Type 1 and Type 2 mappings to update Slowly Changing Dimension tables.
- Migrated Informatica objects from version 9.1 to 9.5.
- Performed Informatica upgrade testing and troubleshot and resolved issues.
- Created complex mappings in the designer and monitored them. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer and Sequence generator transformations.
- Ran the workflows on a daily and weekly basis using the Tidal scheduling tool.
- Examined workflow log files and assigned tickets to Informatica support based on the error.
- Developed UNIX shell scripts for automation of the ETL process.
- Performed operational support and maintenance of ETL bug fixes and defects.
- Maintained the target database in the production and testing environments.
- Supported migration of ETL code from development to QA and QA to production environments.
- Migrated code between environments and maintained code backups.
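As referenced in the list above, a minimal sketch of the surrogate-key pattern: an Oracle sequence plus a BEFORE INSERT trigger populates the primary key automatically. The object names (claim_master, claim_key, claim_master_seq) are hypothetical.

    -- Sequence that supplies surrogate key values.
    CREATE SEQUENCE claim_master_seq START WITH 1 INCREMENT BY 1 NOCACHE;

    -- Trigger fills the primary key only when the insert does not supply one.
    CREATE OR REPLACE TRIGGER claim_master_pk_trg
    BEFORE INSERT ON claim_master
    FOR EACH ROW
    WHEN (NEW.claim_key IS NULL)
    BEGIN
      SELECT claim_master_seq.NEXTVAL
        INTO :NEW.claim_key
        FROM dual;
    END;
    /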
Environment: Informatica PowerCenter 9.5, PL/SQL, flat files, Facets, XML, SQL Server 2012 R2, SQL Server Integration Services (SSIS), Ab Initio, Microsoft Visual Studio 2010, SVN, Tidal, Microsoft Visio.
Confidential, NY
Sr. ETL Informatica Developer
Responsibilities:
- Interacted with the Business Users to analyze the business requirements and transform the business requirements into the technical requirements.
- Prepared technical specifications for the development of Informatica (ETL) mappings to load data into various target tables, and defined ETL standards.
- Worked as an Informatica developer loading mortgage data into the target SQL Server.
- Defined solutions and features and worked in an Agile environment.
- Created ETL mapping documents for every mapping and a data migration document for smooth transfer of the project from the development environment to testing and then to production.
- Created mappings using different transformations such as Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy and Sequence Generator to extract data from SQL Server, load it into SQL Server and generate flat files.
- Used shared folders for shared mappings, transformations, sources and targets.
- Worked with Agile methodology; involved in installing Informatica, adding users, creating folders and granting permissions to users.
- Developed mapplets and worklets for reusability.
- Created post-session and pre-session shell scripts and mail-notifications.
- Scheduled workflows and shell scripts using Autosys.
- Designed and created complex SCD Type 2 mappings involving transformations such as Expression, Joiner, Aggregator, Lookup, Update Strategy and Filter.
- Designed and implemented the ETL code for address verification and identity checking by working with IDQ.
- Designed IDQ mappings that are used as mapplets in PowerCenter.
- Developed numerous mappings in IDQ using various transformations, including Address Validator, Association, Case Converter, Classifier, Comparison, Consolidation, Match, Merge and Parser.
- Involved in performance tuning of mappings and sessions by identifying bottlenecks and implementing effective transformation logic.
- Involved in creating shell scripts and in production support.
- Involved in performance tuning of Informatica mappings, workflows and SQL queries/procedures (a query-tuning sketch follows this list).
- Worked with the testing team in tracking and resolving defects, fixing bugs and executing ad-hoc requests.
- Involved in unit testing and User Acceptance Testing (UAT) to verify that data extracted from different source systems and loaded into the target was accurate and met user requirements.
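A minimal sketch of the query-tuning workflow referenced in the list above, assuming an Oracle source; the table, column and index names (loan, payment, loan_orig_dt_idx) are illustrative only.

    -- Capture the optimizer plan for a candidate query.
    EXPLAIN PLAN FOR
    SELECT l.loan_id, l.borrower_id, SUM(p.amount) AS total_paid
      FROM loan l
      JOIN payment p ON p.loan_id = l.loan_id
     WHERE l.origination_date >= DATE '2014-01-01'
     GROUP BY l.loan_id, l.borrower_id;

    -- Review the plan: join order, full table scans, estimated row counts.
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    -- One common fix: index the filter and join columns so large source
    -- tables are not scanned in full on every session run.
    CREATE INDEX loan_orig_dt_idx ON loan (origination_date);
    CREATE INDEX payment_loan_id_idx ON payment (loan_id);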
Confidential
Java Developer
Responsibilities:
- Involved in all the phases of SDLC including Requirements Collection, Design & Analysis of the Customer Specifications, Development and Customization of the Application.
- Worked on all phases of application development to implement the assigned use cases successfully.
- Deployed the application on IBM WebSphere Application Server.
- Involved in coding different modules using JSP and Servlets.
- Involved in bug fixing and migrating the application across different environments.
- Deployed application EARs on IBM WebSphere Application Server Network Deployment in the Development, System Test and Performance Test environments on a daily basis, and troubleshot various configuration and application issues.
- Configured WebSphere resources such as JDBC providers, data sources and connection pools, and performed performance tuning.
- Configured admin console security on WebSphere, created users with various roles to access the WebSphere admin console, and added users and groups to LDAP console groups via the admin console.
- Developed Crystal Reports based on user requirements.
- Deployed and configured the reports on the BO server.
Environment: Java, J2EE, JSP, IBM WebSphere Application Server, MyEclipse, JavaScript, Oracle 9i, JDBC, SVN, Crystal Reports 9.