
Sr. ETL/Informatica Developer Resume


CA

PROFESSIONAL SUMMARY:

  • Over 10 years of IT experience in the analysis, design, development, and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, and Client/Server and Web applications on Windows and UNIX platforms, across various verticals in data integration technologies.
  • 10 years of experience in dimensional data modeling using Star and Snowflake schemas; extensive experience designing models with the Ralph Kimball and Bill Inmon methodologies.
  • Thorough understanding of the Software Development Life Cycle (SDLC), including requirements analysis, system analysis, design, development, documentation, implementation, and post-implementation review.
  • Expertise working for Banking and Retail data integration projects.
  • Experience working with Agile, Kanban, and sprint-based delivery methodologies across multiple projects.
  • Expertise in creating ETL jobs with Informatica PowerCenter 8.x/9.x.
  • Expert in tasks associated with data such as data analysis, data validation, data modeling and data cleaning.
  • Extensively used SQL, PL/SQL in writing Stored Procedures, Functions, Packages and Triggers.
  • Expertise in exception-handling strategies that capture errors and referential-integrity violations during load processes and notify the source team of the exception records.
  • Sound experience in performance tuning of ETL processes; reduced execution times over huge data volumes for a billion-dollar client.
  • Extensive experience tuning and scaling DB scripts for better performance by running explain plans and applying approaches such as hints, indexes, and partitioning.
  • Good exposure to working with various scheduling tools like Autosys and Cronacle.
  • Expertise working with Big Data and the Hadoop ecosystem, including MapReduce, HDFS, Hive, Pig, and NoSQL.
  • Extensive experience developing & deploying projects at enterprise level using Informatica’s different products.
  • Extensive integration experience working with different databases and sources such as Oracle, Teradata, MS SQL Server, SAS, and flat files.
  • Expertise in maintaining the versioning using Informatica Repository and deployment using deployment groups.
  • Experience designing for large scale, highly available, fault tolerant data management systems in a dynamic environment.
  • Extensive experience working on data governance projects encompassing Banking Industry.
  • Experience on working with Client, Customer and User interfacing projects.
  • Well versed in onsite-offshore model. Expertise in leading offshore teams from onsite.
  • Team player and self-starter with good communication and interpersonal skills.
  • Organized professional with strong leadership capabilities.
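The exception-handling approach summarized above (capturing errors and referential-integrity failures during loads, then routing the exception records back to the source team) can be sketched as follows. This is an illustrative Python sketch only, not the actual Informatica implementation; all field names and rules are hypothetical.

```python
# Illustrative ETL exception-capture pattern: rows that fail validation or
# referential-integrity checks are routed to an exceptions list for the
# source team instead of aborting the load. Field names ("order_id",
# "customer_id") and rules are hypothetical examples, not a real schema.

def load_with_exceptions(rows, valid_customer_ids):
    loaded, exceptions = [], []
    for row in rows:
        errors = []
        if not row.get("order_id"):
            errors.append("missing order_id")
        if row.get("customer_id") not in valid_customer_ids:
            errors.append("referential-integrity failure: unknown customer_id")
        if errors:
            exceptions.append({"row": row, "errors": errors})
        else:
            loaded.append(row)
    return loaded, exceptions
```

In the real pipeline the exceptions list would land in an error table and feed the notification to the source team.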

TECHNICAL SKILLS:

Data Warehousing: Informatica PowerCenter 9.6/9.1/8.6/8.5/8.1 (Repository Manager, Designer, Workflow Manager), Informatica Developer (IDQ), Metadata Manager, ETL, OLAP, OLTP

Business Intelligence: SAS data miner, SAS Enterprise guide

Programming Languages: SQL, PL/SQL, UNIX, Shell Scripting, Python 2.6

Databases: Oracle 8, 9i, 10g, 11g, Teradata, MySQL 4, SQL Server 2000

Platforms: Windows 9x/NT/2K/XP, UNIX, Linux

Scheduling Tools: Autosys R11, Redwood Cronacle Explorer

PROFESSIONAL EXPERIENCE:

Confidential, CA

Sr. ETL/Informatica Developer

Responsibilities:

  • Involved in Full Life cycle implementation of the project and participated in the preparation of business analysis documentation.
  • Working directly with the client to capture requirements and prepare FSDs.
  • Responsible for delivering the objects as per the project plan.
  • Responsible for maintaining the versions of objects and deployments to all environments.
  • Populated the Staging tables with various Sources like Flat files (Fixed Width and Delimited) and Relational Tables.
  • Writing complex queries to transform the data.
  • Creating stored procedures in Oracle that are called from the back end by AngularJS for the UI initiative.
  • Creating reusable components in Informatica to validate files, capture errors, log errors, etc.
  • Skilled in rapid ETL development.
  • Worked on Hierarchical relationship within table to build mappings per business rule.
  • Worked on data profiling activities for data quality checks for the Payments.
  • Used Informatica Developer (IDQ) to standardize fields per governance standards.
  • Created Informatica mappings to load complex XML targets.
  • Worked on Address data validation transformation for Payment messages using IDQ.
  • Worked on ISO 20022 payment messages (CAMT, PAIN, and PACS), both reading from and writing to them.
  • Analyzed the XSDs to understand the payment-message formats and implemented validations in Informatica per the XSD restrictions and formats.
  • Worked on data masking activities for the payment messages of personal identifiers using IDQ.
  • Worked on the NACHA payments messages in B2B DT (Data Transformation) studio.
  • Creating workflows that are capable of handling near real time integration of data between applications.
  • Created mappings using various transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Rank, Sequence Generator, and Update Strategy.
  • Interacting with other teams and stake holders on daily basis.
  • Handshaking with other processes on which Informatica was dependent and processes which depended on Informatica.
  • Working with business partners and Product managers to define key components of data governance policies and guidelines.
  • Worked on metadata manager identifying the lineages of various fields.
  • Worked with data modelers to introduce Partitions in the PDM to manage data effectively for insertion and purging as part of performance tuning.
  • Designing ETL as per the data model files from Erwin and Embarcadero modeling tools.
  • Worked with DBA in collections of stats and Exchanging table partitions to give users a true snapshot of data.
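The XSD-driven validations mentioned above enforce restrictions such as maximum lengths and patterns on payment-message fields. A rough illustration of the idea, with hypothetical field names and facets (the real restrictions come from the ISO 20022 XSDs):

```python
import re

# Hypothetical field restrictions modeled on XSD facets (maxLength,
# pattern). These are illustrative stand-ins, not the actual ISO 20022
# schema definitions.
RESTRICTIONS = {
    "EndToEndId": {"max_length": 35},                      # e.g. a Max35Text-style field
    "Ccy":        {"pattern": r"[A-Z]{3}"},                # three-letter currency code
    "Amt":        {"pattern": r"\d{1,13}(\.\d{1,5})?"},    # decimal amount
}

def validate_field(name, value):
    """Return True if the value satisfies the field's XSD-style facets."""
    rule = RESTRICTIONS[name]
    if "max_length" in rule and len(value) > rule["max_length"]:
        return False
    if "pattern" in rule and not re.fullmatch(rule["pattern"], value):
        return False
    return True
```

Rows failing these checks would be routed to the exception-capture process rather than written to the payment-message targets.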

Confidential, Roseville

Lead Informatica Developer

Responsibilities:

  • Working directly with Business users to understand the requirements and key components of finance regulations.
  • Articulating and presenting the ETL architecture to IT managers, Architects and Project Sponsor.
  • Defining and Designing the Database for Extraction layer, staging layer, Historical schema and fact & dimension tables using data warehouse principles like snow flake schema, surrogate keys etc.
  • Developing ETL Mappings, sessions, tasks, workflows in Informatica Powercenter 9.6.1
  • Developing exception-handling strategies to track reconciliation errors, and designing a framework and process to fix them and load them into the data warehouse.
  • Developing checks and controls for various flat files consumed through the ETL framework and capturing the errors beforehand thus preventing any bad data getting loaded into DWH.
  • Creating and leveraging Linux scripts to automate file validation and file movement from Windows network shared drives to UNIX-linked folders.
  • Creating reusable components for data quality in IDQ and Informatica PowerCenter.
  • Creating the data quality framework encompassing field-level validations, data type checks, range validations, subject validations, address standardization, and parser capabilities using IDQ.
  • Developing Errors and control framework for the data governance effort.
  • Writing complex database queries to join, aggregate, filter, and partition data in order to analyze and understand it.
  • Responsible for documenting the process workflows and charting out the process for code promotion to the QA, UAT, and Production environments.
  • Worked on various Informatica transformations like Source Qualifier, Expression, Aggregator, Filter, Lookup, Joiner, Sorter, and Update Strategy to apply transformation logic to the data.
  • Defining soft links to files in UNIX in specific project folders and using them as sources.
  • Working on multiple project folders under Informatica repositories.
  • Maintaining Source and Target definitions under shared folders and using shortcuts in Project specific folders.
  • Exporting the XMLs of workflows and mappings for promotion to various environments.
  • Writing DDLs, creating indexes and synonyms, granting privileges on objects to various roles, and handing the SQL statements over to the DBA for production migration.
  • Working on the parametrization of mappings and using variables and parameters to build scalable ETL mappings.
  • Responsible for managing the parameter files and updating them based on the changes to parameters.
  • Working closely with team doing peer-peer code reviews and giving walk through of the best practices and solutions.
  • Maintaining user confidence by delivering critical deliverables on time while handling ad hoc requests simultaneously.
  • Helping users analyze data and reports using Excel spreadsheets or by exporting data with SQL.
  • Working closely with Finance head office and participating in Q&A calls.

Environment: Informatica Powercenter 9.6, Oracle 11g, PL/SQL, SQLDeveloper, MS Excel, Putty, WinSCP, WinMerge, Unix
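The file validation and movement automated by the Linux scripts above followed a simple gate: only well-formed, non-empty files advance from the landing folder to processing. A minimal Python sketch of that gate (paths, extension, and rules are hypothetical; the actual project used shell scripts):

```python
import os
import shutil

# Illustrative file-validation-and-move step: files that are non-empty and
# carry the expected extension are moved from the landing (shared-drive)
# folder to the processing folder; everything else is held back for review.
def validate_and_move(src_dir, dst_dir, ext=".csv"):
    moved, rejected = [], []
    os.makedirs(dst_dir, exist_ok=True)
    for name in sorted(os.listdir(src_dir)):
        path = os.path.join(src_dir, name)
        if name.endswith(ext) and os.path.getsize(path) > 0:
            shutil.move(path, os.path.join(dst_dir, name))
            moved.append(name)
        else:
            rejected.append(name)
    return moved, rejected
```

The rejected list is what feeds the "capturing the errors beforehand" control, preventing bad files from ever reaching the DWH load.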

Confidential, Beaverton

Sr. Informatica Developer

Responsibilities:

  • Working directly with business users to understand various reporting requirements and prepare Technical Design Documents.
  • Created physical & logical data models for the data marts.
  • Created Dashboards for quick analysis of data using Tableau Desktop 8.2.
  • Worked on Geo expansion of Confidential EU and created exception capture process of ETL with regards to different payment methods.
  • Wrote sql scripts and created documentation for Data profiling and data analysis tasks.
  • Creating Autosys definitions for batch and command jobs, file watchers, and table polling for scheduling across various Autosys versions.
  • Extracted data from relational sources, flat files, and Excel workbooks, received periodically as well as on an ad hoc basis, and loaded them into the Data Hub. Utilized various transformations like Lookup, Update Strategy, Router, and Union to transform and map data to the target systems per the specifications.
  • Wrote and executed stored procedures that were called through UNIX scripts to populate certain tables in the Data Hub. It also involved debugging and tuning SQL Queries and procedures, using HINTS and finding an optimum solution using the Explain Plan.
  • Created views and materialized views (MVs) for ad hoc reporting based on the requirements for China's e-commerce implementation.
  • Working on Hadoop hosted on AWS.
  • Created S3 buckets on AWS to store data as files.
  • Used Spark to perform MapReduce-style processing on files to create data pipelines.
  • Used Boto3 to connect to and manage EC2 instances on AWS.
  • Created ETL for coupons and promotions data to handle multiple scenarios for Japan and China; merged two existing ETLs for Japan and China into one.
  • Created a marketing-channel aggregate ETL process to match visits on the Confidential .com site with placed orders, and automated a 14-day (parameterized) look-back join on the Order Item Fact.
  • Performed data analysis, data profiling and prepared documentation as per Confidential standards.
  • Created UNIX scripts and stored procedures to automate handling of data latency through Oracle GoldenGate; the script waits for data to cross 00:00 hours before kicking off the ETL box.
  • Worked on ETL for Confidential .com to feed data for quarter end financial reports as per SOX compliance.
  • Worked with Cross-Functional teams during the design phase and development phase of ETL's.
  • Tuned Informatica Mappings and workflows for optimum performance and set a benchmark for LTTC aggregates by reducing run time by more than 90% for history re-loads.
  • Worked on documentation of the ETL process, covering the various mappings, their order of execution, and the dependencies.
  • Worked on performance tuning of Rank, Lookup, Joiner, Sorter and Aggregator transformations.
  • Partitioned Informatica workflows on region keys to improve the performance of ETL jobs.
  • Involved in unit testing along with progression testing of ETLs.
  • Worked on performance improvements and enhancements to the RPA SO Core ETL: improved the merge by adding a post-SQL procedure with Oracle hints and improved the select by introducing partitions on the table.
  • Worked on RPA Sales ETL EU enhancements and bug fixes.
  • Designed and developed ETL to include the natural key in the RPA Sales, SO Core, and Inventory ETLs, and loaded data into partitioned tables.
  • Coordinated closely with the production support team on any issues in the RPA US and EU loads.

Environment: Informatica Powercenter 9.1, Oracle 11g, PL/SQL, Toad, SQLDeveloper, Tableau Desktop 8.2, SQL Loader, MS Excel, Putty, WinSCP, WinMerge, Unix
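The Spark work above applied the map-reduce pattern to files to build data pipelines. The shape of that computation can be sketched locally in plain Python, here as a toy word count (illustrative only; Spark distributes these same stages across a cluster, and this is not the project's actual pipeline):

```python
from collections import defaultdict

# Toy map-reduce: the map phase emits (key, 1) pairs per word, and the
# reduce phase sums values by key. This local sketch only illustrates the
# two stages Spark runs in a distributed fashion.
def map_phase(lines):
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)
```

In Spark the same idea is typically expressed as `flatMap` followed by `reduceByKey` over files read from S3.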

Confidential, CA

ETL Lead

Responsibilities:

  • Leading Team in development activities in Informatica 9.1.
  • Involved in Full Life cycle implementation of the project and participated in the preparation of business analysis documentation.
  • Working directly with the client to capture requirements and prepare FSDs.
  • Responsible for delivering the objects as per the project plan.
  • Responsible for maintaining the versions of objects and deployments to all environments.
  • Populated the Staging tables with various Sources like Flat files (Fixed Width and Delimited) and Relational Tables.
  • Involved in performance tuning of SQL Queries, Sources, Targets and Informatica Sessions for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
  • Worked on Informatica SDK to create mappings and workflows.
  • Worked on Hierarchical relationship within table to build mappings per business rule.
  • Worked on metadata manager to define key components of bank’s wholesale business.
  • Creating workflows that are capable of handling near real time integration of data between applications.
  • Scheduling workflows to handle bi-directional and uni-directional real time data integration.
  • Created mappings using various transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Rank, Sequence Generator, and Update Strategy.
  • Interacting with other teams and stake holders on daily basis.
  • Handshaking with other processes on which Informatica was dependent and processes which depended on Informatica.
  • Working on creating web services and receiving data from web services through Informatica Web services.
  • Worked with data modelers to introduce Partitions in the PDM to manage data effectively for insertion and purging as part of performance tuning.
  • Designing ETL as per the data model files from Erwin and Embarcadero modeling tools.
  • Worked with DBA in collections of stats and Exchanging table partitions to give users a true snapshot of data.
  • Working with users in defining data lineage in metadata manager.
  • Developed ETL mappings to ensure conformity, compliance with standards and lack of redundancy, translated business rules and functionality requirements into ETL procedures.
  • Involved in the documentation of the ETL process with information about the various mappings, the order of execution for them and the dependencies.

Environment: Informatica Powercenter 9.1 Hotfix 5, Oracle 11g, PL/SQL, Teradata, SQL Server 2008, Toad, SQLDeveloper, SQL Loader, MS Excel, Putty, WinSCP, WinMerge, Unix
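One bullet above mentions building mappings over a hierarchical relationship within a table, i.e. a self-referencing parent-child structure. Conceptually, that means walking parent pointers to resolve each row's full path; a minimal sketch with hypothetical rows and column names (not the bank's actual model):

```python
# Illustrative walk of a self-referencing table (each row points to its
# parent via parent_id) to resolve a node's full hierarchy path. In the
# ETL this kind of traversal is typically done with a self-join or a
# hierarchical query; rows and columns here are hypothetical.
def hierarchy_path(rows, node_id):
    by_id = {r["id"]: r for r in rows}
    path = []
    current = by_id.get(node_id)
    while current is not None:
        path.append(current["name"])
        current = by_id.get(current["parent_id"])
    return "/".join(reversed(path))
```

The same traversal is what a business rule like "roll each account up to its top-level line of business" relies on.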

Confidential

Data Solutions Developer

Responsibilities:

  • Involved in requirement gathering, preparing the TSD, and responsible for developing ETL batches for data marts.
  • Phase 1 of the project involved Informatica handshaking with database sources on MS SQL Server, extracting data from them, and loading it into a data warehouse.
  • Phase 2 of the project involved generating statistical reports in SAS that could be read by the application software used by the analytics team.
  • Involved in extracting the data from the data warehouse and creating 3 different data marts using complex ETLs.
  • Understanding the Stored Procedures to define certain conditions and logic for Phase1.
  • Designed the mapping document.
  • Involved in creating the best practices document for the COE ETL team.
  • Developed the mappings for Phase 1.
  • Setting up of the local Informatica environment on the client machines which included the connectivity and access to the data sources, taking the necessary steps to set up the Relational Connectivity variables in the Workflow manager etc.
  • Used transformations like Source Qualifier, Joiner, Expression, Router, Lookup, Sequence Generator, Normalizer, and Update Strategy.
  • Generated sequence numbers using Informatica logic without using the sequence generator.
  • Worked with post-session commands, wherein I used VBScript to perform certain tasks.
  • Used SQL override to perform certain tasks essential for the business.
  • Worked with parameter files and the pmcmd command to achieve swim lanes within a single QA and Dev environment.
  • Worked with flat files, XML files, and SQL Server tables as targets.
  • Used mapping variables to achieve certain goals like creating file names dynamically.
  • Designed the workflow execution process and the way the tasks needed to be executed keeping in mind Constraint Based Loading etc.
  • Unit testing of mappings and testing of various conditions.

Environment: Informatica PowerCenter 8.6, MS SQL SERVER 2005, Oracle 10g, ERWIN 4.1, SAS, Windows XP Professional, Red hat linux
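The swim-lane setup above hinges on PowerCenter parameter files: the same workflow, started via pmcmd with a different `-paramfile`, reads and writes lane-specific locations. A small sketch of generating such a file (section-header and `$$` conventions follow the PowerCenter parameter-file format; the folder, workflow, and parameter names are hypothetical):

```python
# Sketch of generating a PowerCenter parameter file. Sections are keyed as
# [Folder.WF:workflow] and workflow/mapping parameters use the $$ prefix,
# per the PowerCenter convention; the names below are hypothetical.
def build_param_file(folder, workflow, params):
    lines = ["[%s.WF:%s]" % (folder, workflow)]
    for name, value in sorted(params.items()):
        lines.append("$$%s=%s" % (name, value))
    return "\n".join(lines) + "\n"
```

The generated file would then be passed to the workflow start command, e.g. via pmcmd's `-paramfile` option, so each lane runs with its own values.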

Confidential

ETL Consultant

Responsibilities:

  • Worked directly with the business to capture requirements and prepare FSDs.
  • Designed architecture for data integration with other systems and applications.
  • Designed data model for the project.
  • Designed the Exception handling strategies to capture data issues and their resolution.
  • Performed Data Analyst activities like data profiling, data reconciliation and producing reports.
  • Worked on FSDs, TSDs, and SIAs.
  • Interacted with other teams and stake holders on daily basis.
  • Responsible for deliverables as per the project plan.
  • Responsible for designing mapping documents containing granular information related to different ETL processes.
  • Assisted QA team in creation of test plans.
  • Involved in performance tuning of SQL Queries, Sources, Targets and Informatica Sessions for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
  • Responsible for Deployment and validation of Informatica Objects and Database objects.
  • Working on POCs and providing ETL design solutions for data migration and delivery.
  • Created and scheduled workflow and tasks to schedule the loads at required frequency using Dollar U.
  • Worked on Hierarchical relationship within table to build mappings per business rule.
  • Worked extensively on almost all transformations, such as Aggregator, Joiner, Source Qualifier, Connected and Unconnected Lookups, Filter, Router, Expression, Union, and Sequence Generator.
  • Worked with Memory cache for static and dynamic cache for the better throughput of sessions containing Rank, Lookup, Joiner, Sorter and Aggregator transformations.
  • Created Mapplets to be re-used in the Informatica mappings.
  • Designed Mappings in order to make integration near real time within various systems.
  • Used Aggregator transformations along with Sorter transformations, increasing mapping-level performance.
  • Responsible for deployment of code to various environments.
  • Responsible for KT to Prod Support team.

Environment: Informatica 8.6, Unix, Oracle 11g, PL/SQL, SQL Server 2008, T-SQL, Toad, SQLDeveloper, SQL Loader, MS Excel, WinSCP, WinMerge, Windows XP
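Placing a Sorter ahead of an Aggregator, as in the bullets above, pays off because sorted input lets the aggregation stream group-by-group instead of caching every group at once. The idea in plain Python (illustrative data, not project data):

```python
from itertools import groupby
from operator import itemgetter

# With input pre-sorted on the group key, each group can be aggregated and
# emitted as soon as the key changes, so only one group is held in memory
# at a time -- the same reason an Informatica Aggregator configured for
# sorted input outperforms one that must cache all groups.
def sum_by_key_sorted(rows):
    result = {}
    for key, group in groupby(rows, key=itemgetter(0)):
        result[key] = sum(amount for _, amount in group)
    return result
```

An unsorted aggregator must instead keep a running bucket for every key it has ever seen, which is what drives up its cache size on large inputs.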

Confidential

Informatica Developer

Responsibilities:

  • Business Analysis and Requirements gathering.
  • Develop Logical and Physical data models that capture current state/future state data elements and data flows using Erwin.
  • Reviewed source systems and proposed data acquisition strategy.
  • Designed, developed Informatica mappings, enabling the extract, transport and loading of the data into target tables.
  • Worked extensively on almost all transformations, such as Aggregator, Joiner, cached and uncached lookups, connected and unconnected lookups, Filter, Router, Expression, and Sequence Generator.
  • Worked with Memory cache for static and dynamic cache for the better throughput of sessions containing Rank, Lookup, Joiner, Sorter and Aggregator transformations.
  • Created Mapplets to be re-used in the Informatica mappings.
  • Used Aggregator transformations along with Sorter transformations, increasing mapping-level performance.
  • Data cleansing and profiling using Trillium. Used the GeoCoder, Matcher and Parser for various cleansing needs.
  • Created Mapplets and reused them whenever a similar business requirement arose, making the necessary changes, if any, to the mapping.
  • Developed ETL procedures to ensure conformity, compliance with standards and lack of redundancy, translated business rules and functionality requirements into ETL procedures.
  • Created and scheduled worklets, setup workflow and tasks to schedule the loads at required frequency using workflow manager.
  • Used Server Manager to create and maintain sessions, and also to monitor, edit, schedule, copy, abort, and delete them.
  • During implementation phase, tuned Informatica Mappings for optimum performance.
  • Wrote UNIX scripts to invoke Informatica workflows and sessions.
  • Monitored the session status in the workflow Monitor in the Gantt chart and Task View.
  • Involved in the error processing methods at Source level, table level and Target level.
  • Scheduled Informatica Workflows and sessions using Autosys.
  • Monitored transformation processes using Informatica Workflow monitor.
  • Worked extensively on Oracle database 9i and flat files as data sources.
  • Designed and developed table structures, stored procedures, and functions to implement the required logic.
  • Tested data and data integrity across various sources and targets; worked with the production support team on various performance-related issues.

Environment: Informatica Power Center 8.1, Oracle 9i, PL/SQL, SQL *Loader, MS SQL Server 2000, Mainframes, Windows 2000, UNIX, Sun Solaris 5.
