Lead Informatica / ETL Developer Resume
Hartford, CT
SUMMARY:
- Around 9 years of experience in diversified fields of application software, with considerable "in the trenches" experience in data warehousing and Business Intelligence/data processing technologies and tools such as the ETL tool Informatica Power Center 9.1/8.x, SQL Server 2007/2008 and Oracle 11g/10g/9i, in the Insurance & Telecom domains.
- Sound knowledge of Property and Casualty insurance domains and extensive experience as a Lead ETL Developer and Designer, primarily with the Informatica toolset
- Insurance certification in Property and Casualty - AINS 21
- Extensive knowledge of insurance domain processes - quotation, submission, policy, claim processing, risk assessment, renewal, risk mitigation and the business entities of an insurance company.
- Brainbench certified in PL/SQL and RDBMS concepts.
- Capable of looking across the entire business model to develop and promote a strategic vision of the data environment as well as deep-dive into detailed data analysis and data mapping / data profiling exercises
- Involved in Design and development of all data warehousing components e.g. source system data analysis, ETL strategy, data staging, data migration strategy, movement and aggregation plans, information and analytics delivery and data quality strategy
- Experience in identifying and evaluating the best sources of data for both internal and external loads and data quality analysis.
- Proficient in defining ETL design to extract, transform and load source data into data warehouses and data marts
- Strong experience in implementing ETL requirements like SCDs and CDC using Informatica Power Exchange 9.x/8.x
- Extensive experience working with various databases such as SQL Server 2007/2008 and Oracle on UNIX and Windows platforms, and exposure to data modeling tools like Erwin
- Exposure to the Microsoft SSIS ETL tool
- Experience in configuring loads & Business Glossary for Metadata Management.
- Knowledge of Informatica Data Quality.
- Expertise in database maintenance tasks - index optimization, updating statistics, table compression and parallel backup/restore techniques on SQL Server.
- Experience with creating UNIX Korn shell scripts for Informatica pre- and post-session operations, batch scripting, wrapper scripts and day-to-day activities like monitoring network connections, database ping utilities, scheduler monitoring, file archiving and cleanup, file splitting, FTPing to various locations, list file generation scripts, log rotation and metrics generation scripts etc.
- Shell types worked with: Korn, Bash. Script types used: startup scripts (.profile, .kshrc), promotion scripts, file handling scripts, archiving scripts. Used env for automatic environment tailoring (PATH etc.); manipulated input and output redirection (pipes); aliases (alias, unalias); command history and editing; variables - assigning values to variables (set, unset and special variables), built-in variables known to the login shell, the shell environment and predefined variables, exporting variables (input/output); worked with files and file attributes; worked with directories; basic I/O features; string and field processing; streams and pipes (exec); terminal information (COLUMNS, LINES, TERM) etc. A representative housekeeping sketch follows this summary.
- Sound knowledge of data modeling, especially multi-dimensional modeling; involved in creation of galaxy schemas, snowflake dimensional data marts, OLAP cubes, slowly changing dimension implementations (Types I, II, III), dimensional surrogate key handling and CDC/incremental loading implementations
- Experienced in handling challenges with high-volume datasets from various sources like Oracle, flat files and SQL Server.
- Expertise in identifying bottlenecks and tuning the DW environment: Oracle SQL query tuning (collecting statistics, analyzing explain plans and determining which tables need statistics), database tuning, Informatica mapping tuning and workflow tuning (targets, sources, mappings and sessions).
- Experience in Informatica mapping specification documentation and in using Autosys to create and schedule workflows
- Experience with loading data into Data Warehouses/Data Marts using Informatica, SQL*Loader, Export/Import utilities and FTP.
- Experience fetching and handling complex data from iSeries/AS400 systems and XML data with both continuous and near-real-time frameworks.
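A minimal Korn shell housekeeping sketch of the kind described above (directories, host, credentials and the 30-day retention period are illustrative placeholders, not values from any engagement):

    #!/bin/ksh
    # Illustrative post-session housekeeping sketch: archive processed files,
    # purge old archives, rebuild the list file and push it downstream.
    IN_DIR=/data/informatica/SrcFiles        # where processed source files sit
    ARCH_DIR=/data/informatica/Archive
    STAMP=$(date +%Y%m%d%H%M%S)

    # Archive the files the session just processed, tagging each with a timestamp
    for f in $IN_DIR/*.dat; do
        [ -f "$f" ] && mv "$f" "$ARCH_DIR/$(basename $f).$STAMP"
    done

    # Purge archived copies older than 30 days
    find $ARCH_DIR -type f -mtime +30 -exec rm -f {} \;

    # Rebuild the indirect list file the next run will read (empty if nothing new yet)
    ls $IN_DIR/*.dat > $IN_DIR/daily_load.lst 2>/dev/null || : > $IN_DIR/daily_load.lst

    # Push the fresh list to a downstream host; credentials would normally come
    # from a secured parameter file, shown inline here purely for illustration
    {
        print "user ftpuser xxxxxx"
        print "put $IN_DIR/daily_load.lst daily_load.lst"
        print "bye"
    } | ftp -n ftp.example.com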
TECHNICAL SKILLS:
ETL Tools: Informatica Power Center 9.x/8.x, SSIS.
Databases: SQL Server 2007/2008, Oracle 9i.
Database Modeling: Erwin 3.5.2/4.0.
Software Tools: TOAD, Putty, Informatica Meta Data Manager, IDQ, SQL Developer, JAWS.
Programming Languages: PL/SQL, Shell Scripting, C, C++, Core Java.
Operating Systems: Sun Solaris (ULTRA 10, 3500, 8800), Windows 7/XP/NT/95/98/2000/ME, DOS.
Network Protocols: FTP, SFTP.
PROFESSIONAL EXPERIENCE:
Confidential, Hartford, CT
Lead Informatica / ETL Developer
Responsibilities:
- Attended functional meetings to gather requirements and provide technical opinions on feasibility for Inward Bridge.
- Involved in the project cycle plan for the data migrations: analyzing the source data and deciding the data extraction, transformation and loading strategy (Informatica 9.1.0) and dimensional modeling.
- Participated in design reviews and provided input to the design recommendations; provided input to information/data flow.
- Conducted and led design sessions for implementation of various ETL requirements such as SCD types, data cleansing and staging methodologies.
- Extensively used Informatica client tools - Source Analyzer, Mapping Designer, Mapplet Designer, Transformation Developer, Repository Manager and Workflow Manager - to develop mappings, workflows and complete production systems.
- Designed Informatica workflows and mappings for optimal performance in a large-scale Windows environment and tuned Informatica workflows for better performance.
- Developed an understanding of the data warehousing environment, implementing dimensional surrogate key handling, exception data processing, conformed dimension/fact creation, handling of late-arriving dimensions/facts, etc.
- Responsible for tuning ETL mappings, workflows and the underlying data model to optimize load and query performance.
- Used the Debugger to identify and fix bugs in existing mappings by analyzing data flow and evaluating transformations; created mapplets, generic transformations and worklets that provide reusability in mappings and workflows
- Prepared deployment groups for Promotion/migration of code to QA and Production environments.
- Created and managed triggers, stored procedures, views and SQL transactions.
- Implemented indexing and data partitioning strategies that maximize query performance based on how users access the data.
- Understood the dimensional model for building Inward Bridge.
- Interpreted logical and physical data models for Business users to determine common data definitions and establish referential integrity of the system.
- As part of the data migration Strategy, developed ETL procedures to ensure conformity, compliance with standards and lack of redundancy, translating business rules and functionality requirements into ETL procedures.
- Maintained and improved integrity by identifying and creating relationships between tables.
- Created and monitored Autosys jobs and set up batches to schedule the loads at the required frequency; a representative wrapper-script sketch of the kind invoked by the scheduler follows this list.
- Identified data quality issues and their root causes; implemented fixes and designed data audits to capture the issues in the future.
- As an onsite coordinator, managed a project team comprising 4 resources: 3 offshore and 1 onsite.
- Prepared technical design guides for the offshore team; scheduled daily meetings with offshore to monitor work progress, resolve issues and provide input on solution implementation
- Knowledge of Property & Casualty insurance concepts, e.g. policy & claim life cycle, earned & unearned premium, reserves, lines of business, specialty insurance, risk assessment, TPAs, claim handlers, solvency, etc.
- Knowledge of basic reinsurance principles, e.g. contract layers, policy limits, attachment guidelines, different types of reinsurance contracts, payments & receivables, policy coverage, direct & assumed business, etc.
- Performed a POC to implement concurrent workflow execution for critical enhancements and prepared presentations for the client
- Implemented session and workflow variables to eliminate a linked server.
- Guided junior developers on ETL best practices for various ETL challenges associated with extracting heterogeneous data, data lineage issues, NULL data issues, etc. at the ETL layer.
- Understood the structure and meaning of source data through data profiling and data lineage walkthroughs.
- Defined transformations for data coming into the data warehouse to match the business application of the data.
- Designed and developed production level data loads into the data warehouses that are optimized for fast performance and minimal maintenance.
- Performed impact analysis on new requirements and responsible for workload distributions
- Documented procedures and knowledge learnt during project phase and created work products including Project Initiation Document, Business Alignment Document, Project Control Plan, and a Requirements Traceability Matrix.
- Encouraged the team to develop thorough documentation around the developed modules, prepared lessons learned for each package completed.
- Responsible for developing a methodology using Informatica to transfer data and setup an environment at offshore to reduce dependency on onsite availability.
- Introduced peer review to ensure quality code delivery and to reduce code review required at onsite.
- Ramped up domain knowledge in various AML areas and served as a knowledge pool for the offshore team.
- Performed comprehensive code reviews and converted specifications about business problems into programming instructions to ensure compliance with standards
- Provided accurate high level and detailed effort estimates and scoped programming efforts
- Created release documentation and worked as part of a dynamic team throughout the software development life cycle
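A minimal sketch of the kind of Korn shell wrapper an Autosys job could invoke to run a workflow through pmcmd; the service, domain, folder and workflow names are placeholders, not actual project values:

    #!/bin/ksh
    # Illustrative Autosys-invoked wrapper: start an Informatica workflow and
    # pass its success/failure back to the scheduler via the exit code.
    INFA_USER=infa_batch
    INFA_PWD_FILE=$HOME/.infa_pwd          # password kept outside the script
    WF_NAME=${1:-wf_DAILY_LOAD}            # placeholder default workflow name

    pmcmd startworkflow -sv INT_SVC_DEV -d Domain_Dev \
          -u $INFA_USER -p "$(cat $INFA_PWD_FILE)" \
          -f FOLDER_INWARD_BRIDGE -wait $WF_NAME
    rc=$?

    if [ $rc -ne 0 ]; then
        print "$(date '+%Y-%m-%d %H:%M:%S') $WF_NAME failed with return code $rc" >&2
    fi
    exit $rc    # Autosys marks the job SUCCESS/FAILURE from this exit code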
Environment: Informatica Power Center 9.1.0, Erwin, Confidential, TOAD IDE, Windows NT.
Confidential, Hartford, CT
Senior Informatica Developer
Responsibilities:
- Studied data sources, targets, required transformations, required data cleansing, required data validations and the performance requirements needed to meet SLAs.
- Led design sessions and wrote specification and functional design documents for enhancements and customizations
- Worked with engineers from disparate source systems for data acquisition, transformation and Source data quality.
- Designed dimension and fact tables for Star Schema to develop the Data warehouse
- Worked with Informatica Support to understand various bottlenecks in existing Mappings/mapplets.
- Incorporated tuning suggestions provided by Informatica Support to Mappings and developed test strategy to validate end results after performance tuning.
- Worked on databases such as SQL Server and developed ETL Mappings using these databases.
- Integrated data from the main iSeries system after cleansing/staging and extracted from it the data required to meet data warehouse requirements
- Created Transformations like Lookup, Joiner, Rank and Source Qualifier Transformations in the Informatica Designer.
- Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup and Router transformations to populate target tables in an efficient manner.
- Used dynamic Lookup and Update Strategy transformations in mappings to update the City DBs
- Created reusable Mapplets and transformations and used them in different Mappings.
- Worked extensively on Power Center client tools like Source Analyzer, Warehouse designer, Transformation developer and Mapping Designer, Workflow Designer.
- Extensively used PL/SQL procedures and packages to implement complex business rules not feasible with Informatica alone.
- Created and Configured Workflows, Worklets and Sessions to transport the data to target warehouse tables using Workflow Manager
- Improved performance using session partitioning.
- Optimized the performance of the mappings by various tests on sources, targets and transformations
- Designed and developed pre-session, post-session and batch execution routines
- Developed solutions to performance issues within Informatica mappings, mapplets and reports
- Debugged and troubleshot issues in some of the mappings
- Actively involved in production support. Implemented fixes/solutions to issues/tickets raised by user community.
- Maintained the Build process for setting up application in all the environments for different versions.
- Participated in Defect Summary meetings to understand the defects and update their status with the respective testers.
- Followed the defined process, which helped in maintaining high-quality deliverables.
Environment: Informatica Power Center 9.0.1, SQL Server 2008, SSIS, Autosys, Windows NT.
Confidential, Hartford, CT
Informatica Onsite Lead
Responsibilities:
- Managed a team comprising 6 resources - 2 onsite and 4 offshore.
- Involved in the requirements definition and analysis to enhance existing application
- Performed impact analysis and developed time and resource estimates for the provided solutions
- Participated in the analysis of development environment of extraction process and development architecture of ETL process
- Assigned enhancements to resources; tracked the progress, Setup team meetings to resolve any bottlenecks.
- Involved in troubleshooting critical production issues and assigning them to offshore to develop code fixes.
- Oversaw development and production support activities; set up meetings for impact analysis of critical performance issues persisting in the application.
- Provided status reporting of team activities against the program plan or schedule
- Kept the project manager and product committee informed of task accomplishment, issues and status
- Served as a focal point to communicate and resolve interface and integration issues with other teams and escalated issues that could not be resolved by the team
- Provided guidance to the team based on management direction
- As an Informatica specialist, responsible for identifying bottlenecks and providing guidance in optimizing existing mappings
- Developed batch schedules for Informatica mappings to minimize failures and focus on reducing the batch run times, scheduled using Autosys
- Designed Schedules for daily and weekly batches using Autosys
- Prepared ETL mapping specification document
- Conducted KT sessions for the offshore team to familiarize them with the business rules of the applications and the issues faced during UAT, for ease of future enhancements
- Defined a data migration strategy covering data sources, targets, required transformations, required data cleansing, required data validations and the performance needed to meet SLAs.
- Involved in the requirements definition and analysis in support of Data Warehousing efforts
- Following the defined data migration strategy, sources and targets were identified and well defined
- Extensively used Informatica to load data from source systems like Flat Files and Excel Files into staging tables and load the data into the target database
- Designed the FTP process to post files to the PeopleSoft system; a representative posting-script sketch follows this list
- Designed archive process to improve the performance
- Created shortcuts for sources and targets and created transformations according to business logic
- As part of data migration process, data cleansing and data validations were performed through transformations to reduce errors in targets and to increase data conformity and data consistency.
- Developed mappings between source systems (MS SQL Server, flat files) and targets using Mapping Designer, and worked with Aggregator, Lookup (connected and unconnected), Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations (almost all transformation types) to migrate transformed data from source to target
- Developed mapplets, configured sessions and workflows following an efficient design pattern
- Prepared ETL mapping Documents for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment.
- Prepared run books providing guidelines to troubleshoot errors occurring at run time and instructions on how to restart the loads.
- Developed mapplets to convert transactional data into business units, validate business rules and perform FX rate conversion.
- Used an unconnected Lookup to populate the datetime-entered field in the target tables
- Used Autosys job scheduler to run the sessions as per the requirement
- Developed stored procedures and functions and was involved in performance improvement
- Actively involved in production support. Implemented fixes/solutions to issues/tickets raised by user community.
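A minimal Korn shell sketch of the file-posting approach referenced above, sending an empty trigger file last so the consumer only picks up complete extracts; the host, directories and credentials are illustrative placeholders:

    #!/bin/ksh
    # Illustrative sketch of posting an extract to a downstream system.
    EXTRACT=/data/outbound/ps_gl_extract_$(date +%Y%m%d).dat
    TRIGGER=${EXTRACT%.dat}.trg            # empty trigger file signals "file complete"

    [ -s "$EXTRACT" ] || { print "extract missing or empty: $EXTRACT" >&2; exit 1; }
    touch "$TRIGGER"

    {
        print "user psftp_user xxxxxx"
        print "cd /inbound/gl"
        print "put $EXTRACT $(basename $EXTRACT)"
        print "put $TRIGGER $(basename $TRIGGER)"   # sent last, after the data file
        print "bye"
    } | ftp -n ps.example.com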
Environment: Informatica Power Center 8.6, SQL Server 2007, windows batch scripts, Autosys.
Confidential
Informatica Developer
Responsibilities:
- Managed the Offshore model and was responsible for interacting with the offshore team
- Responsible for initiating a process for batch support and event management using Workload Automation tools
- Played a Key role in the release management and post-mortem release documentation preparation and analysis
- Involved in gathering requirements, Business Analysis
- Involved in design and development of complex ETL mappings in an optimized manner and created low level mapping documents from high level design.
- Created mappings, sessions and workflows and deployed workflows to higher environments using the Informatica Import/Export utility.
- Developed Pre and Post SQL scripts, PL/SQL stored procedures and functions.
- Used Debugger to check the errors in mapping.
- Generated UNIX shell scripts for automating daily load processes.
- Created test cases and executed test scripts as part of Unit Testing.
- Used Normalizer transformation to generate reports with historical data
- Created reusable transformations/mapplets to import in the common mappings to avoid complexity in the mappings.
- Created sessions, batches and session partitions to reduce the time taken to load data from the existing sources into the database.
- Fine-tuned Informatica mappings for performance optimization
- Extracted data from flat files using FTP Scripts
- Generated reports comparing the previous unclean data with the current integrated data for the clients' reference
- Involved in quality assurance of data, automation of processes.
- Documented the entire process; the documents included the mapping document, unit testing document and system testing document, among others.
- Effectively handled change requests by developing, implementing and testing of solutions.
- Reviewed mappings and workflows developed by peers and reported defects.
- Managed Change control implementation and coordinating daily, monthly releases.
Environment: Informatica Power Center 8.6, Unix & Oracle 9i
Confidential
Informatica & Concord Developer
Responsibilities:
- Involved in requirement discussions and in preparing the Technical Requirements Document.
- Involved in the complete life cycle of the ETL process for data movement from the Concord system to staging and finally to the database.
- Extensively used Power Center/Power Mart to design multiple mappings with embedded business logic.
- Closely coordinated with the Lead on the architecture of the data model.
- Analyzed critical/complex processes of application packages to design ETL processes and adopted strategy to prevent failures and acted swiftly to recover from application failures.
- Implemented shell scripts to use the FTP and SQL*Loader utilities; a representative loader-script sketch follows this list.
- Used transformations like Sequence Generator, Lookup, Joiner and Source Qualifier in Informatica Designer to effectively utilize Informatica services.
- Supported the Informatica developers on database server performance: tracking user activity, troubleshooting errors, tracking server resources and activities, and tracing server events.
- Implemented industry best practices, e.g. mapplets
- Involved in performance tuning of the Informatica mapping using various components like Parameter files, Variables and Dynamic Cache.
- Documented the ETL design specifications and mappings, maintained version control, and was involved in migration of Informatica mappings from the development to the production environment.
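A minimal Korn shell sketch of a SQL*Loader invocation of the kind referenced above; the connection string, control file and directories are illustrative placeholders:

    #!/bin/ksh
    # Illustrative wrapper around sqlldr: load one data file and report the result.
    DATA_FILE=$1
    CTL_FILE=/app/loader/ctl/stg_policy.ctl
    LOG_DIR=/app/loader/log
    BASE=$(basename "$DATA_FILE" .dat)

    sqlldr userid=stg_user/xxxxxx@ORCLDEV \
           control=$CTL_FILE data="$DATA_FILE" \
           log=$LOG_DIR/$BASE.log bad=$LOG_DIR/$BASE.bad \
           errors=100
    rc=$?

    # sqlldr returns 0 on success, 2 when some rows were rejected, 1 or 3 on failure
    if [ $rc -ne 0 ]; then
        print "SQL*Loader finished with return code $rc for $DATA_FILE" >&2
    fi
    exit $rc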
Environment: Informatica Power Center 8.6, Unix & Oracle 9i