Sr. Informatica / ETL Developer/Lead Resume
Hartford, CT
Professional Summary:
- 9 years of experience in diversified fields of application software, with considerable "in the trenches" experience in data warehousing and business intelligence/data processing technologies and tools such as the ETL tool Informatica PowerCenter 9.1/8.x, SQL Server 2007/2008, and Oracle 11g/10g/9i, in the Insurance and Telecom domains.
- Sound knowledge of the Property and Casualty insurance domain and extensive experience as a lead ETL developer and designer, primarily with the Informatica toolset.
- Insurance certification in Property and Casualty (AINS 21).
- Extensive knowledge of insurance domain processes: quotation, submission, policy, claim processing, risk assessment, renewal, risk mitigation, and the business entities of an insurance company.
- Brainbench certified in PL/SQL and RDBMS concepts.
- Capable of looking across the entire business model to develop and promote a strategic vision of the data environment, as well as deep-diving into detailed data analysis, data mapping, and data profiling exercises.
- Design and development of all data warehousing components, e.g., source system data analysis, ETL strategy, data staging, and data migration strategy.
- Proficient in defining ETL designs to extract, transform, and load source data into data warehouses and data marts.
- Extensive experience working with various databases such as SQL Server 2007/2008 and Oracle on UNIX and Windows platforms, with exposure to data modeling tools like Erwin.
- Production support for Microsoft SSIS ETL packages.
- Configuring loads and the Business Glossary for metadata management.
- Informatica Data Quality (IDQ) principles: profiling, data matching, standardization, address validation, and matching algorithms.
- Database maintenance tasks: index optimization, updating statistics, table compression, and parallel backup/restore techniques on SQL Server.
- Experience creating UNIX Korn shell scripts for Informatica pre- and post-session operations, batch scripting, wrapper scripts, and day-to-day activities such as monitoring network connections, database ping utilities, scheduler monitoring, file archiving and cleanup, file splitting, FTPing files to various locations, list-file generation, log rotation, and metrics generation (a sketch of a typical archival/FTP wrapper appears after this list).
- Shell types worked with: Korn, Bash. Script types used: startup scripts (.profile, .kshrc), promotion scripts, file handling scripts, and archiving scripts. Used ENV for automatic environment tailoring (PATH, etc.); input/output redirection and pipes; aliases (alias, unalias); command history and editing; variables, including assigning values (set, unset, special variables), built-in variables known to the login shell, the shell environment and predefined variables, and exporting variables; files and file attributes; directories; basic I/O features; string and field processing; streams and pipes (exec); and terminal information (COLUMNS, LINES, TERM).
- Data modeling experience in multi-dimensional modeling, including creation of star schemas and marts, slowly changing dimension implementations (Types I, II, and III), dimensional surrogate key handling, and incremental load implementations.
- Experienced in handling challenges with high-volume datasets from various sources such as Oracle, flat files, and SQL Server.
- Expertise in identifying bottlenecks and tuning the DW environment: SQL query tuning (collecting statistics, analyzing explain plans, and determining which tables need statistics), database tuning, Informatica mapping tuning, and workflow tuning (targets, sources, mappings, and sessions).
- Experience in Informatica mapping specification documentation and in using Autosys to create and schedule workflows.
- Experience loading data into data warehouses/data marts using Informatica, SQL*Loader, export/import utilities, and FTP.
- Experience fetching and handling complex data from iSeries (AS/400) systems.
- Self-starter, quick learner and comfortable in high intensity and challenging work environment.
- Seeking knowledge of Informatica Cloud, Salesforce.com and ERP solution integration, web services, and PowerExchange.
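A minimal sketch of the kind of archival/FTP wrapper script described in the scripting bullets above. All paths, hostnames, and credentials (SRC_DIR, FTP_HOST, etc.) are hypothetical placeholders, not details from an actual engagement:

```ksh
#!/bin/ksh
# Hypothetical post-load housekeeping wrapper: archive processed files,
# purge old archives, and FTP the latest extract to a downstream host.

SRC_DIR=/data/etl/outbound        # placeholder paths
ARCH_DIR=/data/etl/archive
RETENTION_DAYS=30
FTP_HOST=downstream.example.com   # placeholder host and credentials
FTP_USER=etluser
FTP_PASS=secret

# Compress processed files and move them into the archive area
for f in "$SRC_DIR"/*.dat; do
    [ -f "$f" ] || continue
    gzip "$f" && mv "$f.gz" "$ARCH_DIR"/
done

# Purge archives older than the retention window
find "$ARCH_DIR" -name '*.gz' -mtime +$RETENTION_DAYS -exec rm -f {} \;

# Push the newest archive (if any) to the downstream system
LATEST=$(ls -t "$ARCH_DIR"/*.gz 2>/dev/null | head -1)
if [ -n "$LATEST" ]; then
    ftp -n "$FTP_HOST" <<EOF
user $FTP_USER $FTP_PASS
binary
lcd $ARCH_DIR
cd /inbound
put $(basename "$LATEST")
bye
EOF
fi
```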
ETL Tools: Informatica Power Center 9.x/8.x, SSIS.
Databases: SQL Server 2007/2008, Oracle 9i.
Database Modeling: Erwin3.5.2/4.0.
Software Tools: TOAD, Putty, Informatica Meta Data Manager, IDQ, SQL Developer, JAWS.
Programming Languages: PL/SQL, Shell Scripting, C, C++.
Operating Systems: Sun Solaris (ULTRA 10, 3500, 8800), Windows 7/XP/ NT/95/98/2000/ME, DOS.
Network Protocols: FTP, SFTP.
Professional Experience:
Confidential, Hartford, CT
Sr. Informatica / ETL Developer/Lead
Confidential is a specialty insurance provider. Confidential uses multiple applications to manage day-to-day operations, such as an AS400 system, database systems, and flat files for policy and claim processing, plus SunGard ProCede to handle the reinsurance business. The XL Data Warehouse is an integrated platform for Confidential to generate consolidated reports: XL DW consumes data from multiple sources (e.g., AS400, flat files, and relational sources) and generates consolidated, cleansed, conformed data. IDW is the single data warehouse platform for Confidential; its data is used by multiple business users to perform business analysis and to generate reports for insurance governing bodies. As a large data warehousing platform supporting multiple sources and multiple business users, it must handle large volumes of data while XL Global Insurance expands its business globally. The ProCede Inward Bridge is a bridge between the sources and SunGard ProCede that converts source-specific formats to the ProCede-specific format and implements business rules.
Responsibilities
- Gathering functional requirements.
- Mapping and technical spec preparation.
- Data warehousing technique implementation: SCD, star schema, CDC, B&C, and building data marts.
- Informatica code development: using Mapping Designer, Mapplet Designer, Transformation Developer, and Workflow Manager to develop mappings and workflows.
- Informatica code tuning: static and dynamic caches, partitioning, session performance improvement techniques, and concurrent workflows.
- Developed a proof of concept for implementing web services to replace the Autosys file watcher.
- Mapplet and reusable transformation implementation for ARTS business rule validation, exchange rate population, inward risk, inward claim detail lookup, global key lookup, and regular expressions for source system key format conversion.
- Union transformation implementation to merge business data from heterogeneous sources (flat files and the AS400 system).
- Stored procedure implementation for audit key generation and ad hoc processing (a sketch follows this list).
- Target update override implementation for non-primary-key attribute updates.
- Technical design for unearned premium calculation, onset/offset generation, -1 dimension key fixing, and month-end balance validation.
- FTP scripts for posting files to the PeopleSoft ledger.
- Implementation of SCD Type 2 using Lookup, Router, and Update Strategy transformations as part of the star schema implementation for the policy and claims data marts (a SQL sketch of the equivalent logic follows this list).
- Implementation of workflow tasks: a Decision task to trigger dependent tasks based on the month-close indicator, a Control task to fail the workflow in the event of an out-of-balance condition, Event Wait for the file watcher implementation, link conditions to trigger tasks based on session status, invocation of the file archival mechanism, FTP processing, and workflow variable assignment in post-session tasks.
- Archival mechanism to move history records to a history database.
- Building the business glossary in Informatica Metadata Manager (IMR) for the insurance data warehouse.
- Data lineage analysis for dependency checks across the DW landscape (i.e., source to reports) in IMR.
- Configuring data loaders for the database, Informatica, and Cognos repositories in IMR.
- Developing data load designs for the source-to-stage and stage-to-build layers.
- Used the Debugger to identify and fix bugs in existing mappings by analyzing data flow and evaluating transformations; created mapplets, generic transformations, and worklets that provide reusability in mappings and workflows.
- Used deployment groups for promotion/migration of code to the QA and production environments.
- Developed triggers, stored procedures, views, and SQL transactions.
- Database indexing, data partitioning, and archiving strategies that maximize query performance based on how users access the data.
- Interpreted logical and physical data models for Business users to determine common data definitions and establish referential integrity of the system.
- As part of the data migration strategy, developed ETL procedures to ensure conformity, compliance with standards, and lack of redundancy, translating business rules and functionality requirements into ETL procedures.
- Created and monitored Autosys jobs and set up batches to schedule the loads at the required frequency.
- Supported Microsoft SSIS packages for the source-to-stage data load.
- Identified data quality issues and their root causes; implemented fixes and designed data audits to capture such issues in the future.
- Resolving production issues, implementing fixes.
- Quick learner with a zeal for new technologies; goal oriented; enforce SDLC and ITIL standards.
- Seeking knowledge of Salesforce.com integration, IDQ implementation, and PowerExchange implementation for the AS400 system.
- Team leader for 3 offshore resources: distributing work, technical discussions, clarification sessions, peer reviews, unit test case reviews, participating in test cycles, establishing coding and testing standards, UAT support, and post-go-live support.
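A rough SQL equivalent of the SCD Type 2 Lookup/Router/Update Strategy logic mentioned above, wrapped in the kind of ksh/sqlplus harness used on these projects. The table, column, and sequence names (policy_stg, policy_dim, policy_dim_seq) and the DB_USER/DB_PASS/DB_SID environment variables are invented for illustration; the production logic lived in Informatica mappings:

```ksh
#!/bin/ksh
# Hypothetical SCD Type 2 load for a policy dimension, approximating the
# mapping logic in two SQL steps: expire changed current rows, then
# insert fresh current versions for new and changed policies.

sqlplus -s "$DB_USER/$DB_PASS@$DB_SID" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE ROLLBACK

-- Step 1: expire the current row wherever a tracked attribute changed
UPDATE policy_dim d
   SET d.eff_end_dt  = TRUNC(SYSDATE),
       d.current_flg = 'N'
 WHERE d.current_flg = 'Y'
   AND EXISTS (SELECT 1
                 FROM policy_stg s
                WHERE s.policy_no = d.policy_no
                  AND (s.status <> d.status OR s.premium <> d.premium));

-- Step 2: insert a new current version for new and changed policies
-- (changed policies no longer have a current row after step 1)
INSERT INTO policy_dim
       (policy_key, policy_no, status, premium,
        eff_start_dt, eff_end_dt, current_flg)
SELECT policy_dim_seq.NEXTVAL, s.policy_no, s.status, s.premium,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM policy_stg s
 WHERE NOT EXISTS (SELECT 1
                     FROM policy_dim d
                    WHERE d.policy_no = s.policy_no
                      AND d.current_flg = 'Y');

COMMIT;
EOF
```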
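Similarly, a minimal sketch of the audit key generation stored procedure work noted above; the sequence and function names are placeholders rather than actual production objects:

```ksh
#!/bin/ksh
# Hypothetical one-time DDL: a sequence plus a small PL/SQL function
# that load jobs call to stamp each run with a unique audit key.

sqlplus -s "$DB_USER/$DB_PASS@$DB_SID" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE

CREATE SEQUENCE audit_key_seq START WITH 1 INCREMENT BY 1 NOCACHE;

CREATE OR REPLACE FUNCTION next_audit_key RETURN NUMBER IS
    v_key NUMBER;
BEGIN
    SELECT audit_key_seq.NEXTVAL INTO v_key FROM dual;
    RETURN v_key;
END next_audit_key;
/

EXIT
EOF
```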
Confidential
- The objective of the Repair Data Warehouse (RDW) is to facilitate continuous quality improvement of Motorola products. RDW supports this objective by:
- Collecting/collating information from Authorized Service Centers (ASCs).
- Categorizing collected data by specific dimensions such as products and customers.
- Enabling users to generate reports to facilitate analysis and internal quality and process improvements.
- The scope of the project is to improve the Repair and Return data, reporting, and transmission processes for all outsourced and in-house repair centers utilized by CHS.
- Repair Protocol Revision.
- Data transmission from ASC.
- Data Warehouse implementation.
- Managed the Offshore model and was responsible for interacting with the offshore team
- Responsible for initiating a process for batch support and event management using Workload Automation tools.
- Played a Key role in the release management and post-mortem release documentation preparation and analysis
- Involved in requirements gathering and business analysis.
- Involved in design and development of complex ETL mappings in an optimized manner and created low level mapping documents from high level design.
- Created mappings, sessions, and workflows, and deployed workflows to higher environments using the Informatica import/export utility.
- Developed Pre and Post SQL scripts, PL/SQL stored procedures and functions.
- Used the Debugger to check for errors in mappings.
- Generated UNIX shell scripts to automate daily load processes (see the sketch after this list).
- Created test cases and executed test scripts as part of Unit Testing.
- Used the Normalizer transformation to generate reports with historical data.
- Created reusable transformations/mapplets to import into the common mappings and avoid complexity in the mappings.
- Created sessions, batches, and session partitions to reduce the time taken to load data from the existing sources into the database.
- Fine-tuned Informatica mappings for performance optimization
- Extracted data from flat files using FTP Scripts
- Generated reports comparing the previously unclean data with the current integrated data for clients' reference.
- Involved in quality assurance of data, automation of processes.
- Documented the entire process; the documents included the mapping document, unit testing document, and system testing document, among others.
- Effectively handled change requests by developing, implementing and testing of solutions.
- Reviewed mappings and workflows developed by peers and reported defects.
- Managed change control implementation and coordinated daily and monthly releases.
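A minimal sketch of the kind of daily-load automation script referenced above, using Informatica's pmcmd command-line utility; the service, domain, folder, workflow, and credential names are placeholders:

```ksh
#!/bin/ksh
# Hypothetical daily load driver: start the Informatica workflow via
# pmcmd, wait for completion, and log the outcome for the scheduler.

INFA_SERVICE=IS_REPAIR_DW        # placeholder integration service
INFA_DOMAIN=DOM_REPAIR           # placeholder domain
INFA_FOLDER=RDW_DAILY            # placeholder folder
WORKFLOW=wf_rdw_daily_load       # placeholder workflow
LOGFILE=/var/log/etl/rdw_daily_$(date +%Y%m%d).log

# INFA_USER / INFA_PASS are assumed to come from the job environment
pmcmd startworkflow \
    -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
    -u "$INFA_USER" -p "$INFA_PASS" \
    -f "$INFA_FOLDER" -wait "$WORKFLOW" >> "$LOGFILE" 2>&1

if [ $? -ne 0 ]; then
    echo "$(date): $WORKFLOW FAILED" >> "$LOGFILE"
    exit 1
fi
echo "$(date): $WORKFLOW completed successfully" >> "$LOGFILE"
```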
Environment: IBM AIX, Windows NT/XP, XML file sources, ERWIN 7, MS Visio, COGNOS, REMEDY.
Confidential, India
Informatica & Concorde Developer
Confidential is a repair entry system developed using the Motorola proprietary tool Concorde and its specific language, XL. It contains forms and reports used by end users at repair centres and by managers for reporting. The application is hosted on a Unix OS; shell scripts are used to load/send data to the dependent systems, and data manipulation is done using the XL scripting language. Data entered through Concorde is not directly accessible to business analysts, so Informatica is used to extract the data, apply business rules to it, and load it into relational tables; data from Concorde is thereby converted to a relational model. Data collected globally is consolidated at the end of each business day.
Informatica Developer:
Responsibilities
- Involved in Requirement discussion, preparing Technical Requirement Document.
- Involved in the complete life cycle of the ETL process for data movement from the Concorde system to staging and finally to the database.
- Extensively used Power Center/Mart to design multiple mappings with embedded business logic.
- Closely coordinated with the lead on the architecture of the data model.
- Analyzed critical/complex processes of application packages to design ETL processes and adopted strategy to prevent failures and acted swiftly to recover from application failures.
- Implemented shell scripts to use the FTP and SQL*Loader utilities (see the sketch after this list).
- Used transformations such as Sequence Generator, Lookup, Joiner, and Source Qualifier in Informatica Designer to effectively utilize Informatica services.
- Supported the Informatica developers on database server performance: tracking user activity, troubleshooting errors, tracking server resources and activities, and tracing server events.
- Implemented industry best practices, e.g., mapplets.
- Involved in performance tuning of the Informatica mapping using various components like Parameter files, Variables and Dynamic Cache.
- Documented the ETL design specifications and mappings, maintained version control, and was involved in the migration of Informatica mappings from the development to the production environment.
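A minimal sketch of the FTP/SQL*Loader scripting mentioned above, under assumed file, control-file, host, and connection names:

```ksh
#!/bin/ksh
# Hypothetical pull-and-load script: fetch a repair extract over FTP,
# then load it into Oracle with SQL*Loader.

WORK_DIR=/data/etl/inbound          # placeholder paths and names
DATA_FILE=repairs.dat
CTL_FILE=/data/etl/ctl/repairs.ctl

cd "$WORK_DIR" || exit 1

# Fetch the daily extract from the source host (placeholder credentials)
ftp -n source.example.com <<EOF
user ftpuser ftppass
binary
cd /outbound
get $DATA_FILE
bye
EOF

[ -s "$DATA_FILE" ] || { echo "No data file received"; exit 1; }

# Load into the staging table defined in the control file
sqlldr userid="$DB_USER/$DB_PASS@$DB_SID" \
       control="$CTL_FILE" data="$DATA_FILE" \
       log="$WORK_DIR/repairs.log" bad="$WORK_DIR/repairs.bad"
RC=$?

# SQL*Loader exit codes: 0 = success, 2 = warnings, 1/3 = errors
if [ $RC -ne 0 ] && [ $RC -ne 2 ]; then
    echo "SQL*Loader failed with exit code $RC"
    exit 1
fi
```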
Environment: IBM AIX, Windows NT/XP, XML file sources, ERWIN 7, MS Visio, COGNOS, REMEDY.