
Sr. Lead Developer Resume


Portland, Oregon

SUMMARY

  • 9 years of experience in the IT industry with skills in the analysis, design, data modeling, development, implementation and testing of data warehousing applications for the financial, insurance, staffing, pharmaceutical and telecommunication industries. Expertise in ETL using Informatica PowerCenter, PowerMart & PowerConnect.
  • Over 6 years of data warehousing experience using Informatica PowerCenter 9.1/8.6/8.5.1/8.1.1/7.x/6.x, Data Masking and Test Data Management, dealing with various sources such as Oracle, SQL Server, Teradata and flat files.
  • Extensive knowledge of the functional areas within Informatica.
  • Extensively worked on ETL Informatica transformations including Source Qualifier, Connected and Unconnected Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter and Sequence Generator, and created complex mappings.
  • Gathered requirements and designed data warehouse and data mart entities.
  • Strong knowledge of Teradata RDBMS architecture, applications, tools and utilities. Experience with Teradata 12 utilities such as BTEQ, FastExport, FastLoad, MultiLoad and TPump, as well as data masking and SQL.
  • Specialized in Business Intelligence areas such as Data Integration and Data quality.
  • Good experience in data masking techniques. Used PowerCenter Data Masking options (Random Masking, Blurring, SSN and Credit Card) in mappings.
  • Good knowledge of applying rules and policies using the ILM (Information Lifecycle Management) workbench for the Data Masking transformation and loading into targets.
  • Skilled in implementing data warehousing and business intelligence solutions. Expertise in the Ralph Kimball and Bill Inmon data warehouse methodologies.
  • Good experience with relational databases such as Oracle, MS SQL Server and Teradata 12, including SQL, PL/SQL, SQL*Plus, stored procedures, functions, packages, Oracle performance tuning, indexes and ETL data validation on Linux/UNIX.
  • Experience interacting with business users to gather Business Requirement Specifications (BRS) and prepare technical specifications.
  • Extensive knowledge of Informatica PowerCenter admin tasks such as setting workflow schedules, monitoring and troubleshooting workflows, code deployments and configuration changes.
  • Analyzed existing systems and business user requirements for tactical and strategic needs.
  • Clear knowledge of OLTP/OLAP system study, analysis, E-R modeling, and developing database schemas (star and snowflake), dimensions and fact tables.
  • Knowledge of logical and physical data modeling using the EXCELERATOR CASE tool. Designed and developed online and batch processes.
  • Worked with the testing team to prepare test plans, scenarios and procedures.
  • Experienced in integrating various sources such as Oracle, SQL Server, Teradata and flat files.
  • Used Informatica debugging techniques to debug mappings and used session log files to trace errors while loading into targets, data marts and the data warehouse.
  • Implemented various Performance Tuning techniques on Mappings, Targets, and Sessions.
  • Participated in full life cycle development of data warehouse systems: analysis, design, development, testing and technical documentation. Experience writing test plans and test cases and performing unit, system, integration and functional testing, as well as estimation and planning.
  • Developed Mappings using Type 1, Type 2, Type 3 SCD logic to load data into Dimension tables.
  • Experience in shell scripting on UNIX, Linux and Windows environments.
  • Created database objects like Tables, Views, Synonyms, and Indexes from Logical database design document.
  • Experience in writing, testing and implementation of the Stored Procedures, Functions and Triggers using Oracle PL/SQL.
  • Excellent analytical and logical programming skills, good interpersonal skills and a strong desire to achieve specified goals.
  • Interacted with business users on a regular basis to consolidate and analyze requirements and present designs to them.
  • Highly adaptive to team environments with a proven ability to work in fast-paced settings.
  • Excellent communication skills and a quick learner.

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 9.5/9.1/8.6.1/8.1/7.1.4/6.x (Source Analyzer, Data Warehouse Designer, Mapping Designer, Mapplets, Transformations, Workflow Manager, Workflow Monitor, PowerCenter 8.6 Data Masking Option, Worklets, Repository Manager), Informatica PowerMart.

Data Modeling: ERwin 7.0/4.0/3.0, Dimensional Modeling, Snowflake Schema, Star Schema.

DBMS: Oracle 8i/9i/10g/11g Exadata, SQL Server 2000/2005/2008, Teradata 8.1 and 12

SQL Utility: SQL*Plus, SQL Navigator, Query Analyzer.

Case Tools: ERwin, PowerDesigner, Oracle Designer

Methodologies: Bill Inmon, Ralph Kimball and JAD

Functional / Business / Data Analysis: Requirements Gathering, User Interviews, Process Flow Diagrams, Data Flow Diagrams, MS Project, MS Access, MS Office, Business Requirements Analysis and Process Mapping

Performance & Monitoring Mechanisms: SQL IO Simulator and SQLIO, CA Spectrum OneClick and Attention

Operating Systems: Windows NT/98/2000/2003/2008, MS-DOS, UNIX, Red Hat Enterprise Linux

Languages: C, C++, SQL, PL/SQL, Unix Shell Scripting.

EPIC: Proficient in two (2) RWB Epic modules and application sub-components.

PROFESSIONAL EXPERIENCE

Confidential, Portland, Oregon

Sr. Lead Developer

Responsibilities:

  • Provide operational & information systems expertise in the analysis, development, training, testing, optimization & application of KP HealthConnect business & clinical systems.
  • As the subject matter expert in functionality & Epic modules, collaborate in the provision of information to regional leadership, clinicians, staff, brokers, employers, professional organizations & vendors including Epic Systems Corporation.
  • Collaborate w/ work teams, departments, regional leadership, clinicians, staff, & information technology to define needs, facilitate agreements & decisions.
  • Plan execution of small projects including user communication, support & post implementation review.
  • Assess customer needs, research options, support analysis, design, & test & implement application solutions.
  • Configure & build complex features & new modules for integration w/ installed Epic applications.
  • Plan & lead relevant workflow sessions & manage resulting workflow changes.
  • Direct configuration activities in Epic system to resolve problems escalated to the KP HealthConnect Operations department.
  • Set up & test configurations to resolve complex Confidential safety problems, modifications & enhancements.
  • Build, maintain & update application master files & category lists to ensure data integrity & maintain synchronization w/ Enterprise CB (Collaborative Build) updates
  • On-call rotation for 24x7 support.
  • Managing Oracle databases in an Exadata environment.
  • As the subject matter expert, test complex system changes, develop training materials, assess workflow impact & solutions, & manage deployment.
  • QA test script development, test data validation, test results analysis, defect & problem resolution & identification of problem fix.
  • Providing defect status, test case execution status and overall work stream status to upper management.
  • Extensively worked on Informatica IDE/IDQ.
  • Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
  • Supporting Informatica admin tasks like creating folders and providing access.
  • Review, analyze & recommend implementation of release notes as received from Epic Systems. Analyze change requests for impact of change & costs, and prepare standard process documents like the PSF (Project Setup Form) and PSDD.
  • Communicate changes to business partners & assist w/ implementation of change.
  • Used Workflow Manager to load data from different sources (Oracle, SQL Server, Text files) to the target database, which was Oracle.
  • Populated and maintained Fact and Dimension tables in Daily Data Star Schema.
  • Imported various Sources and Targets, created Transformations using Informatica Power Center Designer.
  • Designed and developed Informatica mappings and sessions based on requirements and business rules to load data from source flat files into relational target tables. Used transformations like Connected and Unconnected Lookups, Aggregator, Expression, Update Strategy, Router and Sequence Generator.
  • Working closely with the DBA team on configuration and administration of full-rack Exadata databases.
  • Used joiner transformation to extract data from different tables.
  • Implemented error handling and e-mail notifications using pre-session and post-session Stored Procedure transformations (connected and unconnected).
  • Used Informatica Workflow Manager to create, schedule, monitor and send the error messages in case of process failure.
  • Created global and local folders and granted permissions using Informatica Repository Manager.
  • Did performance tuning at source, transformation, target, and workflow levels.
  • Created staging and dimension tables in SQL Server database using SQL Query Analyzer and SQL Server Enterprise Manager.
  • Supporting application assessments with the Data Masking team on the proxy server and providing support for the databases and applications.
  • Constantly interacted with the Data Warehouse Architecture team to understand the requirements and implement them seamlessly.
  • Wrote user-defined stored procedures using SQL Query Analyzer.
  • Provided daily production support by monitoring the running processes.
  • Performed unit testing for data errors based on timestamps, checked counts against daily and weekly summarized data, and documented technical metadata.

Environment: Informatica PowerCenter 9.5, Oracle 10g, SQL Server Enterprise Manager, SQL Query Analyzer, SQL Server 2000, flat files, Data Masking, Windows 2000 and Windows NT, Epic 2014/2015

Confidential

Sr. Developer analyst

Responsibilities:

  • Gathered user requirements and reporting requirements for a better understanding of the business process and the current reporting system.
  • Extensively worked on ETL Informatica transformations including Source Qualifier, Connected and Unconnected Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter and Sequence Generator, and created complex mappings.
  • Involved in various phases of SDLC from requirement gathering, analysis, design, development and testing to production.
  • Extracted data from various source systems like Oracle, SQL Server and flat files as per the requirements.
  • Extensively worked on ETL performance tuning to tune data loads, as well as SQL query tuning.
  • Developed various sessions and batches containing parameter files, indicator files and multiple source files.
  • Coordinating and tracking all projects for seamless releases using the project management system JIRA, the source code management system SVN and the document management system SharePoint.
  • Responded to all incoming questions and inquiries related to JIRA applications.
  • Prepared projects, dashboards, reports and questions for all JIRA related services.
  • Involved in ETL process from development to testing and production environments.
  • Worked with mappings to dynamically generate parameter files used by other mappings.
  • Involved in creation of Schema objects like Indexes, Views, and Sequences.
  • Documented the mappings used in ETL processes, including the unit testing and technical documentation of the mappings for future reference.
  • Developed post-session and pre-session batch scripts for tasks like merging flat files, creating and deleting temporary files, and renaming files to reflect the generation date.
  • Strong experience in dimensional modeling using star and snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin.
  • Designed and developed scripts to automate the entire process.
  • Responsible for monitoring all sessions that are running, scheduled, completed or failed, and debugged the mappings of failed sessions.
  • Extensive work with Autosys job creation and scheduling, adding calendars according to client requirements.
  • Involved in performance tuning of the ETL process by addressing various performance bottlenecks at the extraction and transformation stages in sources, mappings, and sessions and successfully tuned them for maximum performance using best practices.
  • Interacted with various business users and documented business requirements, discussed issues to be resolved and translated user input into ETL design.
  • Worked with data analysts to implement Informatica mappings and workflows, shell scripts and stored procedures to meet business requirements.
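The pre/post-session batch scripts described above can be sketched in shell. This is a minimal illustration only; the directory layout, file names and delimiter are assumptions, not the original project's scripts.

```shell
#!/bin/sh
# Minimal sketch of a post-session "merge and date-stamp" script; directory
# names and file patterns are illustrative, not from the original project.
set -e

SRC_DIR=$(mktemp -d)   # where the session drops partial extract files
OUT_DIR=$(mktemp -d)   # where the merged, date-stamped load file goes
STAMP=$(date +%Y%m%d)

# Simulate two partial extract files produced by an ETL session.
printf '1|alpha\n' > "$SRC_DIR/extract_part_1.dat"
printf '2|beta\n'  > "$SRC_DIR/extract_part_2.dat"

# Merge the partials into a single load file named for the generation date.
cat "$SRC_DIR"/extract_part_*.dat > "$OUT_DIR/load_file_$STAMP.dat"

# Delete the temporary partials only after a non-empty merge.
[ -s "$OUT_DIR/load_file_$STAMP.dat" ] && rm -f "$SRC_DIR"/extract_part_*.dat

echo "created $OUT_DIR/load_file_$STAMP.dat"
```

In practice such a script would be attached as a post-session command in Workflow Manager, with the directories supplied by the session's parameter file rather than `mktemp`.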

Environment: Informatica PowerCenter 9.0.1, Data Masking Tool DMSuite, Autosys, JIRA (v5+), Data Masking Option, Oracle 10g, Informix, SQL Developer, PL/SQL, UNIX, batch scripting, SQL Query Analyzer, SQL Server, flat files, Windows 2008 and Windows NT

Confidential, Thousand Oaks, CA

Sr. Developer Analyst

Responsibilities:

  • Gathered user requirements and reporting requirements for a better understanding of the business process and the current reporting system.
  • Involved in various phases of SDLC from requirement gathering, analysis, design, development and testing to production.
  • Used Informatica Power Exchange to import source definitions from Mainframe system to Informatica Power Center.
  • Worked on Informatica Power Center tools - Designer, Repository Manager, Workflow Manager and Workflow Monitor.
  • Constantly interacted with the Business Intelligence team and the Data warehouse Architecture team to understand the requirements and implement them seamlessly.
  • Meeting with business users and defining Functional and Technical business requirements.
  • Extensively used ERwin to design Logical/Physical Data Models, forward/reverse engineering.
  • Created a family of stars using the dimensional modeling concepts of degenerate dimensions, factless fact tables, aggregate fact tables and rapidly changing dimensions in a multidimensional model.
  • Identified all the conformed dimensions to be included in the target warehouse design and confirmed the granularity of the facts in the fact tables.
  • Working closely with architects, the lead and the project manager on application assessments for the Data Masking team on the proxy server, and providing support for the databases and applications.
  • Gathering requirements from business analysts to understand the databases and structures required for masking in the star schema data model.
  • Modeled the data warehousing data marts using ERwin.
  • Extensively worked on the Repository Manager to create/modify/delete users/group/roles.
  • Extensively created Reusable Transformations and Mapplets to implement the business logic for loading to the target database.
  • Extracted data from various source systems like Oracle, DB2, SQL Server and flat files as per the requirements.
  • Designed ETL process flows to load data into the DB2 data mart from heterogeneous sources.
  • Worked on the Informatica tools Source Analyzer, Data Warehouse Designer, Mapping Designer, Mapplets and Transformations.
  • Implemented Slowly Changing Dimension (SCD) Type 1 and Type 2 for inserting and updating target tables to maintain history.
  • Used most of the transformations, such as Source Qualifier, Aggregator, Filter, Expression, Unconnected and Connected Lookups, and Update Strategy.
  • Improved the mapping performance using SQL overrides.
  • Used PL/SQL to write stored procedures to increase the performance.
  • Developed post-session and pre-session shell scripts for tasks like merging flat files, creating and deleting temporary files, and renaming files to reflect the generation date.
  • Developed various sessions and batches containing parameter files, indicator files and multiple source files.
  • Optimized query performance, session performance.
  • Designed and developed UNIX shell scripts as part of the ETL process to compare control totals, automate the process of loading, pulling and pushing data from and to different servers.
  • Developed the UNIX shell scripts to send out an E-mail on success of the process indicating the destination folder where the files are available.
  • Involved in ETL testing; created unit and integration test plans to test the mappings and created test data.
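The control-total comparison and success-notification scripts mentioned above can be sketched as follows. The file names, control-file format and mail command are illustrative assumptions; the real scripts were site-specific.

```shell
#!/bin/sh
# Sketch of a control-total check: compare the row count recorded in a
# control file against the rows actually present in the load file, then
# signal success or failure. File names and the one-line control-file
# format are illustrative assumptions.
set -e

WORK=$(mktemp -d)

# Simulate a load file and its control file holding the expected row count.
printf 'r1\nr2\nr3\n' > "$WORK/daily_load.dat"
echo 3 > "$WORK/daily_load.ctl"

EXPECTED=$(cat "$WORK/daily_load.ctl")
ACTUAL=$(grep -c '' "$WORK/daily_load.dat")

if [ "$ACTUAL" -eq "$EXPECTED" ]; then
    STATUS="SUCCESS: $ACTUAL rows match the control total"
    # On success, a site-specific notification would go out here, e.g.:
    # echo "Files ready in $WORK" | mailx -s "Load complete" etl-team@example.com
else
    STATUS="FAILURE: expected $EXPECTED rows, loaded $ACTUAL"
fi
echo "$STATUS"
```

A wrapper like this typically returns a nonzero exit code on mismatch so the scheduler (Autosys in this document) can mark the job failed and hold downstream dependencies.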

Environment: Informatica PowerCenter 9.0.1, Data Masking Tool DMSuite, Informatica PowerCenter 8.6, Data Masking Option, Oracle 10g, DB2, SQL Developer, PL/SQL, UNIX, SQL Query Analyzer, SQL Server, flat files, Windows 2008 and Windows NT

Confidential

Programmer Analyst /Data Modeler

Responsibilities:

  • Involved in various phases of SDLC from requirement gathering, analysis, design, development and testing to production.
  • Worked on Informatica Power Center tools - Designer, Repository Manager, Workflow Manager and WorkflowMonitor.
  • Meeting with business users and defining Functional and Technical business requirements.
  • Identified all the conformed dimensions to be included in the target warehouse design and confirmed the granularity of the facts in the fact tables.
  • Extensively created Reusable Transformations and Mapplets to implement the business logic for loading to the target database.
  • Involved in the DMSuite Data Masking tool server installation.
  • Involved in Knowledge Transition at the Client site- including the Database concepts, Regular Expressions, Masking Algorithms, Creating profiler sets, etc.
  • Extensively used all the Transformations like source qualifier, aggregator, filter, joiner, Update Strategy,Unconnected and connected Lookups, Router, Sequence Generator etc. and used transformation language like transformation expression, constants, system variables, data format strings etc.
  • Extensively worked on Workflow Manager and Workflow Monitor to create, schedule, monitor Workflows,Worklets and various tasks like command, assignment, control and session tasks
  • Extracted data from various source systems like Oracle, SQL Server and flat files as per the requirements.
  • Developed various sessions, batches containing parameters files, indicator files and multiple sources file.
  • Designed and developed UNIX shell scripts as part of the ETL process to compare control totals, automate the process of loading, pulling and pushing data from and to different servers.
  • Worked on the Informatica tools Source Analyzer, Data Warehouse Designer, Mapping Designer, Mapplets and Transformations, and XML files.
  • Implemented Slowly Changing Dimension (SCD) Type 1 and Type 2 for inserting and updating target tables to maintain history.
  • Provided production support by monitoring the processes running daily.
  • Extensively worked on ETL performance tuning to tune data loads, as well as SQL query tuning.

Environment: Informatica PowerCenter 8.6.1, CDC, Data Masking Tool DMSuite, Informatica PowerCenter Data Masking Option, Business Objects 6.5, Cognos, SAP, UNIX, PL/SQL, Oracle, MS SQL Server 2005/2000, ERwin 7.1

Confidential

Data Analyst/ ETL developer

Responsibilities:

  • Responsible for performance tuning ETL process to optimize load and query Performance.
  • Extensively used ERwin to design logical/physical data models, for forward/reverse engineering, publishing data models, applying DDL to the database, and data modeling and restructuring of the existing data model.
  • Added and modified Informatica mappings and sessions to improve performance, accuracy and maintainability of existing ETL functionality. Recommended additions and modifications where appropriate.
  • Responsible for administration of the Informatica environment. Created users and groups, and configured profiles and privileges.
  • Created folders and configured folder security. Configured shared folders for reusable objects.
  • Wrote SQL overrides in Source Qualifiers to extract only the required rows for optimal performance.
  • Created complex transformations using connected/unconnected Lookup and Stored Procedure transformations.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Scheduling and loading data processes and monitoring the ETL process.
  • Used various data sources like flat files and relational tables.
  • Used FastLoad, MultiLoad and TPump to load data into the Market Insite (MKIS) Teradata data warehouse.
  • Involved in writing, testing and implementing triggers, stored procedures and functions at the database level and form level using PL/SQL.
  • Extensively used stored procedures for pre-session and post-session data loading.
  • Debugged mappings using the Debugger facility in Informatica. Used the target load order feature for loading tables with constraints.
  • Provided production support by monitoring the processes.
  • Identified the bottlenecks and improved overall performance of the sessions
  • Preparation of unit test plans, verification of functional specifications and review of deliverables.
  • Translated general reporting requirements into drill-down/drill-across support and reviewed designs with business users.
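A FastLoad job of the kind used for the MKIS loads is driven by a control script, often generated and submitted from a shell wrapper. The sketch below writes such a control file; the table, column, file and logon names are illustrative assumptions, not the original project's scripts.

```shell
#!/bin/sh
# Sketch: generate a Teradata FastLoad control script from a shell wrapper.
# Table, column, file and logon names are illustrative assumptions.
set -e

CTL=$(mktemp)

cat > "$CTL" <<'EOF'
SESSIONS 4;
LOGON tdpid/etl_user,password;
SET RECORD VARTEXT "|";
BEGIN LOADING mkis.sales_stg ERRORFILES mkis.sales_err1, mkis.sales_err2;
DEFINE sale_id (VARCHAR(10)), amount (VARCHAR(12))
FILE = /data/mkis/sales.dat;
INSERT INTO mkis.sales_stg VALUES (:sale_id, :amount);
END LOADING;
LOGOFF;
EOF

# On a host with the Teradata utilities installed, this would be submitted as:
# fastload < "$CTL"
echo "wrote FastLoad control script to $CTL"
```

FastLoad requires an empty target table and writes rejected rows to the two error tables, which is why staging tables like the hypothetical `mkis.sales_stg` are the usual targets; TPump would be used instead for trickle loads into populated tables.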

Environment: Informatica PowerCenter 7.1 (Source Analyzer, Data Warehouse Designer, Mapping Designer, Mapplet, Transformations), Business Objects 5.0, ERwin, Oracle 8i, Teradata 8.1, Informatica PowerConnect, Autosys, PL/SQL, SQL*Loader, TOAD, COBOL files, MS SQL Server 2000, UNIX Sun Solaris and shell scripting
