
Tech Lead Resume


Virginia

SUMMARY

  • 9+ years of IT experience in the software industry in analysis, development and implementation of business applications for the Banking, Finance, Healthcare and Reinsurance domains
  • 9+ years of experience in ETL (Extract, Transform and Load) processes utilizing Informatica PowerCenter 9.5/9.1/8.6 (Repository Manager, Designer, Workflow Manager and Workflow Monitor) and worked on different types of Transformations such as Source Qualifier, Filter, Aggregator, Router, Joiner, Stored Procedure, Normalizer, Update Strategy, Lookup, Union, etc.
  • Involved in all stages of the Software Development Life Cycle (SDLC): requirements analysis, design, development, testing, production implementation, post-production support and transition
  • Strong understanding of Data Warehouse principles, including Dimension tables, Fact tables, Star Schema modeling and Snowflake Schema modeling
  • Proficient in developing Informatica mappings, Sessions, Workflows, Mapplets, Worklets, Reusable sessions and Reusable transformations. Proficient in debugging, troubleshooting and performance tuning
  • Expert in data warehousing techniques for data cleansing, Slowly Changing Dimensions (SCD Type 1, Type 2 and Type 6), the surrogate key generation process and Change Data Capture (CDC); a Type 2 sketch follows this list
  • Proficient in writing SQL queries in Oracle, Teradata, DB2 and SQL Server
  • Used scheduling tools like Control M, WLM, Autosys and TWS to schedule and monitor production jobs
  • Expert in Unit testing, System Integration Testing, implementation and maintenance of mappings
  • Implemented performance tuning at various levels such as Source, Target, Mapping and Session, and partitioned data in Informatica
  • Experience in source system analysis and integration, and in data extraction from various sources such as flat files, Oracle, DB2, Teradata and XML
  • Experience in PL/SQL programming and using T-SQL
  • Experience in creating UNIX Shell Scripts for data cleansing in the source files and validating pre and post loading of data
  • Used ERwin to model the database changes
  • Worked on change management process using tools like HPSD, TSRM, Peregrine Service Center to migrate components to production
  • Excellent interpersonal and communication skills to work with user communities
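
A minimal SQL sketch of the SCD Type 2 load pattern referenced above (Oracle-style syntax; the table, column and sequence names such as DIM_CUSTOMER, STG_CUSTOMER and CUST_DIM_SEQ are hypothetical, not actual project objects):

    -- Expire the current version of any dimension row whose tracked attribute changed
    UPDATE DIM_CUSTOMER d
       SET CURR_FLAG = 'N',
           EFF_END_DT = CURRENT_DATE - 1
     WHERE d.CURR_FLAG = 'Y'
       AND EXISTS (SELECT 1
                     FROM STG_CUSTOMER s
                    WHERE s.CUST_NBR = d.CUST_NBR
                      AND s.CUST_NAME <> d.CUST_NAME);

    -- Insert the new current version with a fresh surrogate key
    INSERT INTO DIM_CUSTOMER
          (CUST_KEY, CUST_NBR, CUST_NAME, EFF_START_DT, EFF_END_DT, CURR_FLAG)
    SELECT CUST_DIM_SEQ.NEXTVAL, s.CUST_NBR, s.CUST_NAME,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
      FROM STG_CUSTOMER s
     WHERE NOT EXISTS (SELECT 1
                         FROM DIM_CUSTOMER d
                        WHERE d.CUST_NBR = s.CUST_NBR
                          AND d.CURR_FLAG = 'Y'
                          AND d.CUST_NAME = s.CUST_NAME);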

TECHNICAL SKILLS

ETL Tools: Informatica 9.5/9.1/8.6/7.x

Database: Oracle 11g/9i/8.x, Teradata 13.0/14.0, DB2 8.x, SQL Server 2008

Operating System: Windows NT/2000/XP, UNIX

Front End Tools: Toad 9.x/8.x, SQL*Plus, SQL Developer, Aqua Data Studio 4.5.2, Teradata SQL Assistant, T-SQL, BTEQ scripts, FastLoad, MultiLoad, FastExport, TPump and Viewpoint

Scheduling Tool: WLM, Control M, TWS (Tivoli Workload Scheduler), Autosys

Metadata Tool: Data Station, Metadata Repository

Technical Drawing Tool: Microsoft Visio

Versioning Tools: Clear Case, PVCS, MKS and MSTFS

Software Deployment Tools: WBSD and TAD

Document storage: Microsoft SharePoint

Data Modeling Tool: ERwin

Request Tracking Tools: TSRM, MKS Integrity Client, HPSD and Peregrine Service Center

PROFESSIONAL EXPERIENCE

Confidential, Virginia

Tech Lead

Environment: Informatica 9.5, Teradata 13.0 and Teradata 14.0, BTEQ scripts, UNIX shell scripting, Rational Clear Case, Clear Quest, WLM, TSRM

Responsibilities:

  • Performed analysis on the business requirement and created technical design document with the Informatica mapping and BTEQ script details
  • Conducted review sessions with architect to get the design document reviewed and approved
  • Created Slowly Changing Dimension (SCD) Type 2 mappings for loading dimension tables
  • Responsible for developing Teradata BTEQ scripts and Informatica mappings to load EDWard for Membership, COA and Product Subject areas
  • Used components such as Mapping Designer, Workflow Manager and Workflow Monitor, and worked on different types of Transformations such as Source Qualifier, Filter, Aggregator, Router, Update Strategy, Lookup, Union, Expression, Joiner, etc.
  • Wrote and executed test cases and captured the test results
  • Checked code into the ClearCase tool and baselined the code version
  • Worked with the testing team to perform SIT and with users on UAT; clarified questions from the testing team during testing
  • Performed code reviews
  • Created work orders/change tickets, obtained approvals for migration and promoted code to SIT, pre-production and production
  • Represented the team in status meetings with stakeholders
  • Analyzed variances between the Restated Membership Month Counts (RMMC) table data and the source data to ensure data quality in EDWard; a reconciliation sketch follows this list
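
An illustrative reconciliation query for the RMMC variance analysis described above (Teradata-style SQL; the table and column names are hypothetical stand-ins for the actual EDWard objects):

    -- Compare membership month counts in the warehouse against the source,
    -- reporting only the months where the counts diverge
    SELECT COALESCE(r.MBR_MONTH, s.MBR_MONTH) AS MBR_MONTH,
           r.MBR_CNT AS RMMC_CNT,
           s.MBR_CNT AS SOURCE_CNT,
           COALESCE(r.MBR_CNT, 0) - COALESCE(s.MBR_CNT, 0) AS VARIANCE
      FROM RMMC_MONTH_COUNTS r
      FULL OUTER JOIN SRC_MONTH_COUNTS s
        ON s.MBR_MONTH = r.MBR_MONTH
     WHERE COALESCE(r.MBR_CNT, 0) <> COALESCE(s.MBR_CNT, 0)
     ORDER BY 1;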

Confidential, California

SQL/ETL Data Analyst

Environment: Informatica 9.1, Oracle 11g, UNIX, Autosys, Putty, PL/SQL and TOAD

Responsibilities:

  • Used SQL queries to join data from SBB&T bank sources and applied ETL transformations to load the Oracle data warehouse
  • Used Informatica Designer to develop mappings and Mapplets to load and generate files from the Oracle data warehouse, Workflow Manager to export/import components, and Workflow Monitor to monitor job runs
  • Designed and developed Informatica mappings in Informatica Designer to load the tables; transformations used include Joiner, Lookup, Sorter, Aggregator, Stored Procedure, Union, Router, Filter, Update Strategy and SQL transformations. Used a reusable data masking transformation to mask sensitive data
  • Responsible for coding and loading data for AFS (Automated Financial System- Commercial Loan systems), CIF (Customer Information), DDA (Demand Deposit Accounting), ILS (Installment Loan System) subject areas
  • Profiled the loaded data to check that key field values are valid and to identify null values in fields
  • Created PL/SQL procedures and functions and built ad-hoc views using SQL queries to cater to the needs of the business group, such as flagging customer records based on customer status, calculating deposit balance amounts and building views to supply data to the Over-Draft model
  • Worked on exception reports that capture records received from the source file but not loaded to the warehouse; these exception records help catch data issues (a sketch follows this list)
  • Reprocessed corrected error records to the data warehouse
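
A minimal sketch of the exception-report pattern described above (hypothetical table and column names such as STG_SOURCE_FILE and DW_ACCOUNT, not the actual warehouse objects):

    -- Exception report: source records that were received but never loaded to the warehouse
    SELECT s.ACCT_NBR,
           s.SRC_SYSTEM,
           s.FILE_DT
      FROM STG_SOURCE_FILE s
     WHERE NOT EXISTS (SELECT 1
                         FROM DW_ACCOUNT a
                        WHERE a.ACCT_NBR = s.ACCT_NBR
                          AND a.SRC_SYSTEM = s.SRC_SYSTEM);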

Confidential, California

Developer/Analyst

Environment: Informatica 9.1, Oracle 11g, SQL Server 2008, UNIX, Putty, TOAD, PVCS

Responsibilities:

  • Used Informatica Designer to develop mappings and Mapplets, Workflow Manager to export/import components and Workflow Monitor to monitor job runs
  • Created T-SQL queries to populate Dimension tables; a sketch follows this list
  • Prepared the low level design document, which has the complete details of all the components to be built for the project
  • Designed the UNIX scripts and Informatica mappings, sessions and workflows based on the requirements
  • Developed and tested the UNIX scripts and Informatica components
  • Used Informatica Designer to develop mappings and Mapplets
  • Used Workflow Manager to export/import components from one environment to the other and Workflow Monitor to monitor job runs
  • Designed the UNIX scripts and Informatica workflows in a reusable manner so they can be used across subsidiaries
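
An illustrative T-SQL sketch of populating a dimension table from staging, as mentioned above (a simple Type 1 upsert via MERGE; the dbo.DimProduct / dbo.StgProduct names and columns are hypothetical):

    -- Type 1 upsert of a dimension from its staging table
    MERGE dbo.DimProduct AS d
    USING dbo.StgProduct AS s
       ON d.ProductCode = s.ProductCode
    WHEN MATCHED AND d.ProductName <> s.ProductName THEN
         UPDATE SET d.ProductName = s.ProductName
    WHEN NOT MATCHED THEN
         INSERT (ProductCode, ProductName)
         VALUES (s.ProductCode, s.ProductName);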

Confidential

Team Leader

Environment: Informatica, DB2, Teradata, UNIX, ERwin, TOAD, Aqua Data Studio, Autosys, MSTFS, Data Station, WBSD, HPSD and TAD

Responsibilities:

  • Involved in gathering requirements by interacting with the business clients
  • Estimated the requirements and prepared impact analysis documents from the requirements
  • Guided the offshore team in development and helped them understand the requirements
  • Created new Informatica mappings and sessions to populate SCD Type 6 tables
  • Developed BTEQ scripts to load to the Staging area
  • Designed file-level and record-level validations to ensure quality data is loaded to the warehouse; a record-level check is sketched below
  • Designed reprocessing logic for data whose primary information is missing but which still has to be loaded
  • Modeled new database objects in ERwin tool
  • Created an Autosys job box to trigger the UNIX scripts, which in turn call the workflows
  • Used MSTFS for code versioning and HPSD to create RFCs to migrate code, Control M and DB changes
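
A minimal sketch of the record-level validations mentioned above (STG_POLICY and POLICY_NBR are hypothetical names, not actual project objects):

    -- Record-level validation: rows arriving with a missing natural key
    SELECT 'MISSING_KEY' AS CHECK_NAME, COUNT(*) AS FAIL_CNT
      FROM STG_POLICY
     WHERE POLICY_NBR IS NULL
    UNION ALL
    -- Record-level validation: duplicate natural keys in the incoming file
    SELECT 'DUPLICATE_KEY', COUNT(*)
      FROM (SELECT POLICY_NBR
              FROM STG_POLICY
             GROUP BY POLICY_NBR
            HAVING COUNT(*) > 1) dup;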

Confidential, Minneapolis

Module Lead/Developer

Environment: Informatica, Oracle, DB2, UNIX, ERwin, TOAD, Aqua Data Studio, Control M, MSTFS, Data Station, WBSD and TAD

Responsibilities:

  • Worked with the business clients to understand business requirement
  • Designed a data quality monitor to validate data post load; a sketch follows this list
  • Designed and developed Informatica jobs
  • Created new Informatica jobs to populate SCD type 2 dimension tables
  • Designed the process to reload the corrected data which fails validation
  • Designed reprocessing logic for data whose primary data is missing but which still has to be loaded
  • Created UNIX shell scripts to trigger the Informatica jobs, passing all the variables the job needs to execute, such as source and target database connection information
  • Used Control M to invoke the UNIX scripts, which invoke the Informatica jobs
  • Used MSTFS version control tool for code versioning.
  • Used HPSD to raise deployment request and to maintain Incident management for Job failure request.
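
An illustrative post-load data quality check of the kind described above (hypothetical table names; :load_dt is an Oracle-style bind variable for the load date):

    -- Reconcile row counts and amount totals between staging and the loaded fact table
    SELECT stg.ROW_CNT AS STG_ROWS,
           fct.ROW_CNT AS FACT_ROWS,
           stg.AMT_SUM AS STG_AMT,
           fct.AMT_SUM AS FACT_AMT
      FROM (SELECT COUNT(*) AS ROW_CNT, SUM(TXN_AMT) AS AMT_SUM
              FROM STG_TRANSACTION
             WHERE LOAD_DT = :load_dt) stg
     CROSS JOIN
           (SELECT COUNT(*) AS ROW_CNT, SUM(TXN_AMT) AS AMT_SUM
              FROM FACT_TRANSACTION
             WHERE LOAD_DT = :load_dt) fct;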

Confidential

Module lead and Developer

Environment: Informatica, DB2, Teradata, UNIX, ERwin, TOAD, Aqua Data Studio, Autosys, MSTFS, Data Station, WBSD, HPSD and TAD

Responsibilities:

  • Created high level and low level design documents based on the requirement
  • Designed and developed Informatica Mappings/workflows/sessions.
  • Developed SCD Type 6 dimensions to capture and store historical information; a sketch of the Type 6 step follows this list
  • Responsible for designing, developing code to load the dimension and Fact tables for Transaction, Authorization, Involved Party subject areas
  • Designed the job flow in Control M to load dimension tables, generate surrogate keys and load Fact tables
  • Created UNIX scripts to invoke the jobs/sequences from Control M by passing the necessary variables
  • As a module lead mentored 2 developers and 2 testers and ensured successful project delivery
  • Involved in deploying the code to production through the release management process which involves moving the code to source control tool, MSTFS and developing a build using TAD
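
SCD Type 6 extends the Type 2 pattern (see the sketch after the Summary) by also carrying a Type 1 "current value" column on every historical row. A minimal sketch of that extra step, using hypothetical DIM_PARTY / STG_PARTY names and ANSI-style correlated-update syntax (Teradata's joined UPDATE syntax differs slightly):

    -- After the usual Type 2 expire/insert steps, refresh the Type 1
    -- "current value" column on every historical version of the party
    UPDATE DIM_PARTY d
       SET CURR_PARTY_SEGMENT = (SELECT s.PARTY_SEGMENT
                                   FROM STG_PARTY s
                                  WHERE s.PARTY_ID = d.PARTY_ID)
     WHERE EXISTS (SELECT 1
                     FROM STG_PARTY s
                    WHERE s.PARTY_ID = d.PARTY_ID);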

Confidential

Developer

Responsibilities:

  • Prepared the low level design document based on the high level design document and the mapping document
  • Created T-SQL scripts to move data from the Landing zone to the Staging layer and then to the Dimension and Fact tables
  • Created new reusable functions with improved reject row handling, string handling and timestamp conversion
  • Created UNIX shell scripts to remove control characters from the files before starting the loading process
  • Developed Mapplets to generate surrogate keys which can be reused for all the dimension tables; a surrogate key sketch follows this list
  • Performed Unit Testing and system testing on the developed components.
  • Created Control M jobs to trigger jobs and support production job run
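
An illustrative sketch of the staging-to-dimension load with surrogate key assignment described above; the names are hypothetical, and the ANSI sequence syntax stands in for the reusable Sequence Generator Mapplet actually used:

    -- Move staged rows into the dimension, assigning surrogate keys from a sequence
    INSERT INTO DimAccount (AccountKey, AccountNbr, AccountType)
    SELECT NEXT VALUE FOR AccountKeySeq,
           s.AccountNbr,
           s.AccountType
      FROM StgAccount s
     WHERE NOT EXISTS (SELECT 1
                         FROM DimAccount d
                        WHERE d.AccountNbr = s.AccountNbr);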

Confidential

Developer

Environment: Informatica 7.1.2, Informatica 6.1, Oracle, UNIX, TWS, PVCS, PL/SQL

Responsibilities:

  • Designed mappings and Mapplets in Informatica Designer, used Workflow Manager to export/import components and Workflow Monitor to monitor job runs
  • Developed Informatica SCD Type 2 mappings to load Dimension tables
  • Created PL/SQL procedures, functions and cursors to load data
  • Worked on exception handling in PL/SQL
  • Developed mappings in Informatica to load the data from the source into the Data Warehouse, using different transformations like Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Filter, Union and Sequence Generator transformations.
  • Used Informatica Designer to create reusable transformations to be used in Informatica mappings and Mapplets
  • Implemented performance-tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.

Confidential

Developer

Environment: Informatica 6.X, Informatica 7.1.2, Oracle, TWS, TOAD and UNIX

Responsibilities:

  • Extensively worked on transformations like Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Filter, Union and Sequence Generator transformations in various mappings.
  • Responsible for creating and testing ETL processes composed of multiple mappings in the Informatica PowerCenter 7.0 environment.
  • Extracted data from source systems SQL Server, Sequential files, Flat files and loaded data into Oracle data warehouse.
  • Used Mapplets for code reuse and for implementing complex business logic, which increased productivity.
  • Developed test data and conducted performance testing on the developed modules
  • Worked on performance tuning and enhancement of mappings transformations
  • Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
  • Created TWS jobs to trigger all the UNIX jobs and designed the load schedule using TWS.
  • Prepared project and internal documents in Microsoft SharePoint.
  • Used MKS version control tool for code versioning
