ETL Developer / SQL BI Developer Resume

Cold Spring, KY

SUMMARY:

  • 9+ years of experience in the IT industry, encompassing various roles and a wide range of skill sets and industry verticals.
  • 8 years of experience with Informatica, including Power Center 9.1/9.5/9.6 and Informatica Developer (IDQ)
  • Expertise in Requirements Analysis, Designing, Coding, Testing & Implementation of ETL Data Warehousing projects using Informatica Power Center and Power Exchange 9.x/8.x/7.x, Oracle, SQL Server, PL/SQL, Mainframes, DB2, Teradata, Flat Files and UNIX Shell Scripts.
  • Extensive experience in designing, developing, and implementing Extraction, Transformation, and Load (ETL) techniques on multiple database platforms and operating system environments.
  • Expert-level skill in designing and developing complex mappings to extract data from diverse sources including flat files, RDBMS tables, mainframe files, XML files, applications, and COBOL sources.
  • Extensive experience with the Informatica ETL tool, designing Workflows, Worklets, and Mappings. Extensively worked on developing and debugging Informatica mappings to ensure data passes through correctly, without errors. Good knowledge of Informatica server-level properties.
  • Expertise in developing Power Exchange solutions to access mainframe sources such as VSAM files and sequential files. Created data maps for complex copybooks.
  • Expertise in writing JCL on mainframes wherever required as part of the ETL process. Very good knowledge of error handling for mainframe jobs and of creating ChangeMan packages to move mainframe jobs from development to production.
  • Extensive knowledge in all areas of Project Life Cycle including requirements analysis, system analysis, design, development, documentation, testing, implementation and maintenance.
  • Sound knowledge of relational and dimensional modeling techniques of data warehouse concepts, the Star and Snowflake schemas, SCD, surrogate keys, and normalization/denormalization.
  • Developed slowly changing dimension mappings of Type 1, Type 2, and Type 3 (version, flag, and timestamp); a minimal SQL sketch of a Type 2 load follows this list.
  • Experience in scheduling workflows using both Informatica and other scheduling tools such as Control-M and cron.
  • Experience in integration of various relational sources like SQL Server, Oracle, Teradata, and DB2 into the staging area.
  • Very good knowledge and experience with performance tuning, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.
  • Experienced in performing unit testing and system testing.
  • Very good knowledge of SQL and PL/SQL. Experience in writing complex Stored Procedures to get the desired output with efficient run time.
  • Strong knowledge of the different client tools of IBM Datastage (Designer client, Director client and Administrator client).
  • Proficient working with Unix/Windows environments and writing UNIX shell scripts.
  • Very good hands-on experience with tools such as PuTTY, WinSCP, TOAD, SQL*Plus, SQL Server Management Studio, and SQL Server Migration Assistant.
  • Strong skills in data analysis, data requirement analysis and data mapping for ETL processes.
  • Very good skills in mentoring and providing knowledge transfer to team members, support teams and customers.
  • Strong verbal and written communication skills and excellent analytical and problem-solving skills. Excellent teamwork spirit and capable of learning new technologies and concepts as needed.
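
As a concrete illustration of the Type 2 approach above, here is a minimal, flag-based SQL sketch. In practice this logic lives in an Informatica mapping, and every table, column, and sequence name below (dim_customer, stg_customer, dim_customer_seq) is hypothetical.

```sql
-- Step 1: expire the current version of any customer whose tracked attribute changed.
UPDATE dim_customer d
SET    d.current_flag = 'N',
       d.end_date     = SYSDATE
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND    s.address    <> d.address);

-- Step 2: insert a fresh version with a new surrogate key. After step 1, both
-- brand-new customers and changed customers lack a current row, so both arrive here.
INSERT INTO dim_customer
       (customer_sk, customer_id, address, start_date, end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, SYSDATE, NULL, 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flag = 'Y');
```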

TECHNICAL EXPERTISE:

Programming: COBOL, UNIX Shell Scripting and PL/SQL

ETL Tools: Informatica Power Center, Informatica Power Exchange, Informatica Data Quality (Analyst Tool), IBM Datastage, Talend, Ab Initio

Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling

Operating Systems: Windows 7/XP/NT/2000, UNIX and Mainframes

Databases: Oracle, IBM DB2, Teradata, SQL Server, Netezza

Other Technologies: MS Word, MS Excel, MS PowerPoint, MicroStrategy, Control-M, Linux/Unix, PeopleSoft, Oracle BI Applications

PROFESSIONAL EXPERIENCE:

Confidential, Cold Spring, KY

ETL Developer / SQL BI Developer

Responsibilities:

  • Deployed SSIS packages to the Development and Testing environments.
  • Used SSIS to create ETL packages to validate, extract, transform, and load data into data warehouse and data mart databases.
  • Prepared the complete data mapping for all migrated jobs using SSIS.
  • Built packages using transformations such as Lookup, Derived Column, Merge Join, Fuzzy Lookup, For Loop, Foreach Loop, Conditional Split, Union All, and Script Component.
  • Provided logging and error handling for SSIS packages using event handlers and custom logging.
  • Created SSIS Packages and involved in Project configurations and deployments between Development and QA and Production servers.
  • Migrated database objects between environments (Staging, Conversion, Development, and Production)
  • Responded to ad-hoc requests and special projects requiring immediate attention while maintaining deadlines on existing requests.
  • Worked with project manager and technical leads for resolution of technical issues.
  • Developed SSIS packages using a Foreach Loop container in the Control Flow to process all Excel files in a folder, a File System Task to move each file to an archive folder after processing, and an Execute SQL Task to insert transaction-log data into a SQL table (the logging statement is sketched after this list).
  • Migrated data from different sources, including flat files, Excel files, MS Access, and Oracle 10g/11g, to SQL Server 2012 using SSIS.
  • Involved in SSIS Package, DTS Import/Export for transferring data.
  • Imported and exported data in the tables using Toad for Oracle and SSMS for SQL.
  • Developed complex stored procedures using T-SQL to generate ad hoc reports.
  • Created triggers to enforce data and referential integrity on specific tables, preventing invalid inserts and responding to events such as INSERT, UPDATE, and DELETE (a minimal trigger sketch follows this list).
  • Used Microsoft Visual C# in script component of SSIS.
  • Highly proficient in the use of T-SQL for developing complex stored procedures, triggers, tables, user-defined functions, views, indexes, user profiles, and relational database models, as well as query writing and SQL joins.
  • Expert in performance tuning, index handling, and query optimization.
  • Developed complex SSIS and DTS packages for ETL purposes. Implemented complicated transformations in the development of SSIS packages.
  • Identified and resolved problems encountered during development and release of the SSIS code.
  • Developed, Monitored and Deployed SSIS Packages from Development environment to Production environment.
  • Performed Data Extraction, Transforming and Loading (ETL) using various tools such as SQL Server Integration Services (SSIS), Log Shipping, DTS, Bulk Insert and BCP.
  • Experience working with SSIS for ETL processes, ensuring proper implementation of event handlers, logging, checkpoints, transactions, and package configurations.
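
To make the trigger bullet concrete, here is a minimal T-SQL sketch of an integrity-enforcing trigger; the tables and columns (dbo.orders, dbo.customers, customer_id) are hypothetical, not taken from the actual project.

```sql
-- Reject inserts/updates that reference a non-existent customer.
CREATE TRIGGER dbo.trg_orders_validate
ON dbo.orders
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    IF EXISTS (SELECT 1
               FROM   inserted i
               LEFT JOIN dbo.customers c ON c.customer_id = i.customer_id
               WHERE  c.customer_id IS NULL)
    BEGIN
        RAISERROR('Invalid customer_id: statement rolled back.', 16, 1);
        ROLLBACK TRANSACTION;
    END
END;
```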
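
Likewise, a sketch of the per-file transaction-log insert an Execute SQL Task might run; the log table dbo.etl_file_log and its columns are assumptions, and the '?' markers are how an OLE DB-based Execute SQL Task binds SSIS variables.

```sql
-- One log row per processed file (hypothetical dbo.etl_file_log table).
INSERT INTO dbo.etl_file_log (package_name, file_name, rows_loaded, load_status, logged_at)
VALUES (?, ?, ?, ?, GETDATE());
```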

Environment: SSIS, SSRS, SSAS, Microsoft SQL Server 2008, Microsoft SQL Server 2012, Visual Studio 2012, TOAD for Oracle, Oracle 10g/11g, SQL Server Management Studio 11, Microsoft Power BI Desktop, .NET Framework, SQL, T-SQL, PL/SQL, Flat Files, Team Foundation Server

Confidential, Cincinnati, OH

Sr. ETL and PL/SQL Developer

Responsibilities:

  • Interacted with business owners to gather both functional and technical requirements.
  • Documented the business requirements and framed the business logic for the ETL process.
  • Developed technical specifications and other helpful ETL documents following CME Group's standards.
  • Used agile methodology for the SDLC and utilized scrum meetings for creative and productive work.
  • Designed and developed PL/SQL packages, stored procedures, tables, views, indexes, and functions, implementing best practices to maintain optimal performance (a package skeleton is sketched after this list).
  • Designed, developed, and tested Informatica mappings, workflows, worklets, reusable objects, SQL queries, and shell scripts to implement complex business rules.
  • Transferred data from various sources, such as XML, flat files, and DB2, into the Oracle data warehouse.
  • Extensively worked on SCD Type 2 using the Lookup transformation.
  • Identified bottlenecks and issues and fine-tuned them for optimal performance.
  • Oversaw unit and system tests and assisted users with acceptance testing. 
  • Responsible for capturing, reporting, and correcting error data. 
  • Performed many ETL-related tasks, including data cleansing, conversion, and transformations, to load an Oracle 11g-based data warehouse.
  • Worked with DBAs and systems support personnel to elevate and automate successful code to production.
  • Provided on-call support to the production system to resolve any issues.
  • Conducted code walkthroughs and reviewed peer code and documentation.
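
A minimal PL/SQL package skeleton in the spirit of the packages described above; all object names (etl_util, stg_orders, src_orders) are illustrative.

```sql
CREATE OR REPLACE PACKAGE etl_util AS
  PROCEDURE load_stage (p_batch_id IN NUMBER);
END etl_util;
/
CREATE OR REPLACE PACKAGE BODY etl_util AS
  PROCEDURE load_stage (p_batch_id IN NUMBER) IS
  BEGIN
    -- Move unprocessed source rows into the staging table.
    INSERT INTO stg_orders (order_id, amount, batch_id)
    SELECT order_id, amount, p_batch_id
    FROM   src_orders
    WHERE  processed_flag = 'N';
    COMMIT;
  END load_stage;
END etl_util;
/
```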

Environment: Informatica Power Center 10, Informatica Developer 10 (IDQ), SQL, PL/SQL, T-SQL, Oracle, SQL Server, Flat Files, UNIX, Windows, Mainframes, Netezza.

Confidential, Erlanger, Kentucky

Sr. ETL and PL/SQL Developer

Responsibilities:

  • Worked on complete SDLC (Software Development Life Cycle) including system analysis, high level design, detailed design, coding and testing.
  • Involved in the design and development of mappings for loading data into different target tables
  • Clear understanding of Business Intelligence and Data Warehousing concepts.
  • Experience in mapping techniques for Type 1, Type 2 and Type 3 slowly changing dimensions.
  • Extensively used Router, Lookup, Aggregator, Expression, and Update Strategy transformations.
  • Involved in performance tuning of the mappings to reduce load time.
  • Involved in design and development of business requirements in coordination with business users.
  • Created and managed triggers for the smooth flow of data from the source tables to the unboxing tables.
  • Implemented ETL processes using UNIX shell scripts.
  • Performed Code Reviews for Production deployment planning.
  • Identify and document gaps in the data domains and work with end users and analysts for data definition mapping process, functional and non-functional requirements.
  • Developed control files and stored procedures to manipulate and load data into the Oracle database using SQL*Loader (a control-file sketch follows this list).
  • Wrote scripts for collecting statistics, reorganizing tables and indexes, and creating indexes to enhance data-access performance.
  • Involved in performance tuning using Explain Plan and by analyzing schemas, tables, and indexes.
  • Customized the solution using partitioned tables and indexed views.
  • Worked intensively with PL/SQL tables, procedures, and records, and used dynamic SQL and triggers.
  • Wrote cleanup procedures to purge data that was too old.
  • Worked on tasks such as monitoring the number of records received, calculating count thresholds, and sending email from Oracle when a threshold was not met (a sketch follows this list).
  • Analyzed the record count for a given domain, calculated the mean, standard deviation, and upper and lower limits, and illustrated them in Excel charts.
  • Also worked on file-frequency tasks, sending an email from Oracle whenever a client file was delayed and checking the file's status.
  • To overcome performance bottlenecks, implemented alternative approaches such as range partitioning of tables and global temporary tables.
  • Supported the file loading process and resolved the errors involved.
  • Wrote extensive exception-handling procedures.
  • Played an important role in analysis, requirements gathering, functional/technical specification, development, deploying and unit testing.
  • Performed extensive performance tuning and query optimization across these projects.
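
For illustration, a sketch of the kind of SQL*Loader control file referenced above, assuming a comma-delimited feed; the file, table, and column names are hypothetical.

```
LOAD DATA
INFILE 'orders.dat'
APPEND
INTO TABLE stg_orders
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( order_id,
  order_dt   DATE "YYYY-MM-DD",
  amount,
  load_dt    SYSDATE
)
```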
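
And a minimal sketch of the record-count threshold check with email alerting; it assumes Oracle's UTL_MAIL package is installed and SMTP_OUT_SERVER is configured, and the table name, addresses, and threshold are illustrative.

```sql
DECLARE
  v_cnt NUMBER;
BEGIN
  SELECT COUNT(*) INTO v_cnt FROM stg_daily_feed;   -- hypothetical feed table
  IF v_cnt < 1000 THEN                              -- hypothetical threshold
    UTL_MAIL.SEND(
      sender     => 'etl@example.com',
      recipients => 'support@example.com',
      subject    => 'Record count below threshold',
      message    => 'Received ' || v_cnt || ' rows; expected at least 1000.');
  END IF;
END;
/
```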

Environment: Informatica Power Center 9.6.1 (Informatica Designer, Repository Manager, Workflow Manager, Workflow Monitor, Repository Server Administration Console), SQL, PL/SQL, T-SQL, Oracle, SQL Server, Flat Files, UNIX, Windows, Mainframes.

Confidential, Birmingham, AL

Oracle/ ETL Developer

Responsibilities:

  • Worked in multiple projects with agile methodology environment.
  • During the present project's SDLC, was introduced to a Kanban environment and gained good experience with it.
  • Hands-on experience with the JIRA ticketing system (e.g., creating tickets, assigning them, and monitoring their status until closure).
  • Worked with the Outlook email tool and moved to the corporate Gmail system.
  • Hands-on experience with Microsoft Office products.
  • Developed common routine mappings using different transformations like Filter, Router, Connected & Unconnected lookups, Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Warehouse.
  • Used Informatica Source Analyzer, Mapping Designer, Transformation Developer and Warehouse Designer for Extraction, Transformation and Loading.
  • Involved in migrating from Informatica 9.5.1 to Informatica 9.6.1 version.
  • Responsible for monitoring all the sessions that are running, scheduled, completed and failed. Debugged the mapping of the failed session to check the progress of data load.
  • Wrote pre- and post-session UNIX shell scripts wherever required for the ETL processes.
  • Wrote SQL scripts, macros, and stored procedures in Oracle and SQL Server to implement business rules.
  • Wrote PL/SQL code to analyze, compare, consolidate, cross-reference, and correlate data across disparate systems.
  • Worked with Parameters and Variables, Pre SQL/Post SQL, Pre session and post session commands, Emails in Mappings and Sessions.
  • Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
  • Reviewed and tuned the SQL Queries.
  • Maintained Oracle tables, including monthly and daily partitioning, and ran table statistics (a DBMS_STATS sketch follows this list).
  • Wrote SQL scripts for validation test cases to validate the data that loaded using Informatica.
  • Created table partitions, indexes, and views, and wrote scripts to run statistics on the Oracle tables.
  • Involved with Salesforce teams.
  • Wrote PL/SQL stored procedures and Unix scripts as a part of ETL Process.
  • Hands on experience on Oracle 11g/10g PL/SQL Optimization.
  • Created profiles in the Informatica Data Quality Developer tool to profile tables as part of data analysis and made the results accessible to end clients by generating reports.
  • Extensively worked on masking sensitive data (a masking sketch follows this list).
  • Worked on the complete Software Development Life Cycle, from information strategy planning through requirement analysis, design, development, coding, testing, and debugging, to rollout to field users and support for the production environment.
  • As a whole worked on building ETL Processes with sources from Flat Files, Mainframes, DB2, SQL Server and Oracle and played a vital role on the team in getting things done.
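
A sketch of the statistics runs mentioned above using Oracle's DBMS_STATS package; the schema and table names are hypothetical.

```sql
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname          => 'ETL_OWNER',                    -- hypothetical schema
    tabname          => 'FACT_SALES',                   -- hypothetical table
    estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
    cascade          => TRUE);                          -- gather index stats too
END;
/
```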
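
And a deliberately simple illustration of the masking work; the customers table and its columns are assumptions, and a production implementation would rely on a dedicated masking tool or a vetted algorithm rather than ad-hoc SQL.

```sql
-- Mask SSNs and emails in place, keeping only the last four SSN digits.
UPDATE customers
SET    ssn   = 'XXX-XX-' || SUBSTR(ssn, -4),
       email = 'user' || customer_id || '@masked.example';
```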

Environment: Informatica Power Center 8.6.1/9.1/9.5/9.6.1 (Informatica Designer, Repository Manager, Workflow Manager, Workflow Monitor, Repository Server Administration Console), Power Exchange 8.6.1/9.1/9.5, Informatica Data Quality (Developer Tool), Control-M, JIRA, MicroStrategy 9.4.1, Hadoop 1.x, HBase, Hive, AWS, SQL, PL/SQL, Oracle 11g/10g, SQL Server 2008, T-SQL, Flat Files, UNIX, Windows, Mainframes, DB2.

Confidential, New Orleans, LA

Sr. ETL and PL/SQL Developer

Responsibilities:      

  • Worked on complete SDLC (Software Development Life Cycle) including system analysis, high level design, detailed design, coding and testing. 
  • Analyzed functional requirements and developed technical designs and specs.
  • Developed common routine mappings using different transformations such as Filter, Router, Connected & Unconnected Lookups, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the Data Warehouse.
  • Involved in the design and development of mappings for loading data into different target tables 
  • Clear understanding of Business Intelligence and Data Warehousing concepts. 
  • Experience in mapping techniques for Type 1, Type 2 and Type 3 slowly changing dimensions. 
  • Extensively used Router, Lookup, Aggregator, Expression, and Update Strategy transformations.
  • Involved in performance tuning of the mappings to reduce load time.
  • Involved in design and development of business requirements in coordination with business users.
  • Created and managed triggers for the smooth flow of data from the source tables to the unboxing tables.
  • Implemented ETL processes using UNIX shell scripts.
  • Performed Code Reviews for Production deployment planning. 
  • Identify and document gaps in the data domains and work with end users and analysts for data definition mapping process, functional and non-functional requirements. 
  • Developed control files and stored procedures to manipulate and load data into the Oracle database using SQL*Loader.
  • Wrote scripts for collecting statistics, reorganizing tables and indexes, and creating indexes to enhance data-access performance.
  • Involved in performance tuning using Explain Plan and by analyzing schemas, tables, and indexes.
  • Customized the solution using partitioned tables and indexed views.
  • Worked intensively with PL/SQL tables, procedures, and records, and used dynamic SQL and triggers.
  • Wrote cleanup procedures to purge data that was too old.
  • Worked on tasks such as monitoring the number of records received, calculating count thresholds, and sending email from Oracle when a threshold was not met.
  • Analyzed the record count for a given domain, calculated the mean, standard deviation, and upper and lower limits, and illustrated them in Excel charts.
  • Also worked on file-frequency tasks, sending an email from Oracle whenever a client file was delayed and checking the file's status.
  • To overcome performance bottlenecks, implemented alternative approaches such as range partitioning of tables and global temporary tables (a partitioning sketch follows this list).
  • Supported the file loading process and resolved the errors involved.
  • Wrote extensive exception-handling procedures (a minimal handler sketch follows this list).
  • Played an important role in analysis, requirements gathering, functional/technical specification, development, deploying and unit testing. 
  • Performed extensive performance tuning and query optimization across these projects.
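
A sketch of the range partitioning used to relieve scan-heavy bottlenecks; the fact_txn table and its partition boundaries are hypothetical.

```sql
CREATE TABLE fact_txn (
  txn_id  NUMBER,
  txn_dt  DATE,
  amount  NUMBER
)
PARTITION BY RANGE (txn_dt) (
  PARTITION p2014 VALUES LESS THAN (DATE '2015-01-01'),
  PARTITION p2015 VALUES LESS THAN (DATE '2016-01-01'),
  PARTITION pmax  VALUES LESS THAN (MAXVALUE)  -- catch-all for future dates
);
```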
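
And a minimal PL/SQL exception-handling pattern in the spirit of the procedures above; the load step and the etl_error_log table are assumptions.

```sql
BEGIN
  INSERT INTO tgt_orders SELECT * FROM stg_orders;   -- hypothetical load step
EXCEPTION
  WHEN OTHERS THEN
    -- Record the Oracle error, then re-raise so the scheduler sees the failure.
    INSERT INTO etl_error_log (err_code, err_msg, logged_at)
    VALUES (SQLCODE, SUBSTR(SQLERRM, 1, 4000), SYSDATE);
    COMMIT;
    RAISE;
END;
/
```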

Environment: Informatica Power Center 9.6 (Informatica Designer, Repository Manager, Workflow Manager, Workflow Monitor, Repository Server Administration Console), SQL, PL/SQL, T-SQL, Oracle, SQL Server, Flat Files, UNIX, Windows, Mainframes.

Confidential, Birmingham

Sr. ETL Lead with PL/SQL

Responsibilities:   

  • Used ETL tools such as Informatica, IBM Datastage, and Talend for data obfuscation of different applications across the bank (a deterministic masking sketch follows this list).
  • Developing common routine mappings using different transformations such as Filter, Router, Connected & Unconnected Lookups, Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Warehouse.
  • Using Informatica Source Analyzer, Mapping Designer, Transformation Developer and Warehouse Designer for Extraction, Transformation and Loading.
  • Querying and analyzing multiple databases and handling the errors as per the client specifications.
  • Developed multiple jobs where the source was a COBOL file, performed a number of validations, and moved the data to the Oracle environment.
  • Extracted data from mainframe source systems to flat files.
  • Five years of experience using DAC administration console for testing and performance tuning of workflows
  • Provide daily production support for PeopleSoft payroll 
  • Developed Application Engine programs, Component Interfaces to load data from external systems into PeopleSoft System. 
  • Worked to design, develop, and maintain a tax processing management system utilizing a fixed tax infrastructure
  • Experience with tax interface applications
  • Responsible for monitoring all sessions that were running, scheduled, completed, and/or failed; debugged the mappings of failed sessions to check the progress of data loads.
  • Wrote pre- and post-session UNIX shell scripts wherever required for ETL processes.
  • Wrote complex stored procedures in Oracle and SQL Server to implement business rules.
  • Used PL/SQL code to analyze, compare, consolidate, cross-reference, and correlate data across disparate systems.
  • Worked with logical parameter files, variables, Pre-SQL/Post-SQL, pre-session and post-session commands, email tasks in workflows.
  • Worked with complex copybooks that had several OCCURS and REDEFINES to import into Power Center using Power Exchange.
  • Used shortcuts to reuse objects without creating multiple objects in the repository and inheriting changes made to the source automatically.
  • Reviewed and tuned SQL queries.
  • Created Change Man packages to move mainframe jobs from development to production.
  • Wrote PL/SQL stored procedures and Unix scripts as part of the ETL process.
  • Administered and tuned SQL Server and Oracle database-level properties as well as SQL queries.
  • Created profiles in the Informatica Data Quality Developer tool to profile tables as part of data analysis and helped end clients by generating reports.
  • Extensively worked on masking sensitive data.
  • Used Talend Open Studio for Data Integration and IBM Datastage tools as part of the obfuscation process for different applications across the bank. Since we were the only obfuscation team for the whole bank, I was able to learn different tools and environments.
  • Debugging and troubleshooting complex Talend and IBM Datastage ETL processes.
  • Extensively worked on code migration from development to production.
  • Worked on the complete Software Development Life Cycle, from information strategy planning through requirement analysis, design, development, coding, testing, and debugging, to rollout to field users and support for the production environment.
  • As a whole worked on building ETL processes using Informatica, IBM Datastage and Talend with sources from flat files, Mainframes, DB2, Teradata, SQL Server and Oracle, and played a vital role on the team in getting things done.
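
One deterministic obfuscation technique consistent with the masking work above, sketched with Oracle's ORA_HASH so the same source value always masks to the same output and joins between obfuscated tables keep working; the accounts table is hypothetical, and a production scheme would also guard against hash collisions.

```sql
UPDATE accounts
SET    account_no = LPAD(TO_CHAR(ORA_HASH(account_no, 99999999)), 8, '0');
-- Identical inputs hash identically, so referential relationships survive masking.
```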

Environment: Informatica Power Center 8.6.1/9.1 (Informatica Designer, Repository Manager, Workflow Manager, Workflow Monitor, Repository Server Administration Console), TALEND 5.5, Power Exchange 8.6.1/9.1, Informatica Data Quality (Developer Tool), IBM Datastage 8.5/9.1, Control-M, SQL, PL/SQL, Oracle 11g/10g, SQL Server 2008, Flat Files, UNIX, Windows, Mainframes, DB2, Teradata.

Confidential, Birmingham

ETL Developer

Responsibilities:

  • Worked across several teams supporting the Informatica Development effort using Power Center and Power Exchange.
  • Worked with Business Analysts to translate business requirements into technical specifications.
  • Extensively defined and used parameters and variables in workflows for many ETL jobs (a parameter-file sketch follows this list).
  • Troubleshot databases, workflows, mappings, sources, and targets to find bottlenecks and improve performance.
  • Tested the ETL mappings and resolved defects logged by QA in Team Tracker.
  • Analyzed the application's entire code base and developed an application information document describing its business rules.
  • Responsible for monitoring all the sessions that are running, scheduled, completed and failed. Also debugged mappings of failed sessions to check the progress of data loads.
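
A sketch of an Informatica parameter file of the kind referenced above; the folder, workflow, session, connection, and parameter names are all hypothetical.

```
[ProjectFolder.WF:wf_daily_load.ST:s_m_load_orders]
$$LoadDate=2015-01-31
$DBConnection_Src=ORA_SRC
$InputFile1=/data/inbound/orders.dat
```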

Environment: Informatica Power Center 8.x, Informatica Power Exchange 8.x, Teradata, Oracle, PL/SQL, SQL Server, TOAD, flat files, UNIX, and Windows.
