Data Analyst/SQL Developer Resume
St. Louis, MO
SUMMARY
- 8+ years of extensive experience in ETL (Extract, Transform, Load), Data Integration, Data Profiling, Data Mapping, and Data Warehousing using Informatica, Teradata, and Oracle technologies.
- 6+ years of experience in Data Analysis and Data Profiling using Teradata SQL.
- 2+ years of experience in Data Modeling using Erwin tool.
- Experience in Data Modeling involving both logical and physical modeling using the DM tools Erwin and ER Studio. Created both 3NF and dimensional data models, including star schemas.
- Proficient with the concepts of Data Warehousing, Data Marts, ER Modeling, Dimensional Modeling, and Fact and Dimension tables, using the data modeling tools Erwin and ER Studio.
- Experience in Data Analysis, Data Profiling and Data Mapping. Used SQL and Data Profiler tools.
- Strong understanding of scheduling Informatica mappings using Informatica Workflow Manager.
- Extensively created and used Teradata SET tables, MULTISET tables, global temporary tables, and volatile tables (see the table-type sketch at the end of this list).
- Extensively used Teradata features such as BTEQ, FastLoad, MultiLoad, SQL Assistant, and DDL and DML commands.
- Very good understanding of Teradata UPI and NUPI, secondary indexes and join indexes.
- Experience in creating 3NF and dimensional models using the data modeling tools Erwin and Sybase PowerDesigner.
- Technical and functional experience in data warehouse ETL implementations using Informatica Power Center 9.0.1/8.6/8.1/7.1, Teradata, Oracle 10g/9i/8i, and MS SQL Server 2008/2005/2000 in the Finance, Health Insurance, and Pharmacy domains.
- Automated BTEQ report generation on weekly and monthly schedules using Unix scheduling tools. Well versed in Explain plans, confidence levels, and database skew. Knowledge of query performance tuning using Explain, Collect Statistics, Compression, NUSI, and Join Indexes, including Sparse Indexes.
- Extensively worked with Teradata Viewpoint for performance monitoring and tuning. Well versed with the Teradata Analyst Pack, including Statistics Wizard, Index Wizard, and Visual Explain.
- Experience in Mainframe applications development. Mainframe skills include TSO, COBOL II, JCL, DB2, SQL, SPUFI, QMF, IMS, IDMS, CICS and VSAM.
- Extensive knowledge in Business Intelligence and Data Warehousing Concepts with emphasis on ETL and System Development Life Cycle (SDLC).
- Working knowledge of data warehousing concepts such as Star and Snowflake schemas, Data Marts, and the Kimball methodology used in relational, dimensional, and multidimensional data modeling.
- Extensive knowledge on Data Profiling using Informatica Developer tool.
- Implemented Slowly Changing Dimension (SCD) Types I, II, and III to preserve the full history of account and transaction information; designed and developed Change Data Capture (CDC) solutions that capture and analyze changes from daily feeds to maintain history tables (see the SCD sketch at the end of this list).
- Proficient in designing and developing ETL objects in Informatica PowerCenter using transformations such as Joiner, Aggregator, Expression, SQL, Lookup, Filter, Update Strategy, Stored Procedure, Router, Rank, and Normalizer.
- Extensively worked with PMON/Viewpoint for Teradata performance monitoring and tuning. Experience in programming with SQL and PL/SQL (stored procedures, functions, cursors, and database triggers).
- Very good experience in Oracle database application development using Oracle 10g/9i/8i, SQL, PL/SQL, and SQL*Loader.
- Strong SQL experience in Teradata from developing the ETL with Complex tuned queries including analytical functions and BTEQ scripts.
- Extensively used mapping variables, mapping parameters, and dynamic parameter files for improved performance and flexibility; also worked with XML sources and targets.
- Data Processing Experience in Designing and Implementing Data Mart applications, mainly Transformation Process using Informatica.
- Developing workflows with Worklets, Event waits, Assignments, Conditional flows, Email and Command Tasks using Workflow Manager.
- Knowledge of push down optimization concepts and tuning Informatica objects for optimum execution timelines.
- Experienced with identifying Performance bottlenecks and fixing code for Optimization in Informatica and Oracle.
- Created UNIX shell scripts for Informatica pre- and post-session operations, database administration, and day-to-day activities such as monitoring network connections and running database ping utilities.
- Extensive experience in implementation of Data Cleanup Procedures, Transformation, Scripts, Stored Procedures and execution of Test plans for loading the data successfully into Targets.
- Created checklists for coding, testing, and release to ensure a smooth, error-free project flow.
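A minimal sketch of the Teradata table types referenced above (all table and column names are illustrative, not from any actual engagement):

    -- MULTISET table: allows duplicate rows; the PRIMARY INDEX drives
    -- row distribution across AMPs
    CREATE MULTISET TABLE stg_sales (
        sale_id   INTEGER,
        store_id  INTEGER,
        sale_amt  DECIMAL(12,2),
        sale_dt   DATE
    ) PRIMARY INDEX (sale_id);

    -- Volatile table: session-scoped work table, dropped automatically at logoff
    CREATE VOLATILE TABLE vt_daily_totals AS (
        SELECT store_id, sale_dt, SUM(sale_amt) AS total_amt
        FROM stg_sales
        GROUP BY store_id, sale_dt
    ) WITH DATA
    PRIMARY INDEX (store_id)
    ON COMMIT PRESERVE ROWS;

A SET table is declared the same way with the SET keyword and silently rejects exact duplicate rows.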
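Likewise, a hedged sketch of the SCD Type II pattern described above (dim_account, stg_account, and their columns are hypothetical):

    -- Step 1: expire the current version of any account whose tracked
    -- attribute changed in today's feed
    UPDATE dim_account
    SET eff_end_dt = CURRENT_DATE - 1,
        current_flag = 'N'
    WHERE current_flag = 'Y'
      AND acct_id IN (
          SELECT s.acct_id
          FROM stg_account s
          JOIN dim_account d
            ON d.acct_id = s.acct_id
           AND d.current_flag = 'Y'
          WHERE d.acct_status <> s.acct_status);

    -- Step 2: insert a fresh current row for new accounts and for the
    -- accounts just expired in Step 1
    INSERT INTO dim_account
        (acct_id, acct_status, eff_start_dt, eff_end_dt, current_flag)
    SELECT s.acct_id, s.acct_status, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM stg_account s
    LEFT JOIN dim_account d
      ON d.acct_id = s.acct_id
     AND d.current_flag = 'Y'
    WHERE d.acct_id IS NULL;

Type I would simply overwrite the attribute in place; Type III would carry the prior value in a separate column.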
TECHNICAL SKILLS
Primary Tools: Informatica Power Center 9.5.1/9.0.1/8.6/8.1, Ab Initio (Co>Op 3.0.3.9/2.15/2.14, GDE 3.0.4/1.15/1.14), Teradata SQL, Teradata Tools and Utilities, Oracle 10g/9i, MS SQL Server 6.5/7.0/2000, DB2
Languages: Teradata SQL, Unix Shell Scripting
Teradata Utilities: BTEQ, FastLoad, MultiLoad, TPump, SQL Assistant, Teradata Manager
Databases: Teradata 14/13.10/13/12/V2R6.2, Oracle 10g/9i
Operating Systems: Windows 95/98/NT/2000/XP, UNIX, Linux, NCR MP-RAS UNIX
Scheduling tools: Control M, Clear Quest
Data Modeling: Erwin 7.3, ER Studio
PROFESSIONAL EXPERIENCE
Confidential, St. Louis, MO
Data Analyst/SQL Developer
Environment: Informatica Power Center 9.1/8.6, Teradata 14/13.10, Unix, Oracle 10g, Control-M, TOAD, SQL Assistant, Unix Shell Scripting, Erwin 7.3, Autosys, ETL/ELT.
Responsibilities:
- Involved in full Software Development Life Cycle (SDLC) - Business Requirements Analysis, preparation of Technical Design documents, Data Analysis, Logical and Physical database design, Coding, Testing, Implementing, and deploying to business users.
- Providing technical support and guidance to the offshore team to address complex business problems.
- Involved in gathering business requirements, logical modeling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.
- Created Data Models for multiple subject areas involving star and snowflake schemas using Erwin. Involved in Logical to physical data model conversion, Forward Engineering, Reverse Engineering, DDL generation.
- Performed data profiling and data analysis using SQL queries, looking for data issues and anomalies.
- Involved in ETL data mapping from source to target tables and worked on the ETL scripts themselves.
- Defining the schema, staging tables, and landing zone tables; configuring base objects, foreign-key relationships, and complex joins; and building efficient views.
- Expertise in writing scripts for data extraction, transformation, and loading from legacy systems to the target data warehouse using BTEQ, FastLoad, MultiLoad, and TPump.
- Performed query optimization with the help of Explain plans, Collect Statistics, and Primary and Secondary indexes. Used volatile tables and derived queries to break up complex queries into simpler ones. Streamlined the migration process for Teradata scripts and shell scripts on the UNIX box.
- Worked on the Informatica Power Center tools - Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Implemented Pushdown Optimization (PDO) to resolve performance issues in complex mappings whose numerous transformations degraded session performance.
- Using various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
- Developed new mappings and modified existing ones for new business requirements, loading data into staging tables and then into target tables in the EDW. Also created mapplets for reuse across mappings.
- Working on different workflow tasks such as Session, Event Raise, Event Wait, Email, and Command tasks, plus worklets and workflow scheduling.
- Creating sessions, configuring workflows to extract data from various sources, transforming data, and loading into enterprise data warehouse.
- Running and monitoring daily scheduled jobs using Work Load Manager to support EDW (Enterprise Data Warehouse) loads for both history and incremental data.
- Design, Development and Documentation of the ETL (Extract, Transformation & Load) strategy to populate the Data Warehouse from the various source systems.
- Prepared data marts on policy data, policy coverage, claims data, client data and risk codes.
- Extensively used Informatica PowerCenter 8.6 to create and manipulate source definitions, target definitions, mappings, mapplets, transformations, re-usable transformations, etc.
- Involved in designing and developing complex ETL mappings and stored procedures in an optimized manner. Used PowerExchange for mainframe sources.
- Involved in loading data from source tables to the ODS (Operational Data Store) and XML files, using transformation and cleansing logic in Informatica.
- Based on the logic, used transformations such as Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator, Joiner, XML, and Stored Procedure in the mappings.
- Involved in performance tuning of mappings, transformations, and sessions to optimize session performance.
- Developed Informatica SCD Type I, Type II, and Type III mappings and tuned them for better performance. Extensively used almost all Informatica transformations, including complex Lookups, Stored Procedures, Update Strategy, and mapplets.
- Snowflake Schema was mainly used with Geography, Customer, Product, and Time as basic dimensions.
- Created test cases for unit, system, integration, and UAT testing to check data quality.
- Investigating failed jobs and writing SQL to debug data load issues in Production.
- Writing SQL Scripts to extract the data from Database and for Testing Purposes.
- Automated the ETL process using UNIX Shell scripting.
- Performed tuning and optimization of complex SQL queries using Teradata Explain and Run stats.
- Created a BTEQ script to pre-populate the work tables prior to the main load process (see the BTEQ sketch following this list).
- Extensively used Derived Tables, Volatile Table and Global Temporary tables in many of the ETL scripts.
- Developed MLOAD scripts to load data from Load Ready Files to Teradata Warehouse.
- Performance Tuning of sources, Targets, mappings and SQL queries in transformations.
- Created Primary Indexes (PIs) both for planned access paths and for even distribution of data across all available AMPs (see the DDL sketch following this list).
- Created appropriate Teradata NUSIs for fast, easy access to data.
- Worked on exporting data to flat files using Teradata FastExport.
- Analyzed the Data Distribution and Reviewed the Index choices.
- In-depth expertise in the Teradata cost-based query optimizer; identified potential bottlenecks.
- Worked with PPI Teradata tables and was involved in Teradata-specific SQL fine-tuning to increase performance of the overall ETL process.
- Implemented the project using Agile (Scrum) methodology.
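A minimal DDL sketch of the index choices described in the bullets above (the claim table and its columns are illustrative):

    -- NUPI on claim_id spreads rows evenly across the AMPs, while RANGE_N
    -- partitioning (PPI) lets the optimizer eliminate partitions for
    -- date-bounded queries
    CREATE MULTISET TABLE edw.claim_fact (
        claim_id   INTEGER NOT NULL,
        member_id  INTEGER,
        claim_dt   DATE NOT NULL,
        claim_amt  DECIMAL(15,2)
    )
    PRIMARY INDEX (claim_id)
    PARTITION BY RANGE_N (claim_dt BETWEEN DATE '2010-01-01'
                                       AND DATE '2014-12-31'
                                       EACH INTERVAL '1' MONTH);

    -- NUSI to support frequent member-level lookups without full-table scans
    CREATE INDEX idx_claim_member (member_id) ON edw.claim_fact;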
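And a hedged sketch of a BTEQ pre-population script of the kind mentioned above (logon details, databases, and tables are placeholders; the dot-commands are BTEQ directives wrapped around ordinary SQL):

    .LOGON tdprod/etl_user,password;

    -- Clear and pre-populate the work table before the main load
    DELETE FROM work_db.wrk_policy;

    INSERT INTO work_db.wrk_policy (policy_id, policy_status, load_dt)
    SELECT policy_id, policy_status, CURRENT_DATE
    FROM stage_db.stg_policy
    WHERE load_dt = CURRENT_DATE;

    -- Abort with a non-zero return code (visible to the Unix scheduler)
    -- if the insert failed
    .IF ERRORCODE <> 0 THEN .QUIT 8;

    .LOGOFF;
    .QUIT 0;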
Confidential, Lansing, MI
Data Analyst
Environment: Teradata 14.0/13.0, BTEQ, FastLoad, MultiLoad, FastExport, Teradata SQL Assistant, OBIEE 11g/10g, DB2, ERwin R7.3, IBM Mainframes MVS/OS, JCL, TSO/ISPF, Changeman, SPUFI, FileAid, COBOL, ZEKE, UNIX, FTP.
Responsibilities:
- Involved in understanding the Requirements of the End Users/Business Analysts and developed strategies for ETL processes.
- Extracted data from the DB2 database on mainframes and loaded it into SET and MULTISET tables in the Teradata database using various Teradata load utilities. Transferred large volumes of data using Teradata FastLoad, MultiLoad, and TPump.
- Architected and developed FastLoad and MultiLoad scripts; developed macros and stored procedures to extract data, and BTEQ scripts that take a date range from the database to drive the extracts.
- Created JCL scripts for calling and executing BTEQ, FastExport, FastLoad, and MultiLoad scripts.
- Responsible for requirements gathering for an enhancement requested by the client. Involved in analysis and implementation of an intranet-based Management Information System.
- Provided maintenance and support of Online and Batch Programs using COBOL, DB2, CICS, JCL.
- Writing queries using SPUFI to extract data from various DB2 views for reporting purposes.
- Converted table data from the DB2 region to the Teradata region using the FastLoad and MultiLoad utilities.
- Responsible for Coding, Unit Test Plans, Unit Test Results, Functional Testing and Regression Testing.
- Synchronized all regions (PCR, Unit, and System) while migrating changes from lower regions to acceptance.
- Wrote several DB2 Stored Procedure scripts to implement the business logic.
- Handling Ad-Hoc Report Requests.
- Reviewing programs for QA and Testing.
- Environment: Z/OS, COBOL II, JCL, CICS, Teradata, VSAM, DB2, Streamweaver, CHANGEMAN, Easytrieve, FileAid, TSO/ISPF, SPUFI, QMF, Xpeditor, MS Products.
- Developed Teradata BTEQ scripts to implement the business logic and worked on exporting data using Teradata FastExport.
- Wrote highly complex SQL to pull data from the Teradata EDW and created ad hoc reports for key business personnel within the organization.
- Created data models for information systems by applying formal data modeling techniques.
- Strong expertise in physical modeling, with knowledge of Primary, Secondary, PPI, and Join Indexes.
- Designed Fact tables and Dimension tables for star schemas and snowflake schemas using ERWIN tool and used them for building reports.
- Performed reverse engineering of physical data models from databases and SQL scripts.
- Provided database implementation and database administrative support for custom application development efforts.
- Performance tuning and optimization of database configuration and application SQL using Explain plans and statistics collection based on UPI, NUPI, USI, and NUSI (see the statistics sketch following this list).
- Developed OLAP reports and Dashboards using the Business intelligence tool - OBIEE.
- Involved in comprehensive end-to-end testing: unit testing, system integration testing, user acceptance testing, and regression testing.
- Provided 24/7 on-call production support for various applications, resolving night-time production job abends and attending conference calls with business operations and system managers to resolve issues.
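A brief sketch of the statistics-driven tuning referenced above (object names are illustrative):

    -- Inspect the optimizer's plan, row estimates, and confidence levels
    EXPLAIN
    SELECT m.member_id, COUNT(*)
    FROM edw.claim_fact c
    JOIN edw.member_dim m
      ON m.member_id = c.member_id
    GROUP BY m.member_id;

    -- Collect statistics on join and index columns so the optimizer can
    -- move from "no confidence" or "low confidence" to "high confidence"
    COLLECT STATISTICS ON edw.claim_fact COLUMN (member_id);
    COLLECT STATISTICS ON edw.claim_fact INDEX (claim_id);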
Confidential, Richmond, VA
Informatica/Teradata Developer
Environment: Teradata 12.0 (FastLoad, MultiLoad, FastExport, BTEQ), Teradata SQL Assistant, Informatica Power Center 8.5, UNIX, SQL, PL/SQL, Work Load Manager, MS Access
Responsibilities:
- Involved in full Software Development Life Cycle (SDLC) - Business Requirements Analysis, preparation of Technical Design documents, Data Analysis, Logical and Physical database design, Coding, Testing, Implementing, and deploying to business users.
- Providing technical support and guidance to the offshore team to address complex business problems.
- Involved in gathering business requirements, logical modeling, physical database design, data sourcing and data transformation, data loading, SQL, and performance tuning.
- Defining the schema, staging tables, and landing zone tables; configuring base objects, foreign-key relationships, and complex joins; and building efficient views.
- Expertise in writing scripts for data extraction, transformation, and loading from legacy systems to the target data warehouse using BTEQ, FastLoad, MultiLoad, and TPump.
- Performed query optimization with the help of Explain plans, Collect Statistics, and Primary and Secondary indexes. Used volatile tables and derived queries to break up complex queries into simpler ones (see the sketch following this list). Streamlined the migration process for Teradata scripts and shell scripts on the UNIX box.
- Worked on the Informatica Power Center tools - Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Using various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
- Developed new mappings and modified existing ones for new business requirements, loading data into staging tables and then into target tables in the EDW. Also created mapplets for reuse across mappings.
- Working on different workflow tasks such as Session, Event Raise, Event Wait, Email, and Command tasks, plus worklets and workflow scheduling.
- Creating sessions, configuring workflows to extract data from various sources, transforming data, and loading into enterprise data warehouse.
- Running and monitoring daily scheduled jobs using Work Load Manager to support EDW (Enterprise Data Warehouse) loads for both history and incremental data.
- Investigating failed jobs and writing SQL to debug data load issues in Production.
- Writing SQL Scripts to extract the data from Database and for Testing Purposes.
- Interacting with the Source Team and Business to get the Validation of the data.
- Involved in transferring the processed files from the mainframe to the target system.
- Supported the code after post-production deployment.
- Familiar with Agile software methodologies (scrum).
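A minimal sketch of the volatile-table decomposition mentioned above (names are illustrative): an expensive aggregate is materialized once, then joined, instead of being recomputed inside one large query.

    -- Step 1: materialize the aggregate in a session-scoped volatile table
    CREATE VOLATILE TABLE vt_acct_summary AS (
        SELECT acct_id, MAX(txn_dt) AS last_txn_dt, SUM(txn_amt) AS total_amt
        FROM edw.txn_fact
        GROUP BY acct_id
    ) WITH DATA
    PRIMARY INDEX (acct_id)
    ON COMMIT PRESERVE ROWS;

    -- Step 2: the final join reads the small summary table instead of
    -- re-aggregating the full fact table
    SELECT a.acct_id, a.acct_name, s.last_txn_dt, s.total_amt
    FROM edw.acct_dim a
    JOIN vt_acct_summary s
      ON s.acct_id = a.acct_id;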
Confidential, Louisville, KY
Teradata /Informatica ETL Developer
Environment: Informatica 8.5 (Designer, Repository Manager, Workflow Manager, Workflow Monitor), Teradata 12, UNIX, Citrix, Toad, PL/SQL Developer
Responsibilities:
- Analyzed, designed, and developed ETL strategies and processes; wrote ETL specifications; performed Informatica development and administration; and mentored other team members.
- Developed mapping parameters and variables to support SQL override.
- Used transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, SQL, and Lookup (file and database) to develop robust mappings in the Informatica Designer.
- Worked on Teradata and its utilities (TPump, FastLoad) through Informatica; also created complex Teradata macros.
- Implemented Pushdown Optimization (PDO) to resolve performance issues in complex mappings whose numerous transformations degraded session performance.
- Involved in Performance tuning at source, target, mappings, sessions, and system levels.
- Exhaustive testing of developed components.
- Worked on the various enhancements activities, involved in process improvement.
- Used the Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Transformation Developer, Workflow Manager, and Workflow Monitor.
- Worked on Change Data Capture (CDC) using CHKSUM to handle changes in the data when no flag or date column is present to mark changed rows (see the CDC sketch following this list).
- Worked on reusable code known as tie-outs to maintain data consistency: it compares source and target after the ETL load completes to validate that no data was lost during the ETL process.
- Worked on Ab Initio to replicate the existing code in Informatica.
- Implemented Teradata MERGE statements to update huge tables, improving the performance of the application (see the MERGE sketch following this list).
- Used reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
- Performed unit testing at various levels of the ETL and actively involved in team code reviews.
- Created shell scripts to fine tune the ETL flow of the Informatica workflows.
- Migrated the code into QA (testing) and supported the QA team and UAT (users).
- Working with Power Center versioning (check-in, check-out), querying the repository to retrieve specific objects, and maintaining the history of objects.
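A hedged sketch of the checksum-style CDC described above (tables and columns are hypothetical; Teradata's HASHROW is used here as the hashing function, since the CHKSUM named in the bullet may refer to a comparable hash):

    -- Rows whose hashed column signature differs between staging and target
    -- have changed, even without a flag or date column to mark them
    SELECT s.cust_id
    FROM stg_customer s
    JOIN tgt_customer t
      ON t.cust_id = s.cust_id
    WHERE HASHROW(s.cust_name, s.cust_addr, s.cust_phone)
       <> HASHROW(t.cust_name, t.cust_addr, t.cust_phone);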
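And a minimal sketch of the Teradata MERGE upsert pattern mentioned above (names are illustrative):

    -- Single-pass upsert: update matched rows, insert new ones; on Teradata
    -- the ON clause must cover the target's primary index (acct_id here)
    MERGE INTO edw.acct_dim AS t
    USING stg_account AS s
       ON t.acct_id = s.acct_id
    WHEN MATCHED THEN UPDATE
         SET acct_status = s.acct_status,
             updt_dt = CURRENT_DATE
    WHEN NOT MATCHED THEN INSERT
         (acct_id, acct_status, updt_dt)
         VALUES (s.acct_id, s.acct_status, CURRENT_DATE);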