ETL Informatica Developer/Data Engineer Resume
Seattle, WA
SUMMARY:
- 9 years of experience with Informatica PowerCenter 10.4.0/9.x/8.x and Informatica Data Quality (IDQ) as an ETL developer across the full Software Development Life Cycle (SDLC): requirements gathering, analysis, application design, development, testing, implementation, system maintenance, documentation, and support of data warehousing applications.
- 7 years of experience in requirements gathering, design, development, implementation, and testing of data warehousing and business intelligence applications.
- Database experience with Oracle 11g/10g, DB2, MS SQL Server, Teradata, and MySQL, including PL/SQL development and query tuning.
- Experience with Informatica PowerCenter 10.4.0/9.x/8.x and IDQ 9.6.1/10.1.1 against Teradata, SQL Server, and Oracle databases.
- Experienced in debugging and performance tuning of Informatica mappings, sessions, and workflows.
- Experience developing ETL programs to support data extraction, transformation, and loading using Informatica PowerCenter.
- Hands-on experience identifying and resolving performance bottlenecks at various levels (sources, transformations, targets) of Extract, Transform and Load (ETL) processes, with a strong understanding of OLTP and OLAP concepts.
- Experience in data modeling and data analysis using dimensional and relational data modeling, star schema/snowflake modeling, fact and dimension tables, and physical and logical data models.
- Good working knowledge of both Agile and Waterfall software development methodologies.
- Experienced in Oracle SQL and PL/SQL programming: stored procedures, packages, functions, triggers, views, materialized views, cursors, and SQL query tuning.
- Experience writing daily batch jobs and complex UNIX shell scripts (Korn and Bash) for ETL automation in Unix/Linux environments.
- Experience with Teradata utilities such as FastLoad, MultiLoad, TPump, BTEQ, and Teradata Parallel Transporter (TPT); highly experienced in Teradata SQL programming, performance tuning, and handling very large data volumes.
- Proficient in implementing complex business rules through different kinds of Informatica transformations, Workflows/Worklets and Mappings/Mapplets.
- Optimized and updated logical & physical data models to support new and existing projects. Maintained conceptual, logical, and physical data models along with corresponding metadata.
- Knowledge of big data tools and Hadoop ecosystem components such as MapReduce, HDFS, Hive, Sqoop, Apache Spark, and Kafka.
- Strong knowledge in RDBMS concepts, Data Modeling (Facts and Dimensions, Star/Snowflake Schemas), Data Migration, Data Cleansing and ETL Processes.
- Knowledge of maintaining and running Control-M and Control-M Enterprise Manager software that supports automated batch scheduling in the data center.
- Knowledge of analyzing user requirements, procedures, and problems to automate processing or improve existing job flows and scheduling systems.
- Experience in Agile methodologies and good experience with Procedural Language/Structured Query Language (PL/SQL).
- Performed data conversion from flat files to Oracle using Oracle SQL*Loader (see the SQL*Loader sketch after this summary).
- Experience working with Informatica Intelligent Cloud Services (IICS), using it effectively for data integration and data migration from multiple source systems into Azure SQL Data Warehouse.
- Experience with IICS concepts including data integration, Monitor, Administrator, deployments, permissions, and schedules.
- Designed and developed approaches to acquire data from new sources like Mainframe (DB2), and AS400 (DB2).
- Maintained documentation for all processes implemented.
- Delivered efficient ELT/ETL solutions for the Revenue per Customer per Month Data Enablement project.
- Experienced in Informatica ETL Developer role in Data Warehouse projects, Enterprise Data Warehouse, OLAP and dimensional data modeling.
- Good understanding of AWS architecture, EC2, S3 buckets, and AWS CLI commands.
- Used SQL queries to identify data issues, data fixes, manual extracts, etc.
- Data Researching, Data Collection, Quality Assurance, Analysis and Problem-Solving Skills.
- Worked with offshore teams to test the jobs that were developed.
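The SQL*Loader conversion mentioned above can be illustrated with a minimal Python sketch. This is a hedged example only: the table, file, and connection names are hypothetical placeholders, not the actual project objects.

```python
#!/usr/bin/env python3
"""Minimal sketch: flat-file-to-Oracle load via SQL*Loader (placeholder names)."""
import subprocess
from pathlib import Path

# Hypothetical control file: comma-delimited flat file into a staging table.
CTL = """\
LOAD DATA
INFILE 'customers.csv'
APPEND INTO TABLE stg_customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(customer_id, customer_name, signup_date DATE "YYYY-MM-DD")
"""

def load_flat_file() -> None:
    Path("customers.ctl").write_text(CTL)            # write the control file
    result = subprocess.run(
        ["sqlldr", "userid=etl_user/etl_pwd@ORCLPDB",  # placeholder credentials/DB
         "control=customers.ctl", "log=customers.log", "bad=customers.bad"],
        capture_output=True, text=True,
    )
    if result.returncode != 0:                        # sqlldr returns non-zero on errors
        raise RuntimeError(f"SQL*Loader failed:\n{result.stdout}\n{result.stderr}")

if __name__ == "__main__":
    load_flat_file()
```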
TECHNICAL SKILLS:
Tools: Informatica PowerCenter, Informatica PowerExchange, IICS, Informatica Data Quality 10.2.2 and Informatica Analyst 10.2.1, Informatica BDM, Talend
Languages: XML, UML, HTML, C, C++, UNIX Shell Scripting, Python 3.7, PowerShell.
Databases: Oracle, SQL Server, IBM DB2, MS Access, Teradata, Snowflake, MySQL, Postgres, Hadoop, ANSI SQL, AS400, PL/SQL, T-SQL.
Modeling & Reporting Tools: Erwin Data Modeler, Tableau, Power BI, SSRS
Other Tools: SQL*Loader, SQL*Plus, Query Analyzer, PuTTY, MS Office, MS Word, S3, Control-M, AutoSys, Salesforce, Lambda.
Teradata Tools & Utilities: BTEQ, MultiLoad, FastLoad, FastExport, TPump, Teradata Manager, SQL Assistant
WORK EXPERIENCE:
Confidential, Seattle, WA
ETL Informatica Developer/Data Engineer
Responsibilities:
- Analyzed the business requirements, framed the business logic for the ETL process, and maintained the ETL process using Informatica PowerCenter 10.4.0.
- Worked on Agile Methodology, participated in daily/weekly team meetings, worked with business and technical teams to understand data and profile the data.
- Understanding of transformations and techniques, especially for replicating on-premises sources through Attunity.
- Expert in designing parallel jobs using various stages such as Join, Merge, Lookup, Remove Duplicates, Filter, Dataset, Lookup File Set, Complex Flat File, Modify, Aggregator, and XML; used Update Strategy, Aggregator, Expression, and Joiner transformations to load data into the data warehouse using Informatica BDM 10.2.2.
- Experienced in Snowflake advanced concepts like setting up resource monitors and performance tuning.
- Involved in creating UNIX shell scripts for Informatica workflow execution.
- Extracted data from various disparate on-premises sources including, but not limited to, SQL Server (MSSQL), Teradata, DB2, and flat files, and loaded it into the destination or used it directly for profiling.
- Expertise includes taking any incoming data set and applying various data quality logic to it as per business needs.
- Analyzed data using the Snowflake query window and designed big data quality rules.
- Extensively worked on Informatica transformations such as Source Qualifier, Joiner, Filter, Router, Expression, Lookup, Aggregator, Sorter, Normalizer, Update Strategy, Sequence Generator and Stored Procedure transformations.
- Migrated data from legacy systems (SQL Server, AS400) to Snowflake and SQL Server.
- Extensively used Informatica Cloud (IICS) transformations such as Address Validator, Exception, and Parser; solid experience in debugging and troubleshooting sessions using the Debugger and Workflow Monitor.
- Used SQL scripts and AWS resources (Lambda, Step Function, SNS, S3) to automate data migration.
- Expertise in deploying Snowflake features such as data sharing, events, and lakehouse patterns.
- Worked with multiple divisions throughout the organization to conform with best practices and standards.
- Created connections including Relational connection, Native connections, and Application connections.
- Experience in GDW Teradata BTEQ scripts and enhancements based on new requirements
- Involved in performance tuning of sessions, mappings, ETL procedures, and processes for better performance, and supported integration testing.
- Developed ETL pipelines into and out of the data warehouse using a combination of Python and Snowflake's SnowSQL, writing SQL queries against Snowflake (see the Snowflake sketch after this list).
- Involved in data analysis and handled ad-hoc requests by interacting with business analysts and clients, and resolved issues as part of production support.
- Performed debugging by checking the errors in the mapping using Debugger utility of the Designer tool and made appropriate changes to generate the required results
- Liaised with cross-functional teams across the enterprise, including the OLTP development team and various data warehouse teams (onsite and offshore team members).
- Implemented an error-handling strategy in ETL mappings and routed problematic records to exception tables.
- Engaged in creating UNIX shell scripts to invoke workflows, and used PL/SQL to create dynamic pmcmd commands and parameter files for the workflows (see the pmcmd sketch after this list).
- Responsible for writing Autosys JILs and scheduling the Informatica workflows on the Autosys server
- Prepared Test Plan and Test strategy from the Business Requirements and Functional Specification for the integrations.
- Developed test cases for deployment verification, ETL data validation, and application testing.
- Worked as ETL tester responsible for requirements/ETL analysis, ETL testing, and designing the flow and logic for the data warehouse project.
- Followed waterfall and AGILE development methodology and adhered to strict quality standards in requirement gathering.
- Functioned as the Onsite / Offshore coordinator for a team.
- Experienced in writing complex SQL queries for extracting data from multiple tables.
- Created custom views to improve the performance of the PL/SQL procedures.
- Testing has been done based on Change Requests and Defect Requests.
- Preparation of System Test Results after Test case execution.
- Experience in Linux/Unix technologies; strong understanding of Spark and Hadoop internals, e.g., DataFrames, DAGs, data partitioning and distribution, NameNode limitations, and tuning.
- Expertise in Unix commands and in scripting (Korn Shell & Python).
- Worked on Unix-based file systems; good at log monitoring, analysis, and providing remediation steps.
- Worked with Informatica support to fix Informatica Linux server issues.
- Worked on moving S3 folders and buckets to the cloud using Python in AWS Lambda (see the boto3 sketch below); hands-on Python development.
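A minimal sketch of the Python + Snowflake load step described above, assuming the snowflake-connector-python package; the account, stage, table, and credential names are placeholders rather than the actual project objects.

```python
"""Minimal sketch of a Python + Snowflake (SnowSQL-style) load step."""
import snowflake.connector

def load_to_snowflake() -> None:
    conn = snowflake.connector.connect(
        account="xy12345.us-west-2",   # placeholder account locator
        user="ETL_SVC",
        password="***",
        warehouse="ETL_WH",
        database="EDW",
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        # Staged files are assumed to be uploaded already; copy them into staging.
        cur.execute(
            "COPY INTO STG_ORDERS FROM @ETL_STAGE/orders/ "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
        # Simple transform-and-load into a reporting table.
        cur.execute("""
            INSERT INTO RPT_ORDERS (ORDER_ID, ORDER_DT, AMOUNT)
            SELECT ORDER_ID, TO_DATE(ORDER_DT), AMOUNT
            FROM STG_ORDERS
            WHERE AMOUNT IS NOT NULL
        """)
    finally:
        conn.close()

if __name__ == "__main__":
    load_to_snowflake()
```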
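The shell-driven workflow invocation mentioned above can be sketched as a small Python wrapper around pmcmd. The integration service, domain, folder, and workflow names below are hypothetical placeholders.

```python
"""Minimal sketch: starting an Informatica workflow with pmcmd (placeholder names)."""
import subprocess

def start_workflow(param_file: str) -> None:
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "INT_SVC_DEV",      # integration service (placeholder)
        "-d", "Domain_Dev",        # domain (placeholder)
        "-u", "etl_user", "-p", "etl_pwd",
        "-f", "EDW_LOADS",         # repository folder (placeholder)
        "-paramfile", param_file,  # parameter file generated upstream (e.g. by PL/SQL)
        "-wait",                   # block until the workflow completes
        "wf_load_orders",          # workflow name (placeholder)
    ]
    rc = subprocess.run(cmd).returncode
    if rc != 0:
        raise RuntimeError(f"pmcmd returned {rc} for wf_load_orders")
```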
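For the Lambda-based S3 moves, a hedged boto3 sketch is shown below. Bucket and prefix names are hypothetical; since S3 has no rename operation, a "move" is implemented as copy-then-delete.

```python
"""Minimal sketch of a Lambda handler that moves an S3 prefix (copy + delete)."""
import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    src_bucket = "legacy-extracts"           # placeholder source bucket
    dst_bucket = "edw-landing"               # placeholder target bucket
    prefix = event.get("prefix", "daily/")   # folder (prefix) to relocate

    paginator = s3.get_paginator("list_objects_v2")
    moved = 0
    for page in paginator.paginate(Bucket=src_bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            key = obj["Key"]
            s3.copy_object(Bucket=dst_bucket, Key=key,
                           CopySource={"Bucket": src_bucket, "Key": key})
            s3.delete_object(Bucket=src_bucket, Key=key)
            moved += 1
    return {"moved": moved}
```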
Environment: Informatica BDM 10.2.1 and 10.2.2, Big Data Quality (IDQ), ICS, ETL, Attunity, Shell, SQL Server, DB2, Oracle, Salesforce, AS400, AWS S3, Teradata, Snowflake, Aurora, Hadoop 2.9, Informatica Administrator Console, Informatica Analyst, PostgreSQL, Hive, Linux, Python 3.6, Informatica Cloud.
Confidential, Long Island
ETL Developer
Responsibilities:
- Involved in gathering requirements from business customers and developed requirements documents and ETL specifications.
- Deployed Talend jobs to the server and scheduled them through the Talend Job Conductor tool; worked independently to develop ETL Talend jobs.
- Developed complex ETL Talend jobs using components such as tMap, tFilterRow, tJavaRow, etc.
- Created complex stored procedures, triggers, functions, indexes, tables, views, and other T-SQL code and SQL joins for applications, following SQL code standards (see the T-SQL sketch after this list).
- Extensively involved in performance tuning of the Informatica ETL mappings by using the caches and overriding the SQL queries and by using Parameter files.
- Using Talend, extracted information from various disparate data sources including, but not limited to, Oracle, Netezza, MySQL, MongoDB, Hadoop, MS SQL Server, and flat files, and loaded it into the destination.
- Experience with big data platforms (i.e., Hadoop) and big data tools such as Apache Spark.
- Proficiency in SQL, big data technologies and working with large data sets.
- Ability to translate business attributes (GUI Labels, Reporting Attributes) into Data Model elements
- Used reverse engineering techniques to find the source tables and fields for modifications.
- Developed Confidential proprietary analytical tools and reports with Excel, Power Pivot, and PowerPoint.
- Provide support and maintenance on ETL processes and documentation.
- Responsible for all phases of solution delivery from initial requirements analysis, solution estimation, coordination of development, testing and delivery to production.
- Hands-on experience in design, development, and CI/CD releases.
- Experienced in developing transformations with different file formats like Positional, JSON, XML, CSV, Flat file
- Experienced in working with Database components in Talend like Oracle, MySQL etc.
- Performed index analysis for tables and identified more efficient solutions using clustered and non-clustered indexes for a significant performance boost with the Index Tuning Wizard.
- Filtered bad data using Derived Column, Lookup, Fuzzy Lookup, and Conditional Split.
- Configured and maintained Report Manager and Report Server for SSRS; deployed and scheduled reports in Report Manager.
- Created reports using charts, gauges, tables, and matrices; created parameterized reports, dashboard reports, linked reports, and sub-reports by year, quarter, month, and week.
- Created drill-down and drill-through reports by region.
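The T-SQL stored-procedure work above can be illustrated with a short Python + pyodbc sketch. This is a hedged example: the connection string, table, and procedure names are placeholders, not the actual project objects.

```python
"""Minimal sketch: creating and calling a simple T-SQL stored procedure via pyodbc."""
import pyodbc

DROP_IF_EXISTS = """
IF OBJECT_ID('dbo.usp_GetOrdersByCustomer', 'P') IS NOT NULL
    DROP PROCEDURE dbo.usp_GetOrdersByCustomer
"""

CREATE_PROC = """
CREATE PROCEDURE dbo.usp_GetOrdersByCustomer
    @CustomerId INT
AS
BEGIN
    SET NOCOUNT ON;
    SELECT o.OrderId, o.OrderDate, o.Amount
    FROM dbo.Orders AS o
    WHERE o.CustomerId = @CustomerId;
END
"""

def main() -> None:
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=sqlhost;DATABASE=Sales;"
        "UID=etl_user;PWD=etl_pwd"   # placeholder connection details
    )
    cur = conn.cursor()
    cur.execute(DROP_IF_EXISTS)       # CREATE PROCEDURE must be the only statement in its batch
    cur.execute(CREATE_PROC)
    conn.commit()
    # Call the procedure with a parameter marker.
    for row in cur.execute("EXEC dbo.usp_GetOrdersByCustomer @CustomerId = ?", 42):
        print(row.OrderId, row.OrderDate, row.Amount)
    conn.close()

if __name__ == "__main__":
    main()
```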
Environment: ETL, SSRS, SSAS, Informatica PowerCenter 10.1/9.6.1/9.5.1, Oracle E-Business Suite (EBS), Talend 6.2, Netezza 3.0, Hadoop Workbench, PuTTY 0.64, Teradata 14.0, Kafka, SQL Server 2014, UNIX, Toad, PL/SQL, DB2, Tableau 10.2.
Confidential, Milwaukee, WI
ETLDeveloper/Teradata Developer
Responsibilities:
- Worked in an Agile development methodology environment and interacted with users and business analysts to collect and understand the business requirements.
- Worked on building the ETL architecture and Source to Target mapping to load data into Data warehouse.
- Involved in the installation and configuration of Informatica PowerCenter 10.1 and evaluated partitioning concepts in PowerCenter 10.1.
- Analyzed the method of transforming existing data into a format for the new environment and loading this data into other database structures.
- Reviewed existing migration tools and provided recommendations for improving the performance of the migration process.
- Provided the necessary change and support documentation.
- Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
- Created stored procedures, views, user defined functions and common table expressions in SQL and Hadoop.
- Fluent in T-SQL programming language: creating and managing database objects, stored procedures, triggers, views and user defined functions.
- Expert in T-SQL query development and analysis: stored procedures, triggers, views, etc. Expert in T-SQL querying for creating data extracts.
- Tested to verify that all data was synchronized after troubleshooting, and used SQL to verify/validate test cases.
- Reviewed Informatica mappings and test cases before delivering to Client.
- Worked as ETL tester responsible for requirements/ETL analysis, ETL testing, and designing the flow and logic for the data warehouse project.
- Wrote several UNIX scripts for invoking data reconciliation.
- Experienced in writing complex SQL queries for extracting data from multiple tables.
- Testing has been done based on Change Requests and Defect Requests.
- Preparation of System Test Results after Test case execution.
- Implemented very large-scale data intelligence solutions around Snowflake Data Warehouse.
- Solid experience and understanding of architecting, designing, and operationalizing large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse.
- Reviewed ETL performance and conducted performance tuning as required on mappings, workflows, or SQL.
- Created UNIX scripts for parsing and modifying data; experienced in using the AutoSys job scheduler for automation of UNIX shell scripts and batch scheduling.
- Built SSIS packages involving ETL process, extracting data from various flat files, Excel files, legacy systems and loading into SQL server.
- Manage cross-program data assurance for physical data items in source and target systems.
- Strong SQL knowledge and working experience in Teradata Stored Procedures/BTEQ scripts.
- Experience in creating fact tables; knowledge of reporting tools, especially Tableau.
- Extracted data from different sources like MVS data sets, Flat files (“pipe” delimited or fixed length), excel spreadsheets and Databases.
- Used Teradata utilities FastLoad, MultiLoad, and TPump to load data.
- Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, DML, and DDL (see the BTEQ sketch after this list).
- Managed all development and support efforts for the Data Integration/Data Warehouse team.
- Used Informatica PowerCenter 9.6.1 to extract, transform, and load data into the Netezza data warehouse from various sources such as Oracle and flat files.
- Developed and deployed ETL job workflow with reliable error/exception handling and rollback within the MuleSoft framework.
- Used BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements against the Teradata RDBMS.
- Provided on-call support during releases of the product from lower-level to higher-level production environments.
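A hedged sketch of the BTEQ scripting described above, driven from Python: the TDPID, credentials, and table names are placeholders, and the embedded BTEQ text only illustrates the style of scripts involved.

```python
"""Minimal sketch: running a Teradata BTEQ script from Python (placeholder names)."""
import subprocess

BTEQ_SCRIPT = """\
.LOGON tdprod/etl_user,etl_pwd;

DELETE FROM edw_stg.daily_sales;

INSERT INTO edw_stg.daily_sales (store_id, sales_dt, amount)
SELECT store_id, sales_dt, SUM(amount)
FROM   edw_src.pos_transactions
GROUP  BY store_id, sales_dt;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
"""

def run_bteq() -> None:
    # BTEQ reads the script from stdin; a non-zero exit code signals failure.
    result = subprocess.run(["bteq"], input=BTEQ_SCRIPT, text=True,
                            capture_output=True)
    print(result.stdout)
    if result.returncode != 0:
        raise RuntimeError(f"BTEQ exited with {result.returncode}")

if __name__ == "__main__":
    run_bteq()
```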
Environment: Informatica Developer, C++, Oracle 12c, AWS, Teradata 14.0, SQL Server 2014, AutoSys Scheduler, UNIX, Toad, PL/SQL, SSIS, SSRS, T-SQL, PowerConnect, DB2, Tableau 10.1, PowerShell.
Confidential, Lakeland, FL
ETLDeveloper/ SQL developer
Responsibilities:
- Interacted with Data Modelers and Business Analysts to understand the requirements and the impact of the ETL on the business.
- Designed ETL specification documents for all the projects.
- Created Tables, Keys (Unique and Primary) and Indexes in the SQL server.
- Extracted data from Flat files, DB2, SQL and Oracle to build an Operation Data Source. Applied business logic to load the data into Global Data Warehouse.
- Extensively worked on fact and Slowly Changing Dimension (SCD) tables (see the SCD Type 2 sketch after this list).
- Maintained source and target mappings, transformation logic and processes to reflect the changing business environment over time.
- Used various transformations like Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter and Union to develop robust mappings in the Informatica Designer.
- Extensively used the Add Currently Processed Flat File Name port to load the flat file name and to load contract number coming from flat file name into Target.
- Worked on complex Source Qualifier queries, Pre, and Post SQL queries in the Target.
- Worked on different tasks in Workflow Manager like Sessions, Events raise, Event wait, Decision, E-mail, Command, Worklets, Assignment, Timer and Scheduling of the workflow.
- Extensively used workflow variables, mapping parameters and mapping variables.
- Created sessions, batches for incremental load into staging tables and scheduled them to run daily.
- Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
- Implemented performance tuning logic on Targets, Sources, Mappings and Sessions to provide maximum efficiency and performance.
- Performed QA (testing) and supported the QA team and UAT (user acceptance testing).
- Created detailed Unit Test Document with all possible Test cases/Scripts.
- Conducted code reviews developed by my teammates before moving the code into QA.
- Worked on DataStage clustering architecture and parallel jobs; knowledge of combining, sorting, aggregating, and transforming data.
- Experienced in creating reusable objects and job templates
- Scheduled, compiled, and ran DataStage jobs efficiently.
- Years of solid hands-on experience in using the XML stage.
- Provided support to develop the entire warehouse architecture and plan the ETL process; prepared documentation along with various test cases to ensure a smooth project handover and to maintain the SDLC.
- Identified the bottlenecks and improved overall performance of the sessions. Experience in Scheduling Informatica sessions for automation of loads in Autosys.
- Provided production support by monitoring the processes running daily.
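The SCD work above follows the usual Type 2 pattern (expire the changed current rows, then insert new current versions). Below is a minimal sketch of that SQL pattern; the dimension, staging table, and column names are placeholders, and `conn` is assumed to be any DB-API connection.

```python
"""Minimal sketch of a Type 2 Slowly Changing Dimension load (placeholder names)."""

EXPIRE_CHANGED_ROWS = """
UPDATE dim_customer d
SET    d.current_flag = 'N',
       d.effective_end_dt = CURRENT_DATE
WHERE  d.current_flag = 'Y'
  AND  EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = d.customer_id
                 AND  (s.customer_name <> d.customer_name
                   OR  s.address <> d.address))
"""

INSERT_NEW_VERSIONS = """
INSERT INTO dim_customer
       (customer_id, customer_name, address,
        effective_start_dt, effective_end_dt, current_flag)
SELECT  s.customer_id, s.customer_name, s.address,
        CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM    stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_id = s.customer_id AND d.current_flag = 'Y'
WHERE   d.customer_id IS NULL   -- brand-new keys, plus keys just expired above
"""

def load_scd2(conn) -> None:
    cur = conn.cursor()
    cur.execute(EXPIRE_CHANGED_ROWS)   # close out rows whose attributes changed
    cur.execute(INSERT_NEW_VERSIONS)   # add current versions for new/changed keys
    conn.commit()
```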
Environment: Informatica Power Center 10.x/9.x, IDQ, Erwin, MDM, AWS, Oracle Data Integration (ODI), Oracle 11g/10g, PL/SQL, SQL Loader, MS SQL Server 2012/2008, Autosys, SQL Server 2014, SSIS, T-SQL
Confidential, Herndon, VA
ETL Informatica Developer
Responsibilities:
- Developed ETL programs using Informatica to implement the business requirements.
- Communicated with business customers to discuss the issues and requirements.
- Created shell scripts to fine tune the ETL flow of the Informatica workflows.
- Used Informatica file watch events to poll the FTP sites for the external mainframe files.
- Provided production support to resolve ongoing issues and troubleshoot problems.
- Performance tuning was done at the functional level and map level. Used relational SQL wherever possible to minimize the data transfer over the network.
- Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
- Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.
- Effectively worked in Informatica version-based environment and used deployment groups to migrate the objects.
- Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating transformations.
- Effectively worked on Onsite and Offshore work model.
- Used pre- and post-session assignment variables to pass variable values from one session to another.
- Designed workflows with many sessions using Decision, Assignment, Event Wait, and Event Raise tasks, and used the Informatica scheduler to schedule jobs.
- Performed unit testing at various levels of the ETL and actively involved in team code reviews.
- Identified problems in existing production data and developed one-time scripts to correct them.
- Fixed invalid mappings and troubleshot technical problems with the database.
- Upgraded Oracle 9i to 10g in different environments for the latest features and tested the databases.
- Developed and modified triggers, packages, functions, and stored procedures for data conversions, and wrote PL/SQL procedures to create database objects dynamically based on user inputs (see the PL/SQL sketch after this list).
- Wrote Oracle SQL, PL/SQL, and SQL*Plus programs to retrieve data using cursors and exception handling.
- Worked on XML along with PL/SQL to develop and modify web forms.
- Created indexes on tables to improve performance by eliminating full table scans, and created views to hide the underlying tables and reduce the complexity of large queries.
- Fine-tuned procedures and SQL queries for maximum efficiency in various databases using Oracle hints for rule-based optimization.
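A hedged sketch of the PL/SQL cursor and exception-handling work described above, run from Python with the python-oracledb driver; the connection details, table, and column names are placeholders.

```python
"""Minimal sketch: running an anonymous PL/SQL block (cursor + exception handling)."""
import oracledb

PLSQL_BLOCK = """
DECLARE
    CURSOR c_stale IS
        SELECT account_id FROM stg_accounts WHERE status = 'CLOSED';
BEGIN
    FOR rec IN c_stale LOOP
        UPDATE dim_account
        SET    active_flag = 'N'
        WHERE  account_id = rec.account_id;
    END LOOP;
    COMMIT;
EXCEPTION
    WHEN OTHERS THEN
        ROLLBACK;
        RAISE;
END;
"""

def main() -> None:
    conn = oracledb.connect(user="etl_user", password="etl_pwd",
                            dsn="dbhost:1521/ORCLPDB1")  # placeholder connection
    with conn.cursor() as cur:
        cur.execute(PLSQL_BLOCK)
    conn.close()

if __name__ == "__main__":
    main()
```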
Environment: Informatica Power Center 9.6.1, Informatica Data Quality (IDQ 8.6.1), DataStage, Oracle 11g, PL/SQL, Flat files, MySQL, WinSCP, Notepad++, Toad, Quest Central, UNIX scripting, Windows NT.
Confidential
Informatica Administrator
Responsibilities:
- Installed and configured Informatica Power Center 9.1.
- Configured Load balancing and Grid in Informatica.
- Creation and maintenance of Informatica users and privileges.
- Created Native Groups, Native Users, roles, and privileges and assigned them to each user group.
- Developed a standard ETL framework to enable the reusability of similar logic across the board. Involved in System Documentation of Dataflow and methodology.
- Installation and configuration of B2B DX, MFT and troubleshoot the issues.
- Developed mappings to extract data from SQL Server, Oracle, Flat files, and load into DataMart using the Power Center.
- Wrote pre-session and post-session scripts in mappings.
- Created deployment groups and assigned the permissions to the deployment group.
- Created Informatica folders and assigned permissions to the folders.
- Created OS profiles for the users that run the applications.
- Migration of Informatica Mappings/Sessions/Workflows from Dev to Test and Test to Stage and Stage to Prod environments.
- Coordinated with UNIX and Database Administrators for creating OS profiles and file systems.
- Handled outages during any UNIX and database maintenance.
- Collaboratively worked with Application Support Team, Network, Database Teams, and UNIX teams.
- Performed bounce activities (restarting services) during network changes or other maintenance windows and communicated with the business.
Environment: Informatica PowerCenter 9.1 HF4, Oracle 11.2.0.3.0, Linux OS.