
ETL Informatica Developer / Data Analyst Resume


Minneapolis, MN

SUMMARY

  • Data Analyst / ETL Informatica expert with 8+ years of IT experience in the analysis, design, development, testing, deployment, and production support of enterprise data warehouse applications using Informatica across the Healthcare, Manufacturing, and Retail domains, covering all phases of the SDLC.
  • 8+ years of experience in developing ETL interfaces using Informatica 10.0, 9.x, 8.x and IDQ 8.6 for enterprise data warehouses and standalone data marts.
  • Expertise in developing end-to-end integration solutions using DataStage and multiple databases, including Oracle, DB2, Teradata, Netezza, MySQL, PostgreSQL, and SQL Server.
  • Worked with Teradata utilities such as FastLoad, MultiLoad, TPump, and Teradata Parallel Transporter; highly experienced in Teradata SQL programming.
  • Experience with Teradata Parallel Transporter (TPT); used full pushdown optimization (PDO) on Teradata and worked with different Teradata load operators.
  • Superior SQL skills with the ability to write and interpret complex SQL statements; skilled in SQL optimization, ETL debugging, and performance tuning.
  • Experience in developing online transaction processing (OLTP), operational data store (ODS), and decision support system (DSS) databases (e.g., data warehouses).
  • Experienced in writing SQL and PL/SQL programs: stored procedures, packages, functions, triggers, views, and materialized views.
  • Experience in Inmon and Kimball data warehouse design and implementation methodologies.
  • Strong familiarity with master data and metadata management and their associated processes.
  • Hands-on knowledge of enterprise repository tools, data mapping tools, data profiling tools, and data and information system life cycle methodologies.
  • 1+ year of experience with AWS (Amazon Web Services), S3 buckets, and Redshift (the AWS data warehouse service).
  • 1+ year of experience working with Hive, Impala, HDFS, Spark SQL, etc.
  • Architecture experience implementing proper data structures for analytical reporting from an enterprise data warehouse.
  • Implemented Change Data Capture (CDC) with Informatica PowerExchange.
  • Used Informatica PowerExchange to access VSAM files; also worked on flat files, nested JSON, and XML files.
  • Well versed with data quality features such as Analyst and IDQ, and transformations such as Key Generator, Standardizer, Case Converter, Match, and Consolidation.
  • Applied the Address transformation for address validation and standardization.
  • Strong in implementing data profiling and documenting data quality metrics such as accuracy, completeness, duplication, validity, and consistency (see the profiling sketch after this list).
  • Good skills in understanding and developing business rules for the standardization, cleansing, and validation of data in various formats.
  • Very strong knowledge of Informatica Data Quality transformations such as Address Validator, Parser, Labeler, Match, Exception, Association, Standardizer, and other significant transformations.
  • Extensively worked on Informatica PowerCenter transformations as well, including Expression, Joiner, Sorter, Filter, Router, Normalizer, Rank, Lookup, Stored Procedure, Update Strategy, Source Qualifier, Union, CDC, and others as required.
  • Proficient in data warehouse concepts such as dimension tables, fact tables, slowly changing dimensions, and data marts.
  • Experience in Extraction, Transformation, and Loading (ETL) of data from various data sources into data marts and data warehouses using the Informatica PowerCenter components (Repository Manager, Designer, Workflow Manager, Workflow Monitor, and the Informatica Administration Console).
  • Strong experience in developing sessions/tasks, worklets, and workflows using the Workflow Manager tools: Task Developer and Workflow & Worklet Designer.
  • Experience in performance tuning of Informatica mappings and sessions to improve performance on large-volume projects.
  • Experience in debugging mappings: identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Experience in writing UNIX shell scripts and SQL scripts for development, automation of ETL processes, error handling, and auditing.
  • Well versed in Waterfall and Agile methodologies within the Software Development Life Cycle (SDLC).
  • Strong knowledge of the Big Data / Hadoop ecosystem and data modeling in a Hadoop environment.
  • Good problem-solving skills; a team player with excellent analytical, communication, and multi-tasking skills.
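
As a concrete illustration of the data quality metrics above, the following is a minimal profiling query of the kind used to document completeness and duplication; the table and column names are hypothetical, not from a specific engagement.

    -- Completeness and duplication metrics for a hypothetical CUSTOMER table.
    SELECT COUNT(*)                                   AS total_rows,
           COUNT(email)                               AS email_populated,
           ROUND(100.0 * COUNT(email) / COUNT(*), 2)  AS email_completeness_pct,
           COUNT(*) - COUNT(DISTINCT customer_id)     AS duplicate_customer_ids
    FROM   customer;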

TECHNICAL SKILLS

ETL: Informatica PowerCenter 10.0, 9.5.1, 9.0, 8.1.1; PowerExchange 10.0.1, 9.1, 8.6

Data Profiling Tools: Informatica IDQ 10.0, 9.5.1, 8.6.1

ETL Scheduling Tools: Control-M, ESP

RDBMS: DB2, Oracle 11g/12c, Teradata 13/15, SQL Server 2008/2012, MySQL, PostgreSQL 9.2

UNIX: UNIX, Shell scripting

Reporting Tools: Tableau 9, Cognos 9/10

Defect Tracking Tools: Quality Center, Bugzilla

Operating Systems: Windows XP/2000/9x/NT, UNIX

Source Management: Bitbucket, Visual SourceSafe

Programming Languages: C, C++, PL/SQL

Other Tools: Notepad++, Toad, SQL Navigator, Teradata SQL Assistant, Teradata Viewpoint, JIRA, Rally

PROFESSIONAL EXPERIENCE

Confidential, Minneapolis, MN

ETL Informatica Developer / Data Analyst

Responsibilities:

  • Worked extensively with business users to gather, verify, and validate various business requirements.
  • Identified the various source systems, connectivity, and tables needed to ensure data availability before starting the ETL process.
  • Worked as a data modeler: created the data model for the warehouse and was involved in the ODS and data mart data models.
  • Worked as a data analyst to analyze the source systems' data.
  • Created design documents for source-to-target mappings. Developed mappings to send files daily to AWS.
  • Used UNIX scripting to apply rules to the raw data within AWS.
  • Created complex mappings using Unconnected and Connected Lookup, Aggregator, and Router transformations to populate target tables efficiently.
  • Created stored procedures to use Oracle-generated sequence numbers in mappings instead of the Informatica Sequence Generator (a PL/SQL sketch follows this list).
  • Created complex mappings and implemented Slowly Changing Dimensions of Type 1, Type 2, and Type 3 for data loads (an SCD Type 2 sketch also follows this list).
  • Created complex mappings to implement data cleansing on the source data.
  • Used mapping variables, mapping parameters, and session parameters to increase the reusability of mappings.
  • Created source-to-target mappings, edit rules and validation, transformations, and business rules. Analyzed client requirements and designed the ETL Informatica mappings.
  • Scheduled and ran workflows in Workflow Manager and monitored sessions using Informatica Workflow Monitor.
  • Tuned the performance of Informatica sessions for large data files by increasing block size, data cache size, and sequence buffer length, and by adjusting the target-based commit interval.
  • Validated and tested the mappings using the Informatica Debugger, session logs, and workflow logs.
  • Created detailed unit test plans and performed error checking and testing of the ETL procedures using SQL queries, filtering the missing rows out into flat files at the mapping level.
  • Used the UltraEdit tool and UNIX commands to create, access, and maintain the session parameter files, data files, and scripts on the server.
  • Used the Cucumber automated test tool to automate the unit tests for the Informatica ETL.
  • Followed Acceptance Test Driven Development (ATDD) and Test Driven Development (TDD) practices and automated the unit tests for the Informatica ETL.
  • Scheduled the ETL jobs using the ESP scheduler.
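
A minimal PL/SQL sketch of the sequence wrapper mentioned above, assuming a dedicated warehouse sequence; the sequence and function names are hypothetical.

    -- Hypothetical wrapper that lets a mapping fetch Oracle-generated keys
    -- through a stored function rather than an Informatica Sequence Generator.
    CREATE SEQUENCE dw_surrogate_seq START WITH 1 INCREMENT BY 1 CACHE 1000;

    CREATE OR REPLACE FUNCTION next_surrogate_key RETURN NUMBER IS
        v_key NUMBER;
    BEGIN
        SELECT dw_surrogate_seq.NEXTVAL INTO v_key FROM dual;
        RETURN v_key;
    END next_surrogate_key;
    /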
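
And a set-based sketch of the SCD Type 2 pattern referenced above: expire the current version of changed rows, then insert new versions with an open-ended effective range. The tables (customer_stg, customer_dim) and the tracked column are illustrative, and next_surrogate_key is the hypothetical function sketched above.

    -- Step 1: close out the current version of any customer whose address changed.
    UPDATE customer_dim d
    SET    d.current_flag  = 'N',
           d.effective_end = TRUNC(SYSDATE) - 1
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1 FROM customer_stg s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.address <> d.address);

    -- Step 2: insert a fresh current version for new and changed customers
    -- (changed customers no longer have a 'Y' row after step 1).
    INSERT INTO customer_dim
           (customer_key, customer_id, address,
            effective_start, effective_end, current_flag)
    SELECT next_surrogate_key, s.customer_id, s.address,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM   customer_stg s
    WHERE  NOT EXISTS (SELECT 1 FROM customer_dim d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.current_flag = 'Y');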

Environment: Informatica PowerCenter 10, PowerExchange 10, Oracle 11g, Cognos 10.x, AWS, Redshift, DB2, flat files, SQL, PuTTY, UltraEdit-32, shell scripting, Toad, Quest Central, UNIX scripting, Windows NT

Confidential, Richfield, MN

ETL Informatica developer / Data Analyst

Responsibilities:

  • Worked extensively with the data modeler, data analysts, and business users to gather, verify, and validate various business requirements.
  • Identified the various source systems, connectivity, and tables needed to ensure data availability before starting the ETL process.
  • Created design documents for source-to-target mappings.
  • Created workflows, tasks, database connections, and FTP connections using Workflow Manager.
  • Extensively developed various mappings using different transformations such as Source Qualifier, Expression, Lookup (connected and unconnected), Update Strategy, Aggregator, Filter, Router, and Joiner. Used Workflow Manager for creating, validating, testing, and running the workflows and sessions and scheduling them to run at specified times.
  • Created pre-session and post-session shell commands to perform various operations, such as sending an email notifying the business about any new dealer branches.
  • Provided architecture and domain knowledge to report developers for the creation of dashboards and reports.
  • Wrote BTEQ, MultiLoad, and TPump scripts to load the data into Teradata tables (see the BTEQ sketch after this list).
  • Fine-tuned SQL overrides for performance enhancements and tuned Informatica mappings and sessions for optimum performance.
  • Extensively worked with lookup caches (shared, persistent, static, and dynamic) to improve the performance of Lookup transformations.
  • Created source definitions and flat-file target definitions using the Informatica Designer.
  • Used UNIX commands (vi editor) to perform the DB2 load operations.
  • Created detailed unit test plans and performed error checking and testing of the ETL procedures using SQL queries, filtering the missing rows out into flat files at the mapping level.
  • Scheduled the ETL jobs using the ESP scheduler.
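
A minimal BTEQ-style sketch of the kind of load script described above, assuming an upsert from a staging table; the TDPID, credentials placeholder, and table names are all hypothetical.

    .LOGON tdprod/etl_user,password;

    /* Upsert staged dealer branches into the target table (names illustrative). */
    MERGE INTO edw.dealer_branch AS tgt
    USING stg.dealer_branch AS src
      ON  tgt.branch_id = src.branch_id
    WHEN MATCHED THEN
      UPDATE SET branch_name = src.branch_name
    WHEN NOT MATCHED THEN
      INSERT (branch_id, branch_name)
      VALUES (src.branch_id, src.branch_name);

    /* Abort with a non-zero return code if the merge failed. */
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;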

Environment: Informatica PowerCenter 8.6.1/8.1.1, PowerExchange 9.1, IDQ 9, Oracle 11g, Teradata 15, Toad, Cognos 9, Tableau 9.2, UNIX, Autosys, SQL*Loader, PowerCenter Mapping Architect for Visio, Sybase PowerDesigner, IBM DB2, flat files, SQL, PuTTY, UltraEdit-32, shell programming, Quest Central

Confidential, Bloomington, IL

ETL Informatica developer

Responsibilities:

  • Involved in requirements definition and analysis in support of data warehousing efforts.
  • Worked on ETL design and development: created the Informatica source-to-target mappings, sessions, and workflows that implement the business logic.
  • Used various transformations such as Joiner, Aggregator, Expression, Lookup, Filter, Union, Update Strategy, Stored Procedure, and Router to implement complex logic while coding a mapping.
  • Extracted data from flat files, SQL Server, and Oracle and loaded it into Teradata.
  • Involved in data profiling and data analysis on heterogeneous database sources such as Oracle and flat files.
  • Extensively worked on Informatica Data Quality (IDQ 8.6.1) for data analysis, data cleansing, data validation, data profiling, and matching/removing duplicate data.
  • Designed and developed Informatica DQ jobs and mapplets using transformations such as Address Validator, matching, consolidation, and rules for data loads and data cleansing.
  • Prepared the technical specifications for the development of the extraction, transformation, and loading of data into the various stage tables.
  • Used the Integration Service in PowerCenter 8.6.1 to start multiple instances of the same workflow.
  • Implemented different tasks in workflows, including Session, Command, E-mail, and Event-Wait.
  • Created data breakpoints and error breakpoints for debugging the mappings using the Debugger Wizard.
  • Scheduled and ran workflows in Workflow Manager and monitored sessions using Informatica Workflow Monitor.
  • Validated and tested the mappings using the Informatica Debugger, session logs, and workflow logs.
  • Worked on migrating existing PL/SQL packages, stored procedures, triggers, and functions to Informatica PowerCenter.
  • Transformed bulk amounts of data from various sources to the Teradata database using BTEQ, MultiLoad, and TPump scripts.
  • Transferred data using Teradata utilities and tools such as SQL Assistant, FastExport, and FastLoad.
  • Involved in performance tuning of SQL queries, sources, targets, and sessions by identifying and rectifying performance bottlenecks (a tuning sketch follows this list).
  • Worked on the Autosys scheduler to automate the workflows.
  • Tested all the mappings and sessions in the Development and UAT environments, then migrated them to the Production environment after everything ran successfully.
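
As an illustration of the SQL tuning mentioned above, one common rewrite is replacing a per-row correlated subquery with a set-based join; the tables below are hypothetical.

    -- Before: scalar subquery evaluated once per order row.
    SELECT o.order_id,
           (SELECT c.region FROM customer c WHERE c.cust_id = o.cust_id) AS region
    FROM   orders o;

    -- After: an equivalent join the optimizer can execute as a single hash join.
    SELECT o.order_id, c.region
    FROM   orders o
    LEFT JOIN customer c
           ON c.cust_id = o.cust_id;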

Environment: Informatica PowerCenter 8.6.1, PowerExchange 8.6, Informatica Data Quality (IDQ 8.6.1), SQL Server, Oracle 11g, PL/SQL, flat files, MySQL, Teradata 13, WinSCP, Notepad++, Toad, Quest Central, UNIX scripting, Windows NT.

Confidential, Columbus, IN

Informatica Developer

Responsibilities:

  • Assisted in preparing the design/specifications for data extraction, transformation, and loading.
  • Developed Informatica mappings enabling the extraction, transformation, and loading of the data into the target tables.
  • Created workflows, worklets, and tasks to schedule the loads at the required frequency using Workflow Manager.
  • Prepared reusable transformations to load data from the operational data source to the data warehouse.
  • Wrote complex SQL queries involving multiple tables with joins.
  • Worked with complex queries for data validation and reporting using SQL and PL/SQL (see the reconciliation sketch after this list).
  • Developed stored procedures using PL/SQL and driver scripts using UNIX shell scripts.
  • Scheduled and ran workflows in Workflow Manager and monitored sessions using Informatica Workflow Monitor.
  • Analyzed the dependencies between the jobs and scheduled them accordingly using the work scheduler.
  • Improved the performance of the mappings and sessions using various optimization techniques.
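
A minimal sketch of the kind of source-to-target reconciliation query used for data validation; the table and column names are hypothetical.

    -- Compare row counts and an amount total between staging and warehouse copies.
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt, SUM(order_amt) AS amt_total
    FROM   stg_orders
    UNION ALL
    SELECT 'TARGET', COUNT(*), SUM(order_amt)
    FROM   dw_orders;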

Environment: Informatica 8.1, PL/SQL, OBIEE, Erwin, Oracle 10g, SQL Server 2008, flat files, SQL, PuTTY, UltraEdit-32, shell programming, Toad, SQL Developer, UNIX scripting, Windows NT.
