
Sr. Etl Informatica Developer Resume


Phoenix, AZ

SUMMARY

  • 11+ years of IT experience with strong knowledge of Data Warehousing and ETL using Informatica Power Center 9.x/10.x and SQL, including 5+ years of experience in SAS reporting and 2 years of experience on IBM Mainframe.
  • Experience in using Informatica Power Exchange to read data in real time and in creating data maps for reading Mainframe source files into Informatica.
  • Experience in working with various data sources like Relational Databases (DB2, Oracle, Netezza, Yellowbrick and Teradata), Flat Files and COBOL Mainframe files.
  • Worked on generating reports as per business requirements using reporting tools like SAS.
  • Experience in writing PL/SQL stored procedures, Unix commands and scripts.
  • Experience in working with scheduling tools like Tivoli, OpCon Enterprise and Automic.
  • Experience in Software Development Life Cycle (SDLC) methodologies and well versed with Agile (using JIRA) and Waterfall models.
  • Excellent communication with customers and able to work with ease in an onsite-offshore model.
  • Strong expertise in production support and handling job failures within SLA.
  • Extensively worked on analyzing, designing, developing and tuning Mappings, Mapplets, Reusable transformations, Sessions/Tasks, Worklets and Workflows as per the requirements using Informatica.
  • Hands-on experience in Mainframe OS/390 and z/OS applications and worked in the areas of COBOL, JCL, VSAM, DB2, SYNCSORT, File-Aid, File Manager, Endevor, ISPF and IBM Utilities.
  • Have a good understanding of ETL/Informatica standards and best practices, Change Data Capture (CDC) and Slowly Changing Dimensions (SCD1, SCD2); see the SCD2 sketch after this list.
  • Experience in Data analysis, Transformation, Loading and Reporting in the Health Care and Retail sectors, including detailed knowledge of statistical analysis of health care data and of the production of reports and tables.
  • Experience in creating SAS datasets and in data manipulation for producing Excel, HTML and PDF formatted files using SAS/ODS and SAS/EXPORT.
  • Good knowledge and experience in using Database Utility Tools like TOAD, SQL Developer and other tools like Putty, Core FTP, SSH, WinSCP, TSRM and Jira.
  • Strong ability to write complex SQL queries; worked extensively on SQL query preparation and query analysis.
  • Good knowledge of the Health Care Insurance domain and ANSI X12 standard 834 data.
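
For illustration, a minimal SQL sketch of the SCD Type 2 pattern referenced above, assuming hypothetical customer_dim and customer_stg tables that track address and plan_code (inside Informatica this logic would typically live in a Lookup/Update Strategy mapping):

    -- Step 1: expire the current dimension row when a tracked attribute changed
    -- (table, column and sequence names are illustrative, not from a real project).
    UPDATE customer_dim d
       SET d.eff_end_dt   = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.plan_code <> d.plan_code));

    -- Step 2: insert a new current version for both new and changed customers.
    INSERT INTO customer_dim
           (customer_sk, customer_id, address, plan_code,
            eff_start_dt, eff_end_dt, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address, s.plan_code,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');

Step 1 closes out the current row for changed customers; step 2 then inserts a fresh current version for them and for brand-new customers, leaving unchanged customers untouched.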

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 10.2/10.4 and Power Exchange

Databases: DB2, Teradata, Oracle 11g/12c/19c, Netezza & Yellowbrick

Query Tools: TOAD, Oracle SQL Developer, Aginity Workbench and DBeaver.

Reporting tools: SAS/Base, SAS/EG 7.1, SAS/ODS, SAS/MACROS, SAS/GRAPH, SAS/ACCESS, Stored Process Server.

Scheduling Tools: TWS-Tivoli Workload Scheduler, OpCon and Automic

Programming languages: JCL and COBOL

Mainframe tools: File-Aid, File Manager and Endevor

Other Tools: Unix, Putty, Core FTP, WinSCP, Jira, ITSM and Jenkins (Metis)

Domain knowledge: Health Care Insurance (X12 standard), Banking, Retail and Eligibility Systems.

Data Modeling Tools: Microsoft Access, Erwin and MS Visio

OS: OS/390, z/OS, AIX, Windows 7, 10 and XP.

PROFESSIONAL EXPERIENCE

Confidential, Phoenix, AZ

Sr. ETL Informatica Developer

Responsibilities:

  • Assess the as-is architecture and work with the business data management team to come up with the solution best suited to the business needs.
  • Lift and shift existing Informatica mappings for Quarterly loads from Netezza to Yellowbrick and load the Quarterly data into Yellowbrick warehouse tables.
  • Work on mapping logic changes to support the new framework design and create Automic job setup to generate the LRF files.
  • Coordinate with the onsite and offshore teams to get the work done and review the test results.
  • Create a plan to start a parallel run for 26 markets, starting with reading data from source systems for each business day and loading Pre-stage and LZ tables.
  • Coordinate with database admins to get the status of data refresh from Netezza to POD and POA databases in YB and work on the CRD refresh activities.
  • Update the Unix scripts to schedule the jobs in Automic scheduler to run Data watcher, Data poller, Event trigger, Event flow, Data transformer, Data warehouse loader and CRD loader jobs.
  • Analyze any mismatches in data counts or point counts and send the report to the corresponding team to take the necessary action; a balancing query of this kind is sketched after this list.
  • Work with the reporting team to generate the Quarterly reports from the new system (YB) and send them to the business.
  • Analyze any mismatches in the quarterly report and update the team.
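
A hedged sketch of the balancing query used for the count-mismatch analysis above, assuming a hypothetical audit_counts table into which each platform's row counts are captured (all names are illustrative):

    -- Compare Netezza vs. Yellowbrick row counts captured for the same load date;
    -- any row returned is a mismatch to be reported to the owning team.
    SELECT nz.table_name,
           nz.row_count AS netezza_count,
           yb.row_count AS yellowbrick_count,
           yb.row_count - nz.row_count AS diff
      FROM audit_counts nz
      JOIN audit_counts yb
        ON yb.table_name = nz.table_name
       AND yb.load_date  = nz.load_date
     WHERE nz.platform = 'NETEZZA'
       AND yb.platform = 'YELLOWBRICK'
       AND nz.row_count <> yb.row_count;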

Environment: Informatica Power Center 10.4, Oracle 12c/19c, Netezza, Yellowbrick, Aginity Workbench, DBeaver, WinSCP, ITSM, Rally, Jenkins (Metis), Unix and Automic.

Confidential, Phoenix, AZ

Sr. ETL Informatica Developer

Responsibilities:

  • Assess the as-is architecture and work with the business data management team to come up with the solution best suited to the business needs.
  • Analyze existing Informatica mappings present in Netezza and create the mapping documents to develop the metadata setup in Oracle for the new framework design.
  • Develop Oracle stored procedures and functions in the new system to replicate the mapping logic of the existing Netezza system; a sketch follows this list.
  • Write complex SQL to incorporate the logic present in the existing Informatica mappings.
  • Work on mapping logic changes to support the new framework design and create Automic job setup to generate the LRF files.
  • Validate data in WT tables to make sure the data and point counts match Netezza before loading CRD and DW tables.
  • Coordinate with Informatica admins to migrate the end-of-day balancing mappings from Netezza to YB and create the required connection strings in YB.
  • Work on end-of-day balancing mappings in the new system to make sure the results are balanced.
  • Coordinate with the onsite and offshore teams to get the work done and review the test results.
  • Create a plan to start a parallel run for 26 markets, starting with reading data from source systems for each business day and loading Pre-stage and LZ tables.
  • Coordinate with database admins to get the status of data refresh from Netezza to POD and POA databases in YB and work on the CRD refresh activities.
  • Update the Unix scripts to schedule the jobs in Automic scheduler to run Data watcher, Data poller, Event trigger, Event flow, Data transformer, Data warehouse loader and CRD loader jobs.
  • Analyze any mismatches in data counts or point counts and send the report to the corresponding team to take the necessary action.
  • Work with the reporting team to generate the adequacy reports from the new system (YB) and send them to the business.
  • Analyze any mismatches in the adequacy report and update the team.
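
An illustrative PL/SQL sketch of one such converted mapping, assuming hypothetical lz_member (landing zone) and wt_member (working table) tables; the real procedures follow the metadata-driven framework design rather than this hard-coded form:

    -- Hypothetical procedure standing in for one converted Informatica mapping:
    -- read the landing-zone table, apply the transform, load the working table.
    CREATE OR REPLACE PROCEDURE load_wt_member AS
    BEGIN
      DELETE FROM wt_member;  -- truncate-and-load style working table

      INSERT INTO wt_member (member_id, market_cd, point_cnt)
      SELECT lz.member_id,
             UPPER(TRIM(lz.market_cd)),  -- standardization logic from the mapping
             NVL(lz.point_cnt, 0)        -- default missing point counts
        FROM lz_member lz
       WHERE lz.business_dt = TRUNC(SYSDATE);

      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;
        RAISE;  -- surface the failure so the scheduler can flag the job
    END load_wt_member;
    /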

Environment: Informatica Power Center 10.4, Oracle 12c/19c, Netezza, Yellowbrick, Aginity Workbench, DBeaver, WinSCP, ITSM, Rally, Jenkins (Metis), Unix and Automic.

Confidential, Little Rock, AR

Sr. ETL Developer

Responsibilities:

  • Understand the requirements from business analysts and convert them into technical specifications.
  • Create the designs for source to target table mappings using Microsoft Access tool.
  • Coordinate with crosswalk teams such as source teams to get the data extracts, and with DBAs to load them from outbound to inbound (Informatica server) locations and then into DB2 tables.
  • Extract the data from multiple operational sources to load the staging area, Data Warehouse and Data Marts using SCD Type 1 and Type 2.
  • Develop complex SQL queries and mappings for Data Quality, Data Masking, Data Standardization and Data Profiling processes; see the profiling sketch after this list.
  • Identify all the reference tables in Legacy systems and develop the cross-reference mappings for them in the new system to be migrated to ARIES NexGen solution.
  • Develop Informatica mappings and workflows to extract the data from legacy systems (CURAM and ANSWER), validate and transform the data to convert the individuals/cases from old systems to the new system (ARIES) as per the user requirements.
  • Work on performance tuning of Informatica and SQL code.
  • Create tasks to send end users the flagging files listing the clients converted into ARIES.
  • Prepare the scripts for DDL changes for the requirement, including changes to the stored procedures, and work on the corresponding Informatica changes according to changes in the model.
  • Prepare/modify the parameter files and promotion documents for the created mappings, sessions and workflows.
  • Promote the code changes to production with the help of Admin(s) after receiving sign-off on the tasks.
  • Work with Data Modelers/DBAs to create any new working tables or to create indexes and keys on existing tables to improve performance.
  • Work on production issues/defects and close them by following the SDLC cycle.
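
A minimal example of the data-profiling SQL referenced above, run against a hypothetical legacy_client_extract table ahead of conversion into ARIES (table and column names are illustrative):

    -- Profile null rates, key uniqueness and obvious bad values in one pass.
    SELECT COUNT(*)                                        AS total_rows,
           COUNT(*) - COUNT(client_ssn)                    AS missing_ssn,
           COUNT(DISTINCT client_id)                       AS distinct_clients,
           SUM(CASE WHEN dob > SYSDATE THEN 1 ELSE 0 END)  AS future_dob_rows
      FROM legacy_client_extract;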

Environment: Informatica Power Center 10.4, Oracle, DB2, WinSCP, Jira, Microsoft Access, SVN, Unix and OpCon.

Confidential, Minnesota, MN

Lead Informatica Developer

Responsibilities:

  • As an onsite technical lead, participated in all project phases to satisfy customer requirements.
  • Ensure the quality of project deliverables by effectively coordinating between onsite stakeholders and the offshore team.
  • Provided quick turnaround for any queries from the clients.
  • Maintained proper communication with cross-commit teams to avoid any late-stage issues.
  • Coordinated with the respective source teams to identify source systems feeding the data warehouse.
  • Analyzed the existing operational sources and was involved in developing the conceptual, logical and physical data models for the newly built data warehouse.
  • Worked on PSRs to fix abends and improve job performance.
  • Worked with Business analyst, SMEs and Business users on code changes.
  • Actively involved in creating Test Plans based on the requirements submitted by the Business analysts.
  • Documented processing times for each module and developed test cases while performing the unit, system and end-to-end testing.
  • Reviewed the components and code developed by the offshore team and performed tuning on the components to enhance execution times.
  • Reviewed the test results before sending them to the client.
  • Delivered reports to customers on time and with good quality.

Environment: Informatica Power Center 10.2, Power Exchange, DB2, Mainframe files, Jira and Unix.

Confidential, Memphis, TN

Programmer Analyst

Responsibilities:

  • Identify requirements by establishing personal rapport with potential and actual clients and with others in a position to understand service requirements.
  • Analyze requests for custom software development and data reports, provide estimates of time and cost.
  • Arrange project requirements in programming sequence by analyzing requirements and preparing workflow charts and diagrams, using knowledge of computer capabilities, programming languages and logic.
  • Provide software analysis, detail design specifications and user release notes.
  • Evaluate and identify needed processes, documentation, and service improvement opportunities.
  • Develop, test, implement, and maintain custom software and system modules.
  • Contribute to team effort by accomplishing related results as needed.
  • Provide programming and data analysis support for the custom applications.
  • Effectively prepared and published various performance reports and presentations.
  • Manage multiple projects/priorities simultaneously.
  • Coordinate and facilitate staff training as needed for software implementations.
  • Provide status reports on the progress of work to the RM.

Environment: Informatica Power Center 9.6.1, Teradata 14.0, Teradata SQL Assistant, ODM tool, Flat Files, WinSCP, SAS 9.4, SAS/Enterprise Guide 7.1, Proc SQL, Unix, Git and Control-M.

Confidential, Arkansas

Sr. SAS Programmer/Informatica Developer

Responsibilities:

  • Interacted with Subject Matter Experts (SMEs) and Business Analysts to understand the functional requirements and prepared the low-level and high-level design documents.
  • Analyzed the specifications provided by the business and gave technical responses using the Requirements Traceability Matrix (RTM) document.
  • Involved as an ETL Developer and worked extensively on transformations like Lookup, Filter, Expression, Aggregator, Router, Source Qualifier, Sorter, Sequence Generator, Normalizer, etc.
  • Designed and developed new mappings using Connected, Unconnected Lookups and Update strategy transformations.
  • Acted as a SAS programmer, developing SAS programs and generating reports in PDF, Excel and HTML using the SAS ODS facility.
  • Developed complex SAS Macros to simplify SAS code and effectively reduce coding time.
  • Created SAS commands to change file permissions in the Unix environment and to copy files from one location to another.
  • Engaged and worked with all the necessary teams to fix production P1, P2 issues.
  • Performed Root Cause Analysis (RCA) on various issues and prepared troubleshooting documents.
  • Closely monitored critical Production jobs for successful completion.
  • Reviewed the test results before sending them to the client.

Environment: Informatica Power Center, SAS 9.3, SAS/EG 5.1, SAS/BASE, SAS/Macro, SAS/Annotation, SAS/GRAPH, SAS/ODS, SAS Stored Process server, Proc SQL, DB2 and Unix.

Confidential

SAS Programmer/Mainframe Developer

Responsibilities:

  • Gather requirements from the onsite team, analyze them and prepare technical specifications.
  • Prepared the low-level design document based on the technical specifications and high-level design.
  • Develop new programs and make changes to the existing components as per the requirements.
  • Develop the new SAS code (control cards) based on the given business needs and merge it with the existing mainframe COBOL code.
  • Enhance existing COBOL programs to add new groups, i.e. add new logic to existing programs as per the mapping documents.
  • Set up new Production JCLs as well as test JCLs for new 834 groups.
  • Set up integration test job series for each 834 group.
  • Provide status reports on the progress of work to the onsite and offshore RMs.
  • Review the test results before sending them to the client/onsite team.
  • Monitor post-production jobs.

Environment: Z/OS on IBM mainframe, COBOL, JCL, VSAM, Ascential Data Stage TX 7.5 (Mercator 7.5), File-Aid, Endevor, SAS, Proc SQL, DB2 and Unix.
