Sr. Informatica Lead Resume
Edison, NJ
SUMMARY
- Around 8 years of IT experience with an excellent understanding of the Software Development Life Cycle in Retail, Manufacturing, Banking, and Logistics.
- Worked as a Data Analyst on complex banking compliance projects (Know Your Customer (KYC), Genesis Data Integration - Confidential), involving end-to-end coordination from source systems to the production support team, interaction with source systems, preparation of data profiling documents and TRL specifications, and FRD walkthroughs with the business.
- Involved in understanding business processes and coordinating with business users to gather specific user requirements. Involved in source data analysis and identified business rules for data migration and for developing data warehouse data marts.
- Worked as a Team Lead managing a team of 6 to 8 members.
- Worked as a production support analyst in a DWH environment.
- Experience supporting data extraction, transformation, and loading processes in a corporate-wide ETL solution using Informatica Power Center (Designer, Manager, and Monitor), along with data mining and OLTP.
- Worked on pushdown optimization and partitioning techniques for better data-load performance.
- Worked with Informatica Workflow Manager to run workflows/mappings and monitored session logs in Informatica Workflow Monitor.
- Experienced in data modeling (star schemas, fact and dimension tables) for Informatica using the Erwin tool.
- Strong skills in Data Analysis, Data Requirement Analysis and Data Mapping for ETL Processes.
- Worked on complex and advanced transformations, including Java, XML, Transaction Control, and SQL transformation techniques.
- Wrote shell scripts to automate the ETL process, including archiving source and target files, creating parameter list files for indirect loads, and deleting older archived files.
- Involved in automating code moves between Development, UAT, and Production environments using dynamic deployment scripts.
- Solid experience with database query tools such as Toad, SQL Developer, Rapid SQL, SQL Navigator, and SQL*Plus.
- Hands-on experience with Tableau and Cognos; good knowledge of the Spotfire reporting tool.
- Strong in writing stored procedures, functions, triggers, and packages using PL/SQL.
- Proficient in Application Lifecycle Management, including requirements gathering, release management, test reports, and defect tracking, using industry-standard tools such as HP Quality Center (QC) 10/ALM and JIRA.
- Fair knowledge of Ab Initio architecture and ETL methodology execution in Ab Initio.
- Good experience in unit testing initial (reconcile) and daily (delta) ETL loads.
- Fair experience with production support using BMC support management applications.
- Good knowledge of external API connectors with Informatica, such as Eloqua.
- Good experience installing, configuring, and testing Hadoop ecosystem components.
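The shell-based file handling described above (archiving source files, building indirect-load list files, purging old archives) can be sketched as follows. The directory layout, the `.dat` extension, and the 30-day retention window are illustrative assumptions, not details from any specific engagement.

```shell
#!/bin/sh
# Sketch of the ETL file-handling automation described above.
# stage_etl_files SRC_DIR ARCH_DIR LIST_FILE RETENTION_DAYS
#   1. writes an indirect-load (parameter) list file, one source file per line
#   2. moves each listed file into ARCH_DIR with a load-date stamp
#   3. purges archived copies older than RETENTION_DAYS
stage_etl_files() {
    src=$1; arch=$2; list=$3; keep=$4
    stamp=$(date +%Y%m%d)

    : > "$list"                         # start with an empty list file
    for f in "$src"/*.dat; do           # .dat extension is an assumption
        [ -e "$f" ] || continue         # glob matched nothing
        printf '%s\n' "$f" >> "$list"
    done

    mkdir -p "$arch"
    while IFS= read -r f; do
        mv "$f" "$arch/$(basename "$f").$stamp"
    done < "$list"

    # delete archived copies older than the retention window
    find "$arch" -type f -mtime +"$keep" -exec rm -f {} +
}
```

A wrapper like this would typically be invoked from an Informatica Command task before or after the indirect load consumes the list file.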
TECHNICAL SKILLS
Languages: SQL, C, C++, PL/SQL, Core Java
RDBMS: Oracle 11g/10g, Teradata, Sybase, DB2
Operating Systems: UNIX, Windows Vista / XP / 2000 / NT / 98 / 95
RDBMS Query Tools: SQL Developer, SQL Navigator, SQL Plus
Testing Tools: Win Runner, Test Director
Test Management Tools: HP Quality Center 10/ ALM
Version Control Tools: Visual SourceSafe (VSS), SVN
Defect Tracking Tools: HP Quality Center ALM, JIRA
Scheduling Tools: Autosys, Control-M, and Informatica Scheduler
ETL Tools: Informatica Power Center, Ab Initio
Other Tools: Putty, TextPad, Compare It, WinSCP, Araxis, Confidential Comp
Reporting Tools: Tableau, Qlikview, Cognos, Business Objects
Big Data Ecosystems: Hadoop, MapReduce, HDFS, HBase, Hive, Pig, Sqoop, Cassandra, Oozie.
PROFESSIONAL EXPERIENCE
Confidential - Edison, NJ
Sr. Informatica Lead
Responsibilities:
- Involved in defining business requirements based on an understanding of the data available in the operational source systems.
- Thorough understanding of Global Consumer Banking and Industrial Consumer Banking, covering both large-scale and small firms.
- Automated file handling, such as archiving files, deleting old files, and creating parameter files, by writing shell scripts and invoking them from Command tasks in Informatica.
- Participated in various business meetings to understand data concepts, and coordinated with the SIT, UAT, and production teams to spot defects and assign them to the appropriate teams based on severity, suggesting how to debug the issues and what the probable root causes might be.
- Participated in High level and Low level design.
- Involved in developing DWH Architecture, complex mappings and data modeling.
- Experienced in writing complex SQL queries involving multiple joins, correlated queries, subqueries, etc.
- Expertise in creating DDL, DML and TCL scripts in SQL Server and Oracle databases.
- Proficient in PL/SQL procedures, functions, triggers, and packages.
- Maintaining Source Data Mapping and ETL Data Mapping templates.
- Used various transformations, such as Stored Procedure, connected and unconnected Lookups, Update Strategy, Filter, and Joiner transformations, to implement complex business logic. Involved in studying the existing mapping logic and restructuring it by developing reusable mapplets, worklets, and reusable transformations using the Transformation Developer.
- Worked on validating mapping logic to identify performance issues, including identifying unused ports using link propagation, applying SQL overrides in the Source Qualifier to filter unneeded data out of the mapping logic, partitioning, etc.
- Worked on performance tuning involving partitioning, pushdown optimization, and other performance tuning techniques.
- Efficient in handling cache management and worked extensively on dynamic cache techniques.
- Developed Tableau workbooks to perform year over year, quarter over quarter, YTD, QTD and MTD type of analysis.
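The Command-task automation above typically wraps `pmcmd`, Informatica's workflow-launch CLI. A minimal sketch follows; the integration service, domain, and user defaults are placeholders, and the password is assumed to live in an environment variable read via `-pv`.

```shell
#!/bin/sh
# Builds a pmcmd startworkflow command line for a given folder and workflow.
# INFA_IS / INFA_DOMAIN / INFA_USER defaults are placeholders only.
build_pmcmd_cmd() {
    folder=$1; wf=$2
    printf 'pmcmd startworkflow -sv %s -d %s -u %s -pv INFA_PWD -f %s -wait %s\n' \
        "${INFA_IS:-IS_DEV}" "${INFA_DOMAIN:-Domain_Dev}" \
        "${INFA_USER:-etl_user}" "$folder" "$wf"
}

# Example: print the command that would start a hypothetical daily load.
build_pmcmd_cmd FIN_DM wf_daily_load
```

Printing the command (rather than executing it directly) makes the wrapper easy to dry-run and log before the scheduler invokes it for real.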
Environment: Informatica 9.6.1, Teradata, UNIX, Tableau, Oracle 9i, SQL Developer, Operator Console
Confidential - MN
ETL Informatica Developer / Onsite Coordinator
Responsibilities:
- Involved in gathering requirements and preparing design and technical documents.
- Interacted with source systems to understand data formats, and prepared data profiling documents to capture the minimum requirements for migrating data from source systems to MDI systems.
- Participated in functional walkthrough meetings and proposed the feasibility of incorporating data from different source systems into local Data Quality processors.
- Coordinate with BA team and develop all ETL processes.
- Responsible for Extracting, Transforming and Loading data from Oracle, SQL Server, XML files and Flat files.
- In-depth knowledge of cache management for Joiner, Aggregator, and Lookup transformations, including dynamic, persistent, and static caches.
- Developed complex mappings, such as Slowly Changing Dimension Type II (timestamping), in the Mapping Designer.
- Worked extensively on mappings and different types of transformations.
- Involved in debugging Informatica Objects.
- Creating/Building and Scheduling workflow and Sessions using the Workflow manager.
- Used Unix Shell Scripts to automate pre-session and post-session processes using Pre SQL and Post SQL for dropping and creating indexes.
- Monitor all production issues and inquiries and provide efficient resolution for same.
- Worked extensively on different types of transformations like Source qualifier, expression, Aggregator, Router, filter, update strategy, lookup, sorter, Normalizer, sequence generator, etc.
- Worked with XSD and XML files generation through ETL process.
- Involved in creating Oracle indexes and writing PL/SQL procedures, triggers, functions, and queries.
- Coordinated with testing team to make testing team understand Business and transformation rules being used throughout ETL process.
- Involved in preparing documentation capturing day-to-day learnings from work activities and simplified ways to explore Informatica tool options, which was approved and shared on the Knowledge Management portal.
- Worked on a POC to migrate data through Hadoop by creating Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
- Managed and reviewed Hadoop log files.
- Enabled speedy reviews and first mover advantages by using Oozie to automate data loading into the Hadoop Distributed File System and PIG to pre-process the data.
- Developed complex QlikView reports for analyzing KPIs, sales, demand and supply, product inventory, etc.
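The Hive trend-comparison POC above might be driven by a shell wrapper that generates the comparison query for a given load date. Every table and column name here (fresh_sales, edw_ref_product, hist_metrics, etc.) is a hypothetical illustration, not a name from the actual project.

```shell
#!/bin/sh
# Sketch of the Hive query behind the trend-comparison POC.
# All table/column names are hypothetical.
build_trend_query() {
    load_dt=$1
    cat <<EOF
SELECT f.product_id,
       f.units_sold,
       h.avg_units_90d,
       (f.units_sold - h.avg_units_90d) AS delta_vs_history
FROM   fresh_sales f
JOIN   edw_ref_product p ON p.product_id = f.product_id
JOIN   hist_metrics h    ON h.product_id = f.product_id
WHERE  f.load_dt = '$load_dt';
EOF
}

# Print the query for one load date; it would typically be submitted with
#   hive -e "$(build_trend_query 2015-06-01)"
build_trend_query 2015-06-01
```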
Environment: Informatica 9.1, UNIX, Center Point Quote tool, HP ALM, QlikView, DB2, Oracle 11g, BAAN CRP Systems, Hadoop ecosystem.
Confidential - MN
ETL Informatica Developer / Onsite Coordinator
Responsibilities:
- Identified the requirements and arranged peer reviews, walkthroughs and sign off meetings.
- Involved in Gathering and Analyzing Business Requirements, High Level Designing and Detail Technical Design of ETL application.
- Led a team in executing the price adjustment project: created mapping documentation for the source-to-target transformations and coordinated with offshore on implementing the transformation logic as specified in the mapping document, which included fetching pricing and part-numbering information for every Confidential-brand product offered (all windows and doors) from BAAN ERP and other source systems.
- Wrote shell scripts to automate the ETL process, including archiving source and target files, creating parameter list files for indirect loads, and deleting older archived files.
- Involved in identifying the performance bottlenecks through session logs and work log monitors in Informatica and suggesting solutions to improve the performance to offshore team.
- Worked on performance tuning with pushdown optimization on the source databases, partitioning sessions using dynamic, range, and hash partitioning techniques.
- Wrote effective SQL queries accessing objects in other databases through DB links.
- Good Experience in Initial (Reconcile) and Daily (Delta) ETL loads.
- Responsible for developing PL/SQL stored procedures, functions, packages, triggers, and other database objects.
- Involved in automating Code move between Development, UAT and production environments by using Deployment Scripts dynamically. Also have experience in User management in Admin Console.
- Skilled in effective usage of Workflow manager tasks to improve the ETL process execution.
- Involved in offshore coordination in guiding them to understand the complex mapping documents and getting the deliverables on time, reviewing the deliverables with onsite team and moving the code to Dev. Environment.
- Created a process on how the offshore deliverables should be presented to onsite leads to ensure proper tracking of progress and difficulties faced by offshore and dependencies needed from offshore.
- Conducted knowledge-sharing sessions with offshore about new learnings from the customer side, on both the technical and business fronts.
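The dynamic code-move between environments could be driven by a small wrapper around `pmrep`, Informatica's repository CLI. The repository/domain naming scheme, deploy user, and control-file convention below are placeholders; whether a shop deploys by folder or by deployment group varies.

```shell
#!/bin/sh
# Sketch of a dynamic code-move driver (DEV -> UAT -> PROD).
# Repository/domain names and the control-file convention are assumptions;
# the repository password is assumed to be in the INFA_PWD variable (-X).
build_deploy_cmds() {
    env=$1; folder=$2
    printf 'pmrep connect -r REPO_%s -d Domain_%s -n deploy_user -X INFA_PWD\n' "$env" "$env"
    printf 'pmrep deployfolder -f %s -c %s_deploy.xml -r REPO_%s\n' "$folder" "$folder" "$env"
}

# Print (rather than execute) the UAT move for a hypothetical folder.
build_deploy_cmds UAT FIN_DM
```

Generating the commands per target environment is what makes the move "dynamic": the same script promotes a folder to UAT or PROD depending on its argument.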
Environment: Informatica Power Center 9.2, UNIX, SQL Developer, Oracle 10g, HP ALM, Windows NT, MS Excel, VSS (Visual SourceSafe), Putty, WinSCP, HP QC 10.0.
Confidential
ETL Developer / Team Lead
Responsibilities:
- Worked as a Team Lead for gathering business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in HP Quality Center 10.
- Developing various graphs as per the business requirements.
- Creating the Psets, DMLs, and XFRs.
- Worked on distributing the graph components in different phases thereby enabling load balancing and recovery checkpoints.
- Made code changes to transformations within graphs.
- Involved in unit testing data mapping and conversion in a server based data warehouse.
- Creation of SDLC documentation, including User Requirements, Technical Specifications, and Development Reports.
- Involved in extensive testing of batch processing for positive and negative scenarios.
- Check-in and Check-out the graphs between Sandbox and EME.
- Debugging of graphs using Checkpoints and log files.
- Good exposure to Sandbox and EME implementation.
- Worked with Multi File (MFS) and Serial file source systems.
- Written Shell scripts to run the Ab Initio developed jobs (.ksh).
- Tested the ETL process both before and after the data validation process. Tested the messages published by the ETL tool and the data loaded into various databases.
- Involved in validating inbound and outbound batch processes.
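The `.ksh` job wrappers mentioned above are often paired with a log scan so batch failures surface immediately. A rough sketch follows; the failure markers in the grep pattern are illustrative, not the exact strings Ab Initio emits.

```shell
#!/bin/sh
# run_graph JOB LOG: runs an Ab Initio wrapper script, captures its output,
# and fails if the exit code is nonzero or the log contains failure markers.
# The grep patterns are illustrative assumptions.
run_graph() {
    job=$1; log=$2
    "$job" > "$log" 2>&1
    rc=$?
    if [ "$rc" -ne 0 ] || grep -Eqi 'error|failed' "$log"; then
        echo "graph failed (rc=$rc), see $log" >&2
        return 1
    fi
    echo "graph succeeded, log at $log"
}
```

A scheduler such as Autosys would call this wrapper and key off its exit status when deciding whether downstream jobs may run.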
Environment: Ab Initio GDE 2.14, Ab Initio Co>Operating System 2.15, JIRA, Rational ClearCase, Batch Jobs, Shell Scripting, Autosys, UNIX, WinSQL, Putty.
Confidential
ETL Informatica Developer / POC
Responsibilities:
- Acted as one point of contact from offshore team and worked upon understanding the requirements and arranged peer reviews, walkthroughs and sign off meetings before starting up with an enhancement.
- Attended the Requirements reviews, Design reviews, Code reviews along with onsite team to understand the overall Process flow and Source to Target Mappings.
- Proven experience working in Agile methodology, actively interacting with BAs, developers, and the infrastructure team.
- Involved in designing the mappings as per the Data Mapping document provided by onsite Leads and effectively delivered the deliverables in XML formats for onsite reviews.
- Skilled in debugging the issues using Informatica Debugger, reported by onsite team and worked on solutions based on the overall understanding of the root cause.
- Used the Informatica PowerCenter Workflow Manager and its run-time engine to schedule and run the solution, test and debug its components, and monitor the resulting executable versions.
- Skilled in using session logs, workflow logs, and reject files to find performance issues, and proposed the bottleneck areas to the onsite team.
- Performed SQL validation to verify the data extracts and record counts in the database tables.
- Performed usability testing frequently to ensure continuous availability of online services to Personal clients.
- Writing SQL Queries to retrieve data from database.
- Conducted weekly batch testing to incorporate changes in new builds.
- Provided the onsite lead with weekly reports on development efforts, issue status, dependencies needed to resolve the issues and other related information needed for effective offshore deliverables.
- Used SVN to maintain all the older codes for reference purposes and to maintain a track of changes in a local repository.
- Ensured all code (mappings, workflows) was checked in for onsite review, and ensured smooth transitions between the offshore team and onsite leads.
Environment: Informatica Power Center 9.1, HP Quality Center 9.2, SQL*Plus, UNIX, Flat Files, XML Files, Shell Scripting, Cognos.
Confidential
Production Support- ETL
Responsibilities:
- Managed incidents in BMC.
- Converted incidents to problem management.
- Gathered requirements from business users.
- Analyzed problem-management issues and provided solutions for them.
- Prepared technical designs based on the requirements.
- Responsible for Extracting, Transforming and Loading data from different sources and placing it into Data warehouse
- Worked extensively on mappings, workflows, sessions and different types of transformations.
- Involved in writing SQL queries and UNIX shell scripts.
- Wrote shell scripts for loading data.
- Analyzing, monitoring and running UNIX crontab jobs and Informatica jobs
- Provided production support with quick responses, resolving issues within a minimal time frame.
- Met tight deadlines and worked under immense pressure.
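The crontab-driven monitoring of Informatica jobs described above might be sketched as a small log check; the schedule, paths, and the ERROR marker are illustrative assumptions.

```shell
#!/bin/sh
# Illustrative crontab entry polling the load logs every 15 minutes:
#   */15 * * * * /app/etl/bin/check_load.sh >> /app/etl/logs/monitor.log 2>&1
#
# check_load LOG: counts ERROR lines in a session log and prints either an
# OK line or an ALERT line that an incident-ticketing hook could pick up.
check_load() {
    log=$1
    errs=$(grep -c 'ERROR' "$log" || true)   # grep -c prints 0 on no match
    if [ "$errs" -gt 0 ]; then
        echo "ALERT: $errs error line(s) in $log"
        return 1
    fi
    echo "OK: $log is clean"
}
```

Emitting a single machine-parsable ALERT line keeps the cron output easy to feed into whatever raises the BMC incident.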