Migration Lead / Architect Resume
CA
SUMMARY:
- Over 10 years of IT experience in Application Design, Data Modeling, Application Development, Data Analysis, Implementation, and Testing of Data Warehousing applications across different segments of the SDLC, including more than 9 years of experience in DW.
- Experience working on Amazon Redshift and Big Data technologies for the past year and a half.
- Hands-on experience implementing cloud solutions by moving all historical data to the cloud and building data pipelines for incremental high-volume data processing.
- Maintained outstanding relationships with Developers, Business Analysts, and Business Users to identify information needs as per business requirements.
- Extensive Data Warehouse experience using Informatica PowerMart/PowerCenter 7.1 and 8.1 as the ETL tool on Oracle, SQL Server, and DB2 databases, along with other Informatica components: Power Exchange, PowerConnect, Designer, Workflow Manager, Workflow Monitor, Repository Manager, Repository Server Administration Console, Data Quality, and Data Explorer.
- Configured the Real-Time option using Power Exchange along with Informatica.
- Strong skills in Administration, Designing, Coding, Development, and Implementation of Business Applications in web-enabled, three-tier, and Client/Server environments using Informatica, on UNIX and Windows platforms.
- Experience in business functionality implementations, developing Data Models, Data Warehouses, and Data Marts from logical models as per specific business requirements.
- Worked on Data conversion and migration projects.
- Involved in migrating the projects using Informatica to higher versions of Informatica.
- Involved in developing Logical and Physical Models using Erwin 3.5/4.0 Designer.
- Tuned SQL queries and performed refinement of the database design leading to significant improvement in system response time and efficiency.
- Experience in creating and using PL/SQL Stored Procedures, Functions, Triggers, Views, Synonyms, and Packages in databases.
- Implemented performance-tuning techniques at application, database and system levels.
- Strong knowledge of Project & Change Management, SDLC, and System Analysis & Design.
- Good knowledge of and experience in Shell Scripts.
- Used the scheduling tool Autosys for automating the process.
- Good exposure to development, testing, debugging, implementation, documentation, and production support.
- Motivating team player with outstanding Organizational and Interpersonal Skills.
- Developed complex SQL queries and PL/SQL packages.
- Involved in Setting up the whole environment for the development of ETL.
- Architected and managed Data warehouses in production up to 50 TB in size.
- Built systems with daily load volumes of 500 million records.
- Worked with tables of up to 5 TB and 15-20 billion records.
- Re-architected old processes to run in minutes, compared to previous run times of 3-4 hours, without affecting any reporting requirements.
- Automated several processes, reducing man-hours from days to hours.
TECHNICAL SKILLS:
Database: Oracle 10g/9i/8i, MS SQL Server 2000, Teradata, DB2, MySQL, Netezza, Amazon Redshift
ETL Tools: Informatica PowerCenter 7.1.2, 8.1, 8.5, 9.0, Informatica Power Exchange, Data Quality and Data Explorer
Languages: SQL, PL/SQL, XML, Python, Java, Hive
Data Modeling: Erwin 3.5/4.0, MS Visio
Packages: Siebel DAC, MS Team Foundation Server
Scripting: UNIX Shell scripting, VB script
Reporting Tools: Webfocus, MicroStrategy
PROFESSIONAL EXPERIENCE
Confidential, CA
Migration Lead / Architect
Responsibilities:
- Efficiently migrated all on-premise data from PL/SQL databases to Amazon Redshift using Java, shell scripts, and Hadoop methodologies.
- Built a framework to validate all the data between the Redshift database and the on-premise ParAccel database.
- Developed data pipelines to process data from the source systems (APIs, files on an FTP server) directly into the Redshift database.
- Scheduled the data pipelines using cron jobs to run at regular intervals.
- Designed the DB architecture and the security model.
- Designed the summary and TD tables for efficient reporting in Tableau.
- Fine-tuned slow-running reports.
- Worked on the design of the combined system architecture.
Environment: Python, Java, Amazon Redshift, S3, Hadoop, Pig, Hive, Amazon EC2, Data Pipeline
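The validation framework mentioned above could be sketched as a per-table row-count comparison between the two databases. This is a minimal illustration, not the actual framework; the table names and counts are hypothetical, and in practice each count would come from a `SELECT COUNT(*)` against Redshift and ParAccel:

```python
def compare_table_counts(source_counts, target_counts):
    """Compare per-table row counts between a source (e.g. ParAccel)
    and a target (e.g. Redshift) and report any mismatches."""
    mismatches = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table)
        if tgt != src:
            mismatches[table] = (src, tgt)  # (source rows, target rows)
    # Tables present only in the target are also flagged
    for table in target_counts.keys() - source_counts.keys():
        mismatches[table] = (None, target_counts[table])
    return mismatches

# Hypothetical counts, as would be gathered with SELECT COUNT(*) per table
on_prem = {"orders": 1_500_000, "customers": 80_000, "events": 2_000}
redshift = {"orders": 1_500_000, "customers": 79_998}
print(compare_table_counts(on_prem, redshift))
```

A real framework would add column-level checksums on top of raw counts, since matching counts alone do not prove the rows are identical.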
Confidential, CA
Technical Architect
Responsibilities:
- Migrated all on-premise data from SQL Server, Oracle, MySQL, and DB2 databases to Amazon Redshift using Python and the Attunity tool on an Amazon EC2 instance.
- Built a framework to validate all the data between the Redshift database and the on-premise data sources.
- Built a mechanism to sync the on-premise SQL Server data with the Redshift database on Amazon.
- Developed data pipelines to process data from the source systems (APIs, files on an FTP server) directly into the Redshift database.
- Scheduled the data pipelines using cron jobs to run at regular intervals.
- Successfully resized the clusters from 16 TB to 32 TB in a short timeframe.
Environment: Python, Java, Amazon Redshift, S3, Glacier, Hadoop, Pig, Hive, Attunity, Amazon EC2, Data Pipeline
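The SQL Server-to-Redshift sync mechanism above can be approximated as a watermark-based incremental extract: only rows modified since the last successful sync are pulled, and the high-water mark is advanced afterwards. The record layout and timestamps below are illustrative assumptions, not the actual implementation:

```python
from datetime import datetime

def incremental_extract(rows, last_sync):
    """Return only rows modified after the last successful sync,
    plus the new watermark to persist for the next run."""
    changed = [r for r in rows if r["updated_at"] > last_sync]
    new_watermark = max((r["updated_at"] for r in changed), default=last_sync)
    return changed, new_watermark

# Hypothetical source rows with last-modified timestamps
rows = [
    {"id": 1, "updated_at": datetime(2014, 1, 1)},
    {"id": 2, "updated_at": datetime(2014, 1, 3)},
]
changed, watermark = incremental_extract(rows, datetime(2014, 1, 2))
```

In this sketch only row 2 is extracted, and the stored watermark moves forward so the next cron-driven run picks up where this one left off.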
Confidential
Responsibilities:
- Analyzed requirements and designed the new global integrated reporting system.
- Reduced dependencies on multiple systems for pulling metadata, so as to reduce the chance of errors.
- Designed the ETL framework to process multiple files from different partners and load them into the DW with various audit tables and checks in place.
- Designed the framework to trigger the reports once the ETL jobs complete.
- Coordinated with the MSTR/reporting team to understand the tool's capabilities and provide the optimal solution.
- Proposed solutions for critical business logic to meet the needs.
- Involved in setting up the complete process from development to production migration.
- Worked on integrating the data from external systems post acquisitions.
- Involved in setting up the process to support the system post production.
Environment: Informatica PowerCenter 9.0, SQL Server, Oracle, Perl, Control-M
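The partner-file ETL framework with audit checks described above could look roughly like the following. The file name, CSV layout, and audit-record structure are illustrative assumptions; a real framework would persist the audit entries to the DW's audit tables rather than an in-memory list:

```python
import csv
import io

def load_partner_file(name, content, audit_log):
    """Parse one partner CSV, count its records, and append an audit
    entry before the rows would be loaded into the DW."""
    rows = list(csv.DictReader(io.StringIO(content)))
    status = "OK" if rows else "EMPTY_FILE"
    audit_log.append({"file": name, "rows": len(rows), "status": status})
    return rows

audit = []
rows = load_partner_file("partner_a.csv", "id,amount\n1,10\n2,20\n", audit)
```

Keeping the audit entry per file makes it easy to reconcile loaded row counts against what each partner claims to have sent, which is the kind of check the bullet above refers to.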
Confidential, CA
Migration Team Lead
Responsibilities:
- Analyzed all the existing systems and their functionality.
- Created the equivalent tables and objects in the Netezza database.
- Led the team in migrating the code to support Pushdown Optimization.
- Determined how to achieve the same logic as the existing system.
- Reviewed the test cases presented by the team.
- Helped the team re-architect certain scenarios.
- Provided the migration checklist documents to the team.
- Reviewed the issues raised in the issue logs.
- Coordinated the parallel run and supported UAT.
- Migrated the whole code base to production.
Environment: Informatica PowerCenter 9.0, SQL, NZPLSQL, Netezza, FITS, Aginity, Linux, Teradata, Oracle, and DB2
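One item a migration checklist like the one above typically covers is verifying that every source object was recreated in Netezza. A minimal sketch of that check, with hypothetical object inventories standing in for the real catalog queries:

```python
def missing_objects(source_objects, target_objects):
    """Return source tables/objects that have no counterpart in the
    target (Netezza) schema, normalizing case for the comparison."""
    target = {o.lower() for o in target_objects}
    return sorted(o for o in source_objects if o.lower() not in target)

# Hypothetical object inventories pulled from the two catalogs
oracle_objs = ["CUSTOMERS", "ORDERS", "ORDER_ITEMS"]
netezza_objs = ["customers", "orders"]
print(missing_objects(oracle_objs, netezza_objs))  # ['ORDER_ITEMS']
```

Case normalization matters here because Oracle stores identifiers in upper case by default while Netezza folds them to lower case, so a naive string comparison would flag every object as missing.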
Confidential, Torrance, CA
Sr. Informatica Architect / Team Lead / Netezza Developer
Responsibilities:
- Analyzed all the sources and designed the LDM and PDM of the Data Mart.
- Created the source to target mapping document.
- Reviewed the stored procedures developed.
- Reviewed the Informatica code.
- Designed the automation part of the whole process.
- Reviewed the test cases for the whole Datamart.
- Prepared Production deployment documents and checklists.
- Coordinated with the offshore team for the whole cycle of the project.
Environment: Informatica PowerCenter 8.6, SQL, NZPLSQL, Netezza, Mercury Quality Centre, Aginity
Confidential, San Francisco, CA
QA Lead
Responsibilities:
- Prepared and executed the test case documents for all the data and report validations.
- Worked closely with the development team to get bugs fixed and reverified.
- Tracked all the test cases in Quality Center.
- Reviewed the stored procedures developed.
- Reviewed the Informatica code.
- Reviewed the Production deployment documents and checklists.
- Handed off all the code and test cases to the UAT team.
Environment: Informatica PowerCenter 8.6, SQL, NZPLSQL, Netezza, Mercury Quality Center, Aginity, Business Objects
Confidential, CA
Sr. Informatica Architect / Team Lead
Responsibilities:
- Installed all the components related to Siebel Warehouse development, including Informatica 7.1.2 and Siebel Analytics packages.
- Created Database Objects like Tables, Views, Partitions and Indexes.
- Involved in creation of PL/SQL Packages, Functions and Stored Procedures using Dynamic SQL and handled exceptions.
- Created Indexes and table partitions to fine tune the performance of SQL Queries.
- Created a customized process for the customer profitability analysis.
- Performed QA for the mappings developed and prepared testing documents.
- Loaded data into XML Targets using different transformations and mappings according to the requirements.
- Coordinated with the onsite/offshore team during the whole life cycle of the project.
- Prepared all the documents related to the Project.
- Created UNIX shell scripts to automate the execution process and to execute post-production jobs.
- Involved in the performance tuning of the various jobs.
- Handled huge POS data on Teradata data base.
- Implemented the Real-Time option using Power Exchange with Informatica for the required real-time interfaces.
Confidential, LA, CA
Migration Team Lead/Administrator
Roles & Responsibilities:
- Installed all the components (Informatica PowerCenter 8.5.1 server and client) related to Informatica.
- Set up three different environments for Development, QA, and Production, and installed all the software on the servers.
- Coordinated with the DBAs on the creation of the repository databases.
- Converted all the UNIX scripts and VB scripts to the new environment with the help of the offshore team.
- Backed up the repositories on the old Informatica server, imported them into the new version 8.5.1, and upgraded.
- Attended weekly meetings to review the status of the upgrade.
- Installed Power Exchange with the Real-Time option, enabled the LogMiner options with the help of the DBAs, and set up the whole environment for a particular project that required the real-time option.
- Allocated tasks to the offshore team for testing the whole new environment and obtained status updates every day.
- Coordinated with Informatica Technical Support to resolve issues and applied the patches provided by the support team.
- Migrated all the code from Development to QA and then to Production, phase by phase.
- Supported the Production support teams in fixing the issues.
- Created new users and assigned roles, responsibilities, and folders in the new environment.
- Documented the whole work related to the Installation.
Environment: PowerCenter 8.5.1, PL/SQL, SQL, UNIX with Oracle, Oracle 10g, SQL Server 2005, Siebel DAC, Informatica Power Exchange
Confidential, LA, CA
Sr. Informatica Designer / Developer
Responsibilities:
- Performed extraction, transformation, and loading of data from various source systems into the Siebel application on a daily and monthly basis.
- Loaded data into XML Targets for the application team's usage.
- Created a complex release assignment mapplet to be used for various Processes like Invoices, Orders and POS data Load.
- Extensively used the Power Center client tools Designer, Workflow Manager, and Workflow Monitor.
- Involved in Unit and Integration Testing of Informatica Mappings and sessions.
- Prepared program specifications for PL/SQL Procedures and Functions to do the conversion.
- Involved in Technical Documentation, Unit test, Integration test and writing the Test plan.
- Involved in database development by creating Oracle PL/SQL Functions, Procedures, Triggers, Packages, and Records.
- Scheduled workflows and sessions depending on client requirements.
- Created Dimension models by defining the relationships and cardinalities between different entities in Erwin.
- Assisted DBA in creating the Physical data model.
- Participated in weekly status meetings and conducting internal and external reviews as well as formal walkthroughs among various teams, and documenting the proceedings.
- Created UNIX scripts to be called by Autosys for scheduling the jobs.
- Profiled and cleansed the data from different source systems using Informatica Data Explorer and Data Quality.
Environment: Informatica PowerMart/PowerCenter 8.1, Informatica Data Quality and Data Explorer, PL/SQL, UNIX with Oracle, MS SQL, Oracle 10g, DB2, flat files, Autosys
Confidential, LA, CA
Informatica Developer
Responsibilities:
- Involved in data modeling and design of the whole system.
- Extracted data from sources including Oracle and fixed-width and delimited flat files, and loaded it into DB2.
- Created mappings using different transformations like Aggregator, Expression, Stored Procedure, Filter, Joiner, Lookup, Router and Update Strategy
- Transformed the data to align with the business requirements.
- Handled operating system tasks by generating Pre and Post-session UNIX shell scripts.
- Involved in optimizing the performance by eliminating target, source, mapping and session bottlenecks.
- Created ad hoc queries from the database using SQL to provide information to the business units.
- Used Autosys tool to schedule sessions and batches via Shell Scripts.
- Analyzed data loads, coordinated the changes for proper data transition, and adhered to quality standards.
- Tested the interfaces by writing different test cases.
- Prepared production deployment documents and checklists.
- Reported data errors.
Environment: Informatica PowerMart/PowerCenter 7.1, PL/SQL, UNIX with Oracle, Oracle 8i, DB2, Autosys
Confidential, Irvine, CA
Informatica Developer
Responsibilities:
- Developed mappings using Informatica's data integration technology to extract, transform, and load data from Oracle and flat files.
- Tested the business rules and prepared the test cases.
- Prepared the detailed unit test cases for ETL mapping.
- Converted the stored procedures in Oracle SQL into Informatica mappings.
- Tuned the mapping for its better performance.
- Involved in the development of reporting.
- Validated the reports by writing test cases and SQL scripts.
- Provided post-production support.
Environment: Oracle 7.3, SQL*Loader, SQL, PL/SQL, Erwin 3.5, Windows NT 4.0, Netezza, Informatica PowerCenter 7.1
Oracle Developer
Sales Order Processing System
Responsibilities:
- Created dimensional data models for different modules using Erwin 3.5.
- Developed complex SQL queries and PL/SQL packages, including stored procedures & functions, to fetch data for reporting and to accomplish several computations.
- Created various SQL and PL/SQL scripts for verification of the required functionalities.
- Generated a number of reports for management to review system functionality against the old legacy system.
- Worked with various functional experts to implement their functional knowledge into working procedures.
- Worked on optimizing existing procedures and functions using PL/SQL
- Created various Database triggers using PL/SQL
- Developed several forms and reports in the process; also converted several standalone PL/SQL procedures/functions into packaged procedures for code reusability, modularity, and control.
Environment: Oracle 7.3, SQL*Loader, Erwin 3.5, Windows NT 4.0, Java