Informatica B2B Developer Resume
South Brunswick, NJ
SUMMARY:
- Around 12 years of work experience in the IT industry focusing on data analysis, application design and development, data modeling and implementation of data warehousing systems.
- Advanced knowledge and strong concepts in ETL, decision support systems and data warehousing/databases.
- Excellent developer and analyst in Informatica data warehouse environment.
- Excellent with Informatica PowerCenter 10.1, Data Explorer, PowerExchange, MDM, Studio, B2B Data Exchange and MFT 10.2.
- Experienced in data modeling making use of dimensional data modeling, star schema/ snowflake schema, creating fact and dimension tables, physical and logical data modeling.
- Good understanding of relational database environments.
- Experienced in managing ETL on a large data warehouse including development, implementation and ongoing support of automated load and validations processes.
- Involved in the overall architectural design and strategy of ongoing design and migration from development to UAT and UAT to production.
- Proven technical and analytical skills.
- Good programming experience in C++, Java and shell scripts.
- Expertise in full life cycle projects, from conceptualization to implementation and maintenance.
- Experience with Oracle 12c, 11g, 10g, 9i, 8i and 7.x on SCO Unix, Sun Solaris and Windows 95/98/NT.
- Working knowledge on Informatica Master Data Management (MDM) and IDQ.
- Hands-on experience in design and configuration of landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries/custom queries and packages.
- Proficient in all phases of the system development life cycle, including requirements definition, data conversion, system implementation, system testing and acceptance.
- Excellent oral/written communication skills, strong decision-making skills, organizational skills, analytical problem-solving skills and a good team player.
TECHNICAL SKILLS:
ETL: Informatica 6.2/7.1.3/8.6.1/9.0.1/9.5.1/10.2.
Application Packages: VC++ 6.0, TOAD, Oracle SQL Developer 3.0.04.34, Visual Basic, OpenGL, Trillium and ColdFusion
Web Programming: XML, HTML, VBScript, JavaScript, JSP, Macromedia
Data Modeling: Erwin 7.3.8 SP2/ 9.5, Visio 2007, UML
Data loading tools: UNIX shell scripts, SQL*Loader, SQL*Plus
Languages: C, C++, C#, Java, COBOL, PL/SQL and SQL
RDBMS: Oracle 12c/11g/10g/9i/8i, Teradata 12.0, MS SQL Server 2005/2008, Sybase, DB2, MS Access 2007.
Operating Systems: Windows NT/XP/95/98/2000, IBM AIX 4.2/4.3, Sun Solaris 2.6/2.7.
EXPERIENCE:
Confidential, South Brunswick, NJ
Informatica B2B Developer
Responsibilities:
- Apply strong Informatica PowerCenter knowledge and hands-on experience with Informatica B2B Data Exchange (DX), DIH and the healthcare domain.
- Combine strong communication skills with job-related technical skills.
- Install and register the set of transformations used in PowerCenter workflows to process B2B Data Exchange documents.
- Test the initial setup with one initial DX process agreed upon by the client and vendor, in coordination with B2B developers.
- Prepare a platform administration guide and the best practices document.
- Manage user authentications, policies, groups and categories and the associated privileges.
- Carry out repository management activity.
- Manage or coordinate activities associated with Operation Console.
- Create and manage B2B Data Exchange schedules.
- Create and manage applications to associate with PowerCenter workflows (see the sketch after this list).
- Manage and customize partner and account information.
- Manage and archive B2B Data Exchange events.
- Monitor load and performance of the overall environment and identify any performance glitches in the system.
- Produce health reports for the environment.
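Associating a DX application with a PowerCenter workflow could be exercised from the command line roughly as below. This is a minimal sketch only: the integration service, domain, folder, workflow name and credential variables are placeholders, not the actual project objects.

```sh
#!/bin/ksh
# Hypothetical wrapper: start the PowerCenter workflow associated with a
# B2B Data Exchange application, wait for completion, and surface the result.
INT_SVC="IS_DX"                      # placeholder integration service
DOMAIN="Domain_DX"                   # placeholder domain
FOLDER="B2B_DX"                      # placeholder folder
WORKFLOW="wf_process_dx_document"    # placeholder workflow

# INFA_USER / INFA_PWD are assumed to be exported by the calling environment.
pmcmd startworkflow -sv "$INT_SVC" -d "$DOMAIN" \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f "$FOLDER" -wait "$WORKFLOW"
rc=$?

if [ $rc -ne 0 ]; then
    echo "Workflow $WORKFLOW failed with return code $rc" >&2
fi
exit $rc
```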
Environment: Informatica B2B Data Exchange 10.2, MFT 10.2, Informatica PowerCenter 10.2, Oracle Database 12c, Unix/Linux.
Confidential, Billerica, Massachusetts
Technical Consultant/ ETL Lead
Responsibilities:
- Developed complex Informatica ETL code to read data, transform data and write data into the target systems in the enterprise.
- Implemented dynamic functionality to read a changing set of values and execute them as-is, hard-coded or SQL-based, using the results to call APIs from Informatica in a way that can be extended to other business systems.
- UNIX shell scripting for writing routines to call and execute custom business functionality.
- Handled Master Data Management (MDM) of the interface with complex routines and workflows.
- Implemented dynamic reusability of the code for other systems with only a few changes to the set of parameters.
- Designed and Implemented data quality scorecards and dashboards.
- Used Informatica to code the software that builds data hubs.
- The data hub sends OpenPGP-encrypted data files to the Confidential sFTP server.
- Decrypted the files and validated their format.
- Validated the data in the files and loaded it into the IMPACT database.
- Routed errors arising from decryption, file format checks or data validation to stakeholders (see the sketch after this list).
- Invoked the PL/SQL code to load dimensions from the IMPACT source system.
- Effectively analyzed and troubleshot any issues and defects encountered, and followed up with the business on product defects.
- Support release management, deployment and code migration processes.
- Used the SVN repository to manage code versioning, labeling, moving and storage.
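A minimal sketch of the decrypt-validate-notify flow described above; the inbound directory, distribution list, delimiter and expected column count are assumptions for illustration, not the actual Confidential setup.

```sh
#!/bin/ksh
# Hypothetical handling of OpenPGP-encrypted inbound files from the sFTP drop.
IN_DIR=/data/inbound                     # placeholder landing directory
ERR_LIST="dw-support@example.com"        # placeholder distribution list
EXPECTED_COLS=12                         # assumed pipe-delimited layout

for enc in "$IN_DIR"/*.pgp; do
    [ -e "$enc" ] || continue
    plain="${enc%.pgp}"

    # Decrypt with a key already on the keyring; report failures to stakeholders.
    if ! gpg --batch --yes --output "$plain" --decrypt "$enc"; then
        echo "Decryption failed for $enc" | mailx -s "Feed decryption error" "$ERR_LIST"
        continue
    fi

    # Format check: every record must carry the expected number of columns.
    bad=$(awk -F'|' -v n="$EXPECTED_COLS" 'NF != n' "$plain" | wc -l)
    if [ "$bad" -gt 0 ]; then
        echo "$bad malformed records in $plain" | mailx -s "Feed format error" "$ERR_LIST"
        continue
    fi

    # Validated files are then picked up by the PowerCenter load into IMPACT.
done
```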
Environment: Informatica PowerCenter 10.1, Informatica B2B Data Exchange 9.5.1, IDQ, Oracle Database 12c Enterprise Edition Release 12.1.0.2.0, PuTTY 0.67, Tortoise SVN 1.9.7, WinSCP 5.11.3, Oracle SQL Developer 4.1.5.21, UNIX shell scripts, PL/SQL and SQL, Atlassian JIRA v6.0.7, Windows 8 Enterprise, IBM AIX 4.3.
Confidential, Edison, New Jersey
Sr. ETL Developer/ Analyst
Responsibilities:
- Strong knowledge of design, architecture and development in a SQL Server 2012 environment.
- Significant experience interfacing with business analysts.
- Gathered and analyzed requirements and held requirements-gathering sessions with data analysts.
- Created technical spec document, data mapping document for the ETL process.
- Created unit test documents for the ETL code (see the reconciliation sketch after this list).
- Tested the ETL objects to optimize load performance.
- Implemented loads using change data capture for real-time processing.
- Designed mappings from Facets sources to the data warehouse in Greenplum.
- Developed workflows to source SQL Server data.
- Worked with deployment team for code deployment from development to SIT, UAT and production.
- Issue resolution and results turnover using ALM defect management.
- Designed and Implemented data quality scorecards and dashboards.
- Excellent teamwork and understanding with the users.
- Generated reports from the EDW.
- System documentation.
- Enterprise job scheduling using ActiveBatch workload automation.
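A minimal sketch of the source-to-target row count reconciliation used during unit testing of these loads; the server names, databases, tables and credential handling are assumptions for illustration only.

```sh
#!/bin/ksh
# Hypothetical row-count reconciliation between a SQL Server source table and
# its Greenplum target; credentials are assumed to come from the environment
# (SQLCMD* variables / .pgpass) rather than being embedded here.
SRC_COUNT=$(sqlcmd -S sqlsrv01 -d staging -U etl_user -h -1 -W \
            -Q "SET NOCOUNT ON; SELECT COUNT(*) FROM dbo.claims;")
TGT_COUNT=$(psql -h gpmaster -d edw -U etl_user -t -A \
            -c "SELECT COUNT(*) FROM dw.fact_claims;")

echo "source=$SRC_COUNT target=$TGT_COUNT"
if [ "$SRC_COUNT" -ne "$TGT_COUNT" ]; then
    echo "Row counts do not match - investigate before sign-off" >&2
    exit 1
fi
```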
Environment: Microsoft SQL Server 2012, Informatica PowerCenter 9.5.1, IDQ, SQL Server Management Studio, Visual Studio 2010, Greenplum database, pgAdmin III 1.16, PostgreSQL Tools 1.16.1, ActiveBatch, Tortoise SVN repository browser, HP ALM defect management.
Confidential, Portland, Maine
Senior ETL Developer
Responsibilities:
- Strong knowledge of design, architecture and development in a DB2 environment.
- Significant experience interfacing with business analysts.
- Gathered and analyzed requirements and held requirements-gathering sessions with data analysts.
- Created technical spec document, data mapping document for the ETL process.
- Created unit test documents for the ETL code.
- Implemented data profiling, created scorecards and reference tables, and documented data quality metrics/dimensions such as accuracy, completeness, duplication, validity and consistency.
- Analyzed trend charts from scorecards to determine the thresholds to be applied in further development.
- Developed business rules for standardization, cleansing and validation of data in various formats.
- Very strong knowledge of Informatica Data Quality transformations such as Address Validator, Parser, Labeler, Match, Exception, Association, Standardizer and other significant transformations.
- Tested the ETL objects to optimize load performance.
- Implemented CDC loads using XML for real-time processing.
- Designed mappings from Teradata sources to the data warehouse in DB2.
- Developed workflows to source Teradata data using Teradata Parallel Transporter.
- Wrote parameter files and shell scripts to automate the load process and job control (see the sketch after this list).
- Worked with shared services for code migration from development to itest, acceptance and production.
- Excellent issue resolution and results turnover.
- Excellent teamwork and understanding with the users.
- The project delivered high-level reports used by the business users and their teams.
- System documentation.
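A minimal sketch of the parameter file and shell script pattern used for job control; the folder, workflow and parameter names are placeholders, and the INFA_USER/INFA_PWD credentials are assumed to be exported by the calling environment.

```sh
#!/bin/ksh
# Hypothetical job-control script: build a PowerCenter parameter file for the
# current load date, then run the workflow against it.
PARAM_FILE=/infa/params/wf_load_claims.prm     # placeholder path and names
LOAD_DT=$(date +%Y-%m-%d)

# The \$\$ escapes keep the literal $$ prefix that PowerCenter mapping parameters use.
cat > "$PARAM_FILE" <<EOF
[FOLDER_DW.WF:wf_load_claims]
\$\$LOAD_DATE=$LOAD_DT
\$\$SRC_SYSTEM=TERADATA
EOF

pmcmd startworkflow -sv IS_DW -d Domain_DW \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f FOLDER_DW -paramfile "$PARAM_FILE" -wait wf_load_claims
```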
Environment: Informatica PowerCenter 9.5.1, Informatica IDQ, PowerExchange CDC with SQL Server, PowerCenter Data Validation Client 9.5.2.0, Teradata 14.00.03, IBM DB2 9.1.7, Teradata SQL Assistant, DB2 AIX v9.7, DB2 Linux v10.1, WinSCP 5.5.4.
Confidential, New York City
ETL Lead
Responsibilities:
- Strong knowledge of design and development in Oracle and DB2 environments.
- Significant experience interfacing with business users.
- Gathering, analyzing requirements and preparing the data elements spreadsheet.
- Created data mapping document for the ETL process.
- Tested the ETL objects to optimize load performance.
- Designed mappings from flat file and DB2 sources to the data warehouse, which is in Oracle.
- Developed slowly changing dimensions Type 2 and Type 1 (see the sketch after this list).
- Created parameter files and shell scripts in UNIX.
- SQL and PL/SQL coding.
- Worked with shared services for code migration from development to QA and production.
- Automated daily, weekly and monthly workflows to run different jobs.
- Excellent production support and issue resolution.
- Excellent teamwork and understanding with the users.
- The project delivered high-level reports used by senior management and their teams.
- System documentation.
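A minimal sketch of the Type 2 logic behind those dimensions, issued through a SQL*Plus here-document from a shell wrapper; dim_customer, stg_customer, their columns and the sequence are hypothetical stand-ins for the actual dimension and staging objects.

```sh
#!/bin/ksh
# Hypothetical SCD Type 2 load: expire changed current rows, then insert new versions.
sqlplus -s "$ORA_USER/$ORA_PWD@$ORA_DB" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE ROLLBACK

-- Expire the current version of any customer whose tracked attributes changed.
UPDATE dim_customer d
   SET d.eff_end_dt   = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.cust_name <> d.cust_name OR s.cust_addr <> d.cust_addr));

-- Insert a fresh current version for new customers and for the rows just expired.
INSERT INTO dim_customer (customer_key, customer_id, cust_name, cust_addr,
                          eff_start_dt, eff_end_dt, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.cust_name, s.cust_addr,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');

COMMIT;
EOF
```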
Environment: Informatica PowerCenter 9.0.1, IBM DB2 9.1.7, Oracle 11g, TOAD for Oracle 10.6, Erwin 7.3.8 SP2, TOAD for DB2 5.0, PuTTY 0.62, WinSCP 5.1.2, Beyond Compare 3.3.5, SQL Server 2008 R2, IBM AIX UNIX 7.1.
Confidential, Roseland, New Jersey
Data Warehouse Developer
Responsibilities:
- Strong knowledge of design and development in an Oracle 11g environment.
- Significant experience interfacing with DBAs, designers, developers.
- Gathering, analyzing and normalizing requirements, source & target system data analysis (RDBMS, hands-on SQL).
- Identifying data quality issues and recommending resolutions.
- Data mapping, data extraction, transformation & load.
- Worked closely with the business community to assess business needs, define requirements.
- Tested and modified the ETL objects to optimize load performance.
- Designed mappings from flat file and RDBMS sources to the data warehouse, which is in Oracle.
- Wrote functions, triggers, sequences and stored procedures in the Oracle database.
- Developed slowly changing dimensions.
- Generated DDL from Erwin and created the objects in database.
- Created views and materialized views as required for the reports (see the sketch after this list).
- Unit testing, integration testing of the developed objects.
- Code migration from development to QA and production.
- Automated daily, weekly and monthly workflows to run different jobs.
- Excellent production support.
- The project helped in producing valuable reports, which had consolidated information for the business owners.
- End user training and system documentation.
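A minimal sketch of a reporting materialized view of the kind mentioned above, created through a SQL*Plus here-document; mv_monthly_orders and its base tables are hypothetical.

```sh
#!/bin/ksh
# Hypothetical reporting materialized view, refreshed on demand after the nightly load.
sqlplus -s "$ORA_USER/$ORA_PWD@$ORA_DB" <<'EOF'
CREATE MATERIALIZED VIEW mv_monthly_orders
  BUILD IMMEDIATE
  REFRESH COMPLETE ON DEMAND
AS
SELECT TRUNC(o.order_dt, 'MM') AS order_month,
       c.region,
       COUNT(*)                AS order_cnt,
       SUM(o.order_amt)        AS order_amt
  FROM fact_orders o
  JOIN dim_customer c ON c.customer_key = o.customer_key
 GROUP BY TRUNC(o.order_dt, 'MM'), c.region;

-- Refresh issued after each load completes:
EXEC DBMS_MVIEW.REFRESH('MV_MONTHLY_ORDERS', 'C');
EOF
```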
Environment: Informatica PowerCenter 8.6.1, Informatica Data Explorer 9.0.1, Oracle 11g, SQL Server 2008, Windows.
Confidential, Hartford
Data Warehouse ETL Developer
Responsibilities:
- Analyzed business requirements and created functional specifications.
- Generated business models and use case analysis.
- Translated functional requirements into technical specification.
- Identifying data quality issues and recommending resolutions.
- Data mapping, data extraction, transformation & load.
- Assisted the customer development group with extracting and analyzing large, complex data.
- Member of core team responsible for OLAP data warehouse implementation and decision support and data issue resolution.
- Custom development on UNIX server using PL/SQL, UNIX (Korn) shell scripting.
- Tuning PL/SQL and SQL on very large data.
- Wrote shell scripts for running batches.
- Designed mappings from flat file and RDBMS sources to the data warehouse, which is in Oracle.
- Extracted data from source systems to a staging database running on Teradata, using utilities such as MultiLoad, TPump and FastLoad (see the sketch after this list).
- Created a set of reusable transformations and mapplets to create surrogate keys and to filter data coming from various sources.
- Used unconnected lookups in various mappings.
- The project helped in producing valuable reports, which had consolidated information for the business owners.
- End user training and system documentation.
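A minimal FastLoad sketch of the Teradata staging load mentioned above, driven from a shell here-document; the TDPID, credentials, staging table and pipe-delimited layout are placeholders.

```sh
#!/bin/ksh
# Hypothetical FastLoad of a pipe-delimited extract into an empty Teradata staging table.
# TDPID, credentials, table and file names below are placeholders.
fastload <<'EOF'
SESSIONS 4;
ERRLIMIT 25;
LOGON tdprod/etl_user,etl_password;
DATABASE stg_db;

SET RECORD VARTEXT "|";
DEFINE claim_id  (VARCHAR(20)),
       member_id (VARCHAR(20)),
       claim_amt (VARCHAR(20))
FILE = /data/extract/claims.txt;

BEGIN LOADING stg_db.stg_claims
      ERRORFILES stg_db.stg_claims_err1, stg_db.stg_claims_err2;

INSERT INTO stg_db.stg_claims (claim_id, member_id, claim_amt)
VALUES (:claim_id, :member_id, :claim_amt);

END LOADING;
LOGOFF;
EOF
```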
Environment: Informatica PowerCenter 8.1/8.6, Oracle 10g, TOAD, Teradata V2R6, Business Objects, IBM AIX.
Confidential, Roseland, New Jersey
Data Warehouse ETL Developer
Responsibilities:
- Designed and customized data models for data warehouse, supporting data from multiple sources.
- Translated business requirements into technical specifications.
- Experience in extract, transform and load (ETL) tools to maintain, design, develop, test, implement and document data warehouse solutions.
- Extensively worked with Informatica to load data from flat files and Oracle to the target database.
- Gathering, analyzing and normalizing requirements, source & target system data analysis (RDBMS, hands-on SQL), identifying data quality issues and recommending resolutions.
- Data mapping, data extraction, transformation and load.
- Involved in the overall architectural design, strategy of on-going design and migration from development to QA and production.
- Data integration analysis.
- Involved in the logical data modeling and physical data modeling using Erwin.
- Wrote code to access different instances of different databases.
- Responsible for the design and architecture of the ETL component of the data warehouse.
- Worked on Informatica tools: Source Analyzer, Mapping Designer, mapplets and transformations.
- Created mapplets, stored procedures, and functions and used them in different mappings.
- Wrote triggers for data manipulation (see the sketch after this list).
- Wrote PL/SQL code and was involved in performance enhancement.
- Used shortcuts to sources and transformations in the Informatica environment.
- Used parameters at session levels to tune the performance of mappings.
- Written test cases for data validation.
- Documentation of technical specifications, user requirements, ETL design specifications, mapping inventory.
- Created views, and reports were generated from them.
- The project helped in producing valuable reports, which had consolidated information for the business owners.
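A minimal sketch of a data-manipulation trigger of the sort referenced above, applied through a SQL*Plus here-document; stg_orders and its audit columns are hypothetical.

```sh
#!/bin/ksh
# Hypothetical row-level trigger that stamps audit columns on insert and update.
sqlplus -s "$ORA_USER/$ORA_PWD@$ORA_DB" <<'EOF'
CREATE OR REPLACE TRIGGER trg_orders_audit
BEFORE INSERT OR UPDATE ON stg_orders
FOR EACH ROW
BEGIN
  IF INSERTING THEN
    :NEW.created_dt := SYSDATE;
    :NEW.created_by := USER;
  END IF;
  :NEW.updated_dt := SYSDATE;
  :NEW.updated_by := USER;
END;
/
EOF
```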
Environment: Informatica PowerCenter 7.1.3, Oracle 10g, Oracle SQL Developer, Erwin 4.1.4, Crystal Reports, Windows NT.
Confidential, Hartford
Data Warehouse ETL Developer
Responsibilities:
- Designed and customized data models for data warehouse, supporting data from multiple sources.
- Modeled the data warehouse datamarts using a star schema and the warehouse using relational concepts.
- Translated business requirements into technical specifications.
- Experience in extract, transform and load (ETL) tools to maintain, design, develop, test, implement and document data warehouse solutions.
- Extensively worked with Informatica to load data from flat files, Oracle, DB2 and MS SQL Server to the target database.
- Gathering, analyzing and normalizing requirements, source & target system data analysis (RDBMS, hands-on SQL), identifying data quality issues and recommending resolutions.
- Data mapping, data extraction, transformation and load.
- Strong knowledge of design and development in an Oracle 10g environment.
- Significant experience interfacing with DBAs, business customers, business analysts, data analysts, developers and IT operations staff.
- Data integration analysis.
- Involved in the overall architectural design, strategy of on-going design and migration from development to QA and production.
- Responsible for the design and architecture of the ETL component of the data warehouse.
- Worked on Informatica tools: Source Analyzer, Mapping Designer, mapplets and transformations.
- Created the Informatica repository in an Oracle database.
- Designed mappings from flat file and RDBMS sources to the data warehouse, which is in Oracle.
- Created a set of reusable transformations to create surrogate keys and to filter data coming from various sources.
- Used most of the transformations, such as Source Qualifier, Aggregator, Filter, Expression, unconnected and connected Lookups, and Update Strategy.
- Managed data warehouse development, implementation and ongoing support of automated load and validations processes.
- Improved the mapping performance using SQL.
- Created mapplets and used them in different mappings.
- Optimized query performance, session performance.
- Customized data by adding calculations, summaries and functions.
- Created database triggers for data security.
- Wrote SQL, PL/SQL code and stored procedures (see the sketch after this list).
- Used parameters and variables at mapping and session levels to tune the performance of mappings.
- Designed and documented the validation rules, error handling and test strategy of the mappings.
- End user training and system documentation.
- Tuning PL/SQL and SQL on very large data.
- Documentation of technical specifications, user requirements.
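A minimal sketch of a summary-calculation stored procedure of the kind described above, compiled and called through a SQL*Plus here-document; sum_daily_sales and fact_sales are hypothetical tables.

```sh
#!/bin/ksh
# Hypothetical procedure that rebuilds one day of a summary table from the fact table.
sqlplus -s "$ORA_USER/$ORA_PWD@$ORA_DB" <<'EOF'
CREATE OR REPLACE PROCEDURE load_daily_sales_summary (p_load_dt IN DATE) AS
BEGIN
  DELETE FROM sum_daily_sales WHERE sale_dt = p_load_dt;

  INSERT INTO sum_daily_sales (sale_dt, product_id, sale_cnt, sale_amt)
  SELECT p_load_dt, product_id, COUNT(*), SUM(sale_amt)
    FROM fact_sales
   WHERE sale_dt = p_load_dt
   GROUP BY product_id;

  COMMIT;
END load_daily_sales_summary;
/

-- Example call for yesterday's load:
EXEC load_daily_sales_summary(TRUNC(SYSDATE) - 1);
EOF
```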
Environment: Informatica PowerCenter 6.2/7.1.3, Oracle 10g/9i, Windows NT, Trillium, IBM AIX 4.3.
Confidential, Hartford
Data Warehouse ETL Developer
Responsibilities:
- Implemented data warehouse processes such as data access paths, data extraction/transformation and load (ETL), logical and physical data modeling using Erwin, data mapping loads, and source-to-target reconciliation.
- Worked closely with the business community to assess business needs, define requirements and implement solutions within the data warehouse.
- Assisted the customer development group with extracting and analyzing large, complex data.
- Member of core team responsible for OLAP data warehouse implementation and decision support and data issue resolution.
- Extensive experience in maintaining, designing, developing, testing, implementing and documenting data warehouse solutions.
- Used UNIX shell scripts to execute different jobs and perform operations.
- Custom development on a UNIX server using PL/SQL, UNIX Korn shell scripting and SQL*Loader, testing and modifying the ETL procedures to optimize load performance.
- Implemented performance tuning concepts in Informatica.
- Wrote shell scripts for running batches.
- Extensive experience interfacing with DBAs, business customers, developers.
- Used the repository manager to give permissions to users, create new users and repositories.
- Extracted data from COBOL sources using the Normalizer transformation, where DB2 was the source database.
- Involved in data cleansing, especially converting the data formats.
- Strong knowledge of design and development in an Oracle 9i environment.
- Modified the existing batch process, shell script, and PL/SQL procedures for effective logging of error messages into the log table.
- Worked with the Oracle 9i environment using multiple instances, parallelism and partitioning.
- Extracted data from different systems into the repository using Informatica PowerConnect.
- Used different data warehouse techniques such as star schema and snowflake schema.
- Incorporated traditional ER model with star schema to resolve the extra information needed.
- Worked on flat files coming from mainframe sources, using the Normalizer transformation to load the data into the Oracle database.
- Created different source definitions to extract data from flat files and relational tables for PowerMart.
- Created ETL transformations for applying the key business rules and functionalities on the source data.
- Created sessions and batches using Informatica PowerCenter.
- Wrote PL/SQL procedures for processing business logic in the database.
- Tuning of SQL queries for better performance.
- Interaction with the end users in requirements gathering and designing technical and functional specifications.
- Database extraction and filtering using UNIX shell scripts (awk, sed).
- Shell scripts were run through UNIX cron to schedule batch sessions (see the sketch after this list).
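A minimal sketch tying together the cron scheduling, awk/sed filtering and SQL*Loader pieces mentioned above; the schedule, paths, control file and record layout are placeholders.

```sh
#!/bin/ksh
# nightly_load.sh - hypothetical batch step: filter the raw extract with awk/sed,
# then bulk-load it with SQL*Loader.
# Scheduled from cron, e.g.:  0 2 * * * /etl/bin/nightly_load.sh >> /etl/logs/nightly_load.log 2>&1
RAW=/etl/data/raw_extract.dat            # placeholder paths throughout
CLEAN=/etl/data/clean_extract.dat

# Keep only well-formed pipe-delimited records (assumed 14 fields) and trim trailing blanks.
awk -F'|' 'NF == 14' "$RAW" | sed 's/[[:space:]]*$//' > "$CLEAN"

# Bulk load through a pre-built SQL*Loader control file.
sqlldr userid="$ORA_USER/$ORA_PWD@$ORA_DB" \
       control=/etl/ctl/stg_orders.ctl \
       data="$CLEAN" \
       log=/etl/logs/stg_orders.log \
       bad=/etl/logs/stg_orders.bad
```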
Environment: Windows NT, Oracle 9i, DB2, Informatica 6.2 PowerCenter/ PowerMart, UNIX (Sun Solaris).