
Informatica Sr. Developer/Data Architect Resume


Austin, TX

Objective

  • Continuous learner, augmenting my skill set with opportunities to deliver my best. My focus is on providing technical leadership to ensure optimal design, development and implementation of ETL solutions, with expertise in designing and developing BI solutions using technologies such as Informatica, Teradata and UNIX.

SUMMARY

  • An accomplished software professional with 10+ years of experience in Informatica, Ab Initio, Control-M, UNIX, Oracle and Teradata.
  • Experienced with both traditional SDLC and Agile methodologies.
  • Experienced in creating data models, building Informatica ETL processes, dimensional data modeling (primarily star schema), performing data migrations, defining user interfaces, executing large projects, assuring quality and managing deployments.
  • Developed staging areas to extract, transform and load new data from OLTP databases into the data warehouse.
  • Strong in dimensional modeling, star/snowflake schemas, extraction, transformation and load, and aggregates.
  • Strong in converting business requirements into project documentation and technical specifications.
  • Extensive experience in ETL design, development and maintenance using Teradata, Oracle SQL, PL/SQL, SQL*Loader and Informatica PowerCenter 9.x/8.x/7.x.
  • Extensively used Informatica client tools: PowerCenter Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer), Workflow Manager, Repository Manager, Workflow Monitor and the Informatica Admin Console.
  • Extensively worked with Informatica PowerCenter transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Sequence Generator, Normalizer, Union and XML Source Qualifier.
  • Strong knowledge of and experience with the Informatica PowerCenter ETL tool and Informatica Developer.
  • Performed data profiling and provided analysis of datasets requested by business users using the Informatica Data Analyst tool.
  • Extensively worked on extracting, transforming and loading data from various sources such as Oracle, SQL Server, Teradata and flat files.
  • Responsible for all activities related to the development, implementation, administration and support of ETL processes.
  • Hands-on experience tuning mappings and identifying and resolving performance bottlenecks at various levels: sources, targets, transformations, mappings and sessions.
  • Well versed in developing complex SQL queries with unions and multi-table joins, and experienced with views.
  • Experienced in environments requiring direct customer interaction during the design specification, development, acceptance testing and implementation phases.
  • Able to handle multiple tasks, demonstrate initiative and adaptability, and carry responsibilities independently while remaining a proactive team member in an Agile methodology.
  • Evaluated ETL tools and recommended the most suitable solutions for the business needs at hand.
  • Excellent knowledge of unit testing, regression testing, client acceptance testing, production implementation and maintenance.
  • Experienced at creating effective test data and developing thorough unit test cases to ensure successful execution of data loads.
  • Able to learn and adapt to new technologies rapidly.
  • Excellent communication, presentation, documentation and analytical skills, using tools such as Visio and PowerPoint.
  • Good knowledge of Informatica and advanced Informatica concepts such as pushdown optimization.
  • Good knowledge of Teradata and its FastExport and FastLoad utilities (a minimal FastLoad sketch follows this list).
  • Proficient in UNIX commands and Shell Scripting.
  • Good knowledge of the Control-M scheduling tool.
  • Working knowledge of Ab Initio, HANA and Data Services.
  • Good knowledge of data architecture.
  • Good knowledge of PowerDesigner.
  • Exposure to Informatica B2B Data Exchange, which supports an expanding diversity of customers and partners and their data with capabilities beyond typical B2B solutions.
  • Exposure to Informatica B2B Data Transformation, which transforms structured, unstructured and semi-structured data types while complying with the multiple standards that govern those data formats.
  • Exposure to MFT (Managed File Transfer) for transmitting extracted files to vendors.
  • Good hands-on experience with Informatica B2B Data Transformation.
  • Excellent debugging and troubleshooting skills, with an enthusiastic attitude toward supporting and resolving customer problems; able to work independently.
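
A minimal FastLoad sketch, assuming a pipe-delimited extract loaded into an empty staging table; the logon values, table, columns and file path are hypothetical placeholders, not values from an actual project:

```sh
#!/bin/ksh
# Hypothetical FastLoad wrapper: bulk-loads a pipe-delimited extract
# into an empty staging table. All names below are placeholders.
fastload <<'EOF'
LOGON tdprod/etl_user,etl_pass;
SET RECORD VARTEXT "|";
DEFINE cust_id   (VARCHAR(18)),
       cust_name (VARCHAR(60))
FILE = /data/in/customer.dat;
BEGIN LOADING stg.customer
    ERRORFILES stg.customer_err1, stg.customer_err2;
INSERT INTO stg.customer (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
END LOADING;
LOGOFF;
EOF
```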

TECHNICAL SKILLS

ETL: Informatica 9.1/8.6/7.1, Ab Initio, SAP Data Services, BO

RDBMS: Oracle 8i/9i, Teradata 13/14, SQL Server, HANA, SFDC

Languages: C, SQL, PL/SQL and UNIX

Operating Systems: Windows XP/NT/2007, UNIX

Tools: Control-M (scheduling tool), Quality Center, PowerDesigner

PROFESSIONAL EXPERIENCE

Confidential - Austin, TX

Informatica Sr. Developer/Data Architect

Environment: Informatica 9.1, Informatica B2B Data Transformation, Teradata 14, DS, HANA, Control-M, PowerDesigner, Oracle, SQL Server, SFDC

Responsibilities:

  • Working as a Data Architect/Analyst and ETL Lead on this project.
  • Gathering business requirements by working closely with business users, project leads and developers.
  • Analyzing business requirements and designing conceptual and logical data models.
  • Enforcing naming standards and the data dictionary for data models.
  • Performing SQL query performance analysis on the Teradata and HANA databases (a tuning sketch follows this list).
  • Owning and managing all changes to the data models; creating data models, solution designs and data architecture documentation for complex information systems.
  • Articulating the reasoning behind data model design decisions and incorporating team member feedback to produce high-quality data models.
  • Creating high-quality mapping documents that the development team can easily understand.
  • Creating data models, presenting them in the Data Architect forum and getting sign-off from the team.
  • Performing gap analysis on critical issues.
  • Working on data profiling, data quality, data organization and metadata, and providing analysis of datasets requested by business users using the Informatica Data Analyst tool.
  • Participating in ETL design, development and review of the entire code base.
  • Proactively working on all data-related issues and data gaps.
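
On the Teradata side, this kind of analysis typically combines EXPLAIN with statistics collection; a sketch with hypothetical object and logon names (the HANA equivalent, EXPLAIN PLAN, is omitted):

```sh
#!/bin/ksh
# Hypothetical BTEQ session: EXPLAIN shows the optimizer's plan for a
# join, and COLLECT STATISTICS refreshes the column demographics the
# optimizer relies on. All object names are placeholders.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pass
EXPLAIN
SELECT o.order_id, c.cust_name
FROM   dw.orders   o
JOIN   dw.customer c ON c.cust_id = o.cust_id
WHERE  o.order_dt >= DATE '2014-01-01';

COLLECT STATISTICS ON dw.orders COLUMN (cust_id);
.LOGOFF
.QUIT
EOF
```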

Confidential

Informatica Sr. Developer/Data Architect

Environment: Informatica 9.1, Informatica B2B Data Transformation, Teradata 14, DS, HANA, Control-M, PowerDesigner, Oracle, SQL Server, SFDC

Responsibilities:

  • Worked as a Data Architect on this project.
  • Gathered business requirements by working closely with business users, project leads and developers.
  • Analyzed business requirements and designed conceptual and logical data models.
  • Performed SQL query performance analysis on the Teradata and HANA databases.
  • Owned and managed all changes to the data models; created data models, solution designs and data architecture documentation for complex information systems.
  • Articulated the reasoning behind data model design decisions and incorporated team member feedback to produce high-quality data models.
  • Involved in ETL design, development and review of the entire code base.
  • Involved in data profiling, data quality, data organization and metadata (a sample profiling query follows this list).
  • Proactively worked on all data-related issues and data gaps.
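
The profiling itself was done in the Informatica Data Analyst tool; as a rough SQL equivalent of the checks involved, here is a hypothetical BTEQ run (table, column and logon names are placeholders):

```sh
#!/bin/ksh
# Hypothetical profiling run: row count, distinct count and null count
# for one column of a staging table, via BTEQ. Names are placeholders.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pass
SELECT COUNT(*)                                         AS total_rows,
       COUNT(DISTINCT cust_id)                          AS distinct_ids,
       SUM(CASE WHEN cust_id IS NULL THEN 1 ELSE 0 END) AS null_ids
FROM   stg.customer;
.LOGOFF
.QUIT
EOF
```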

Confidential

Informatica Sr. Developer/Data Architect

Environment: Informatica 9.1, Teradata 14, DS, HANA, Control-M, PowerDesigner

Responsibilities:

  • Coordinating onsite and offshore communication for requirements gathering.
  • Mainly involved in ETL design, development and review of the entire code base.
  • Developing code using transformations such as Expression, Filter and SQL according to the mapping documents.
  • Developing the Global Tie-Out process.
  • Performing unit testing, documenting the results and logging defects in TFS.
  • Developing and scheduling the Control-M jobs for the daily/weekly loads that invoke the Informatica workflows and sessions associated with the mappings (a sample invocation follows this list).
  • Using UNIX scripts to split, zip and SCP files from the GXS server, and to clean up and archive files.
  • Working on data profiling, data quality, data organization and metadata.
  • Participating in the SIT deployment process.
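
One common way a Control-M job invokes a workflow is through Informatica's pmcmd utility; a sketch under that assumption, where the service, domain, folder and workflow names and the credential variables are hypothetical:

```sh
#!/bin/ksh
# Hypothetical wrapper called by Control-M: starts a workflow and
# propagates its exit status so the scheduler can detect failures.
pmcmd startworkflow \
    -sv INT_SVC -d DOM_ETL \
    -u "$INFA_USER" -p "$INFA_PASS" \
    -f FLD_DAILY_LOAD \
    -wait wf_daily_load
rc=$?
if [ $rc -ne 0 ]; then
    echo "wf_daily_load failed with return code $rc"
    exit $rc
fi
```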

Confidential

Informatica Sr. Developer

Environment: Informatica 9.1, Teradata 13, Control-M, Oracle, SQL Server, SFDC

Responsibilities:

  • Involved in onsite and offshore communication for requirements gathering.
  • Mainly involved in ETL design, development and review of the entire code base.
  • Prepared impact analyses for efforts needed beyond the original scope.
  • Involved in handling production problem tickets and issue resolution.
  • Created support documents to transition the work to support teams.
  • Performed unit testing, documented the results and logged defects in TFS.
  • Involved in the SIT deployment process.
  • Involved in data profiling, data quality, data organization and metadata.
  • Prepared code migration documents.

Confidential

Informatica Developer

Environment: Informatica 8.6, Teradata 13, Control-M, Oracle, SQL Server, SFDC

Responsibilities:

  • Involved in onsite and offshore communication for requirements gathering.
  • Mainly involved in ETL design, development and review of the entire code base.
  • Prepared impact analyses for efforts needed beyond the original scope.
  • Involved in handling production problem tickets and issue resolution.
  • Created technical design documents, code review documents and support documents.
  • Developed code using transformations such as Expression, Filter and SQL according to the mapping documents.
  • Developed the Global Tie-Out process.
  • Performed unit testing, documented the results and sent test data to vendors so they could validate it on their side.
  • Used UNIX scripts to split, zip and SCP files to the TumbleWeed server, and to clean up and archive files (a sketch follows this list).
  • Involved in data profiling, data quality, data organization and metadata.
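
A sketch of the post-extract file handling described above; the paths, host name and 30-day retention window are hypothetical placeholders:

```sh
#!/bin/ksh
# Hypothetical post-extract handling: split a large extract, compress
# the pieces, ship them via scp and archive the original.
set -e
SRC=/data/out/extract.dat
OUTDIR=/data/out/parts
ARCH=/data/archive

split -l 500000 "$SRC" "$OUTDIR/extract_part_"           # 500k lines per piece
gzip -f "$OUTDIR"/extract_part_*                         # compress each piece
scp "$OUTDIR"/extract_part_*.gz etl@tumbleweed-host:/inbound/   # push to the transfer server
mv "$SRC" "$ARCH/extract_$(date +%Y%m%d).dat"            # archive the original
find "$ARCH" -name 'extract_*.dat' -mtime +30 -exec rm -f {} \;  # 30-day cleanup
```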

Confidential

Informatica Developer

Environment: Informatica 8.1, Teradata 13, Control-M, Oracle, SQL Server, SFDC

Responsibilities:

  • Analyzed business requirements.
  • Worked with Data Architects on logical and physical model designs.
  • Worked closely with the onsite team, vendors and data stewards.
  • Analyzed data, derived business logic and performed gap analysis.
  • Worked on tuning the queries.
  • Prepared the unit test plan.
  • Developed mappings using transformations such as Source Qualifier, Expression and SQL.
  • Performed unit testing, documented the results and sent test data to vendors so they could validate it on their side; to produce the test data, moved data from production to the dev environment before sending it to the vendors.
  • Developed and scheduled Control-M jobs for the daily load that invoked the Informatica workflows and sessions associated with the mappings.
  • Used UNIX scripts to split, zip and SCP files to the TumbleWeed server, and to clean up and archive files.
  • Involved in data profiling, data quality, data organization and metadata.

Confidential

ETL Developer

Environment: Informatica 8.1, Teradata 13, Control-M, Oracle, SQL Server, SFDC

Responsibilities:

  • Analyzed business requirements and worked with the vendors to clarify them.
  • As no Data Architect was assigned to this project, derived the business logic and data model myself.
  • Worked closely with the onsite team, vendors and data stewards.
  • Analyzed data, derived business logic and performed gap analysis.
  • Worked on tuning the queries.
  • Prepared the unit test plan.
  • Developed mappings using transformations such as Source Qualifier, Aggregator, Expression and SQL.
  • Performed unit testing, documented the results and sent test data to vendors so they could validate it on their side.
  • To produce the test data, moved data from production to the dev environment before sending it to the vendors.
  • Developed and scheduled Control-M jobs for the daily load that invoked the Informatica workflows and sessions associated with the mappings.
  • Involved in data profiling, data quality, data organization and metadata.
  • Used UNIX scripts to split, zip and SCP files to the TumbleWeed server, and to clean up and archive files.

Confidential

ETL Developer

Environment: Informatica 8.0, Teradata 8, Control-M, Oracle, SFDC

Responsibilities:

  • Analyzed business requirements.
  • Worked with Data Architects on logical and physical model designs.
  • Worked closely with the onsite team, vendors and data stewards.
  • Analyzed data, derived business logic and performed gap analysis.
  • Learned and worked on Teradata.
  • Worked on tuning queries and optimizing mappings for better performance and efficiency.
  • Prepared the unit test plan.
  • Created the LRF file structures and scripts for the different target tables.
  • Developed complex mappings using transformations such as Source Qualifier, Aggregator, Expression and SQL.
  • Performed unit testing, documented the results and sent test data to vendors so they could validate it on their side.
  • To produce the test data, moved data from production to the dev environment before sending it to the vendors.
  • Developed and scheduled Control-M jobs for the daily/weekly/monthly loads that invoked the Informatica workflows and sessions associated with the mappings.
  • Involved in data profiling, data quality, data organization and metadata.
  • Used UNIX scripts to split, zip and SCP files to the TumbleWeed server, and to clean up and archive files.

Confidential

ETL Developer

Environment: Informatica 7.1, Oracle 8i

Responsibilities:

  • Understood the business process.
  • Used Oracle 9i as the target database; the source was a combination of flat files and Oracle tables.
  • Developed code using transformations such as Aggregator, Joiner, Expression, Filter and Router according to the mapping specification.
  • Created sessions and workflows to run the mappings.
  • Prepared unit test plans.
  • Interacted with the onsite coordinator to get things done.
