
Sr. Informatica Developer Resume


Newark, CA

SUMMARY

  • Business Intelligence professional with 10 years of experience in the architecture, design, development, and maintenance of ETL, Hadoop, DWH, BI, and Analytics systems for banking, insurance, and sales applications.
  • Good understanding of the full project life cycle using Big Data, Hadoop, cloud services, and ETL.
  • Building data pipelines using Python, HDFS, Hive, and shell scripting.
  • Hands-on experience using Hive partitioning and bucketing, executing different types of joins on Hive tables, and implementing Hive SerDes such as JSON and Avro.
  • Expertise in writing complex SQL against databases such as Hive, Oracle, SQL Server, and Teradata, using them as both source and target systems.
  • Wrote Python scripts for API calls to bring in JSON files from external vendors.
  • Wrote various Python file-validation and cleanup scripts using Pandas and NumPy.
  • Well versed in Python packages such as NumPy, SciPy, Requests, Matplotlib, Pickle, Pandas, and JSON.
  • Experienced in ingesting structured data from RDBMS to HDFS/Hive using Sqoop.
  • Working with different file formats such as Parquet, JSON, Avro, ORC, and CSV.
  • Expert in performance optimization in Hive and MapReduce.
  • Strong understanding of MapReduce, input splits, mappers, and reducers.
  • Experience utilizing Hive performance techniques: cost-based optimization (CBO), vectorization, and parallel execution.
  • Designed and developed data quality (DQ) mechanisms, exception/error handling, checksum validations, and data integration processes.
  • Designed, developed, tested, and implemented Data Warehouse/Business Intelligence applications using the ETL tool Informatica 9.x/8.x and the reporting tools Business Objects, OBIEE, and Cognos.
  • Proficient in data warehousing techniques: Slowly Changing Dimensions, surrogate key assignment, normalization and de-normalization, cleansing, and performance optimization.
  • Extensive experience with the ETL tool Informatica PowerCenter: Workflow Manager, Workflow Monitor, Source Analyzer, Transformation Developer, Mapplet Designer, and Mapping Designer.
  • Extensively used the Teradata FastLoad, MultiLoad, and FastExport utilities with Informatica for high-volume data transformation.
  • Expertise in writing complex BTEQ scripts to handle data load and transformation logic.
  • In-depth knowledge of data warehouse architecture and of designing star schemas, snowflake schemas, fact and dimension tables, and physical and logical data models using Erwin.
  • Informatica performance tuning activities: partitioning, analyzing thread statistics, indexing, aggregations, and pushdown optimization.
  • Designed and developed CDC mechanisms in Informatica using checksums such as MD5, CRC32, and hash keys (a minimal sketch follows this list).
  • Well versed in Agile process methodologies: Scrum, iteration planning, development, story card dashboards, and retrospective tracking.
  • Expertise in working with PowerExchange for VSAM and COBOL copybooks, as well as WSDL components, web services, and SOAP.
  • Good working knowledge of complex SQL tuning using explain plans, database hints, access paths, indexes, and partitions.
  • Skilled in supporting production schedules using various scheduling tools: Tidal, ESP, and IBM Tivoli Maestro.
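
The checksum-based CDC approach mentioned above can be illustrated with a short sketch. This is a minimal example only; the key and tracked column names are assumptions, not taken from any specific project.

    # Illustrative MD5-checksum change detection (CDC) sketch.
    # Key and tracked column names below are hypothetical.
    import hashlib
    import pandas as pd

    KEY_COLS = ["customer_id"]                 # assumed natural key
    TRACKED_COLS = ["name", "city", "status"]  # assumed non-key columns to compare

    def add_row_hash(df: pd.DataFrame) -> pd.DataFrame:
        """Concatenate the tracked columns and store an MD5 digest per row."""
        concat = df[TRACKED_COLS].astype(str).agg("|".join, axis=1)
        df["row_hash"] = concat.map(lambda s: hashlib.md5(s.encode("utf-8")).hexdigest())
        return df

    def detect_changes(incoming: pd.DataFrame, current: pd.DataFrame) -> pd.DataFrame:
        """Return incoming rows whose hash is new or differs from the current target hash."""
        incoming = add_row_hash(incoming)
        current = add_row_hash(current)
        merged = incoming.merge(current[KEY_COLS + ["row_hash"]],
                                on=KEY_COLS, how="left", suffixes=("", "_tgt"))
        return merged[merged["row_hash"] != merged["row_hash_tgt"]]

Rows returned by detect_changes feed the insert/update branch of the load; unchanged rows are skipped.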

TECHNICAL SKILLS

Big Data Technologies: Hadoop, HIVE, HQL, Sqoop, Python

Scripting: Python, UNIX Shell Scripting, JavaScript, SQL, Java

ETL: Informatica (Power Center 9.x/8.x)

OLAP/Reporting: OBIEE, Crystal Reports, Business Objects XI

File Formats: Parquet, AVRO, JSON, ORC, CSV and Text

Data Modeling: Erwin, MS Visio

Operating Systems: Windows NT/ 2000/ XP, UNIX, Shell, Solaris 8.0

Database: Teradata, Netezza, Oracle 11g, R12, SQL Server

Cloud: Salesforce

Database Tools: TOAD, PL/SQL Developer, Teradata SQL Assistant

Version Control: IBM Rational ClearQuest, ClearCase

Schedulers: Control-M, ESP, Maestro, Kintana, AutoSys

PROFESSIONAL EXPERIENCE

Confidential

Hadoop Developer

Environment: Informatica, HDP, HDFS, Hive, Sqoop, Python, Netezza, Oracle, UNIX, Tidal, JIRA

Responsibilities:

  • Performed architectural ETL framework design and development to load CLAIM objects.
  • Wrote Python scripts calling REST APIs to bring in claims-processing data from multiple source systems, including Facets and Magellan, and created an ETL pipeline to store it in the Hive database (see the sketch after this list).
  • Built data pipelines using Python, HDFS, Hive, and shell scripting.
  • Developed complex HQL scripts to load data into Hive refined-layer tables.
  • Ingested large volumes of claims data into Hadoop using the Parquet storage format.
  • Imported data from MySQL to HDFS on a regular basis using Sqoop.
  • Delivered hands-on performance improvements using Hive partitioning and bucketing and by tuning different types of joins on Hive tables.
  • Developed a control-table framework using Hive and shell for load statistics.
  • Implemented Hive SerDes such as JSON and Avro.
  • Created custom UDFs and UDAFs for Hive data processing.
  • Developed Python scripts using pandas to perform read/write operations on CSV files.
  • Extracted JSON data with Python, cleaned it, and converted it to CSV files.
  • Built various Informatica ETL mappings to meet business requirements.
  • Worked on various transformations such as Transaction Control, Normalizer, Joiner, Lookup, Router, Sequence Generator, and Update Strategy to support enhancement requirements.
  • Performed production support activities: root cause analysis (RCA) and fixing Informatica mappings within SLAs.
  • Followed Agile process methodologies: Scrum, sprint planning, development, story card dashboards, and retrospective tracking in JIRA.
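
A simplified sketch of the REST-to-Hive staging pattern referenced in the claims bullet above. The endpoint URL, token, and output path are hypothetical placeholders, not the actual Facets or Magellan interfaces.

    # Illustrative sketch: pull claims JSON from a REST API, flatten it with pandas,
    # and write a CSV that a downstream HQL script loads into a Hive staging table.
    # The URL, token, and paths are placeholders.
    import pandas as pd
    import requests

    API_URL = "https://example.com/claims/api/v1/claims"   # placeholder endpoint
    HEADERS = {"Authorization": "Bearer <token>"}           # placeholder credential

    def pull_claims(batch_date: str) -> pd.DataFrame:
        resp = requests.get(API_URL, headers=HEADERS,
                            params={"date": batch_date}, timeout=60)
        resp.raise_for_status()
        return pd.json_normalize(resp.json())   # flatten nested JSON into columns

    def write_staging_csv(df: pd.DataFrame, batch_date: str) -> str:
        out_path = f"/data/staging/claims/claims_{batch_date}.csv"   # placeholder path
        df.to_csv(out_path, index=False)
        return out_path

    if __name__ == "__main__":
        frame = pull_claims("2020-01-31")
        print("wrote", write_staging_csv(frame, "2020-01-31"), "with", len(frame), "rows")

From there, a LOAD DATA statement or an external table defined over the staging directory exposes the file to the Hive refined layer.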

Confidential

ETL Developer

Environment: HDP, Hive, Sqoop, HBase, Spark, Informatica, Oracle EBS

Responsibilities:

  • Developed Sqoop jobs to load data from Oracle EBS and SQL Server into Hadoop (a minimal example appears after this list).
  • Created Hive table partitions for Oracle customer billings and service agreements.
  • Offloaded data warehouse data to the Hadoop cluster.
  • Made Python API calls to bring in SFDC data: Account, Quotes, Campaign, Opportunity, and Adjustments.
  • Worked with Hive external and internal tables, partitioning, bucketing, UDFs, and complex query patterns.
  • Performed Informatica performance tuning activities: partitioning, analyzing thread statistics, indexing, aggregations, and pushdown optimization.
  • Designed and developed the data validation and exception-handling mechanism.
  • Handled production support activities within SLAs.
  • Debugged Informatica mappings for performance and data issues.
  • Built data load handling mechanisms: complex data load scripts for bulk updates and deletes.
  • Worked with SQL queries, stored procedures, primary indexes, secondary indexes, and table partitions.
  • Produced periodic analytics for revenue transactions and revenue schedules.
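
A hedged example of the kind of Sqoop import the first bullet describes, wrapped in Python so a scheduler can call it; the JDBC URL, credentials file, and table names are placeholders.

    # Illustrative sketch: invoke a Sqoop import that lands an Oracle EBS billing
    # table directly in Hive. All connection details and names are placeholders.
    import subprocess

    sqoop_cmd = [
        "sqoop", "import",
        "--connect", "jdbc:oracle:thin:@//dbhost:1521/EBSPRD",  # placeholder JDBC URL
        "--username", "etl_user",
        "--password-file", "hdfs:///user/etl/.ora_pwd",          # placeholder secret file
        "--table", "RA_CUSTOMER_TRX_ALL",                        # example billing table
        "--hive-import",
        "--hive-database", "finance_stg",
        "--hive-table", "customer_billings",
        "--num-mappers", "4",
        "--split-by", "CUSTOMER_TRX_ID",
    ]

    # Fail loudly so the scheduler can flag the job if the import breaks.
    subprocess.run(sqoop_cmd, check=True)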

Confidential, Newark, CA

Sr. Informatica Developer

Environment: Informatica PowerCenter 9.x, Oracle EBS, Oracle 11g, Tableau

Responsibilities:

  • Designed the architecture to load data from various sources: Oracle EBS, IBM mainframe, SQL Server, Inventory, Zyme, and Vanguard.
  • Built various ETL mappings to meet business requirements.
  • Worked on various transformations such as Normalizer, Joiner, Lookup, Router, Sequence Generator, Transaction Control, and Update Strategy.
  • Worked with Agile process methodologies: Scrum, iteration planning, development, story card dashboards, and retrospective tracking.
  • Developed various mapplets for Cost ASP, COGS, MDM Customer, City, and Customer Status.
  • Implemented business and data quality validation mechanisms to notify users across regions.
  • Designed and developed the ETL mappings for data exception and reload processes.
  • Handled production support activities, resolving Quality Center tickets within SLA for the daily, weekly, and monthly process flows.
  • Worked with XSD and XML file generation through the ETL process.
  • Wrote a UNIX wrapper script to load continuously arriving staging files into the acquire layer (see the sketch below).
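
The wrapper logic in the last bullet, sketched here in Python rather than shell purely for illustration; the directory paths and the ".done" marker convention are assumptions.

    # Illustrative sketch of a wrapper that sweeps completed staging files into the
    # acquire layer. Paths and the ".done" marker convention are assumptions.
    import shutil
    from pathlib import Path

    STAGING_DIR = Path("/data/staging/inbound")   # placeholder
    ACQUIRE_DIR = Path("/data/acquire/inbound")   # placeholder

    def sweep_staging_files() -> int:
        moved = 0
        for marker in STAGING_DIR.glob("*.done"):   # marker means the file is fully written
            data_file = marker.with_suffix(".dat")
            if data_file.exists():
                shutil.move(str(data_file), str(ACQUIRE_DIR / data_file.name))
                marker.unlink()                     # clear the marker after the move
                moved += 1
        return moved

    if __name__ == "__main__":
        print(f"moved {sweep_staging_files()} file(s) to the acquire layer")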

Confidential

Sr. Informatica Developer

Environment: Informatica PowerCenter 9.1, Salesforce, IBM DB2, SQL Server, Teradata, UNIX

Responsibilities:

  • Responsible for the offshore/onshore team delivering the data movement solution to Salesforce.
  • Coordinated with the business solution architect team and designed the ETL architecture and common component framework.
  • Established specific technical solutions and led the efforts spanning Informatica PowerCenter, the Salesforce API, IBM DB2, and flat files.
  • Worked on VSAM and COBOL sources using PowerExchange and SOAP services.
  • Developed macro and micro design documents for each layer of the architecture, such as the abstract layer and the business layer.
  • Created various documents: technical design, micro design, peer review templates, testing, and code review.
  • Developed various loading strategies to load the objects into Teradata.
  • Applied data loading strategies using the Teradata FastLoad, MultiLoad, and FastExport utilities.
  • Produced architectural designs for the Account object load methodology and developed code using PowerCenter, UNIX, and curl scripts (see the sketch after this list).
  • Developed various complex BTEQ scripts to handle data load logic.
  • Designed and developed the ETL component using worklets to best fit the common component architecture.
  • Loaded data using the Salesforce Lookup, Merge, and Picklist transformations.
  • Developed PowerCenter, UNIX, and database scripts to handle loads of large source data files.
  • Built UAT and PROD migration strategies as well as the production job run plan using the Control-M scheduler.
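
A simplified sketch of the Account load call pattern, using Python requests in place of the curl scripts; the instance URL, API version, external ID field, and token are placeholder assumptions.

    # Illustrative sketch: upsert an Account record through the Salesforce REST API.
    # Instance URL, API version, external ID field, and token are placeholders.
    import requests

    INSTANCE = "https://example.my.salesforce.com"   # placeholder org
    TOKEN = "<oauth-access-token>"                   # placeholder credential
    EXTERNAL_ID_FIELD = "Legacy_Account_Id__c"       # assumed external ID field

    def upsert_account(legacy_id: str, fields: dict) -> int:
        url = (f"{INSTANCE}/services/data/v52.0/sobjects/Account/"
               f"{EXTERNAL_ID_FIELD}/{legacy_id}")
        resp = requests.patch(url, json=fields,
                              headers={"Authorization": f"Bearer {TOKEN}"})
        resp.raise_for_status()
        return resp.status_code   # 201 = created, 204 = updated

    if __name__ == "__main__":
        print(upsert_account("ACC-1001", {"Name": "Example Corp", "BillingCity": "Newark"}))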

Confidential

Informatica Developer

Environment: Informatica 8.6.1, SQL Server 2005, Flat Files, Cognos, Windows

Responsibilities:

  • Understood the requirement specifications and designed the ETL mappings.
  • Configured sessions in Workflow Manager.
  • Performed unit testing of ETL mappings.
  • Created documentation for all reports.
  • Implemented a data reconciliation process across different data sources (a minimal sketch follows this list).
  • Integrated data from different sources in the staging area using ETL processes such as data merging, data cleansing, and data aggregation.
  • Designed, developed, and implemented Informatica mappings and workflows.
  • Designed data source tables and target tables and imported the metadata definitions for the data warehouse using the Informatica Designer.
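
A minimal sketch of the source-to-target reconciliation check mentioned above; the extracts and the amount column are hypothetical.

    # Illustrative row-count and amount-sum reconciliation between a source extract
    # and the warehouse target. Column names are assumptions.
    import pandas as pd

    def reconcile(source: pd.DataFrame, target: pd.DataFrame,
                  amount_col: str = "amount") -> dict:
        result = {
            "source_rows": len(source),
            "target_rows": len(target),
            "source_sum": round(float(source[amount_col].sum()), 2),
            "target_sum": round(float(target[amount_col].sum()), 2),
        }
        result["rows_match"] = result["source_rows"] == result["target_rows"]
        result["sums_match"] = result["source_sum"] == result["target_sum"]
        return result

    if __name__ == "__main__":
        src = pd.DataFrame({"amount": [100.0, 250.5, 75.25]})
        tgt = pd.DataFrame({"amount": [100.0, 250.5, 75.25]})
        print(reconcile(src, tgt))   # both checks come back True when the extracts agree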

Confidential

ETL Developer

Responsibilities:

  • Worked closely with users to gather functional and technical requirements and conduct UAT testing.
  • Designed and analyzed fact and dimension tables using star schema dimensional modeling.
  • Worked with PowerExchange for COBOL copybooks.
  • Responsible for integrating various data sources such as Oracle, SQL Server, DB2, and flat files into the staging area.
  • Implemented slowly changing dimension logic as part of the star schema implementation (see the sketch after this list).
  • Performed data manipulations using various Informatica transformations such as Joiner, Expression, Lookup, Aggregator, Filter, and Sequence Generator.
  • Utilized the Designer tools, including Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and transformations.
  • Deployed new strategies for checksum calculations and exception population using mapplets and Normalizer transformations.
  • Loaded data through Informatica integration with Teradata FastLoad and MultiLoad.
  • Wrote BTEQ scripts to handle data in SET and MULTISET tables.
  • Performed object migrations across environments (SIT, UAT, PROD) using the version control tools ClearQuest and ClearCase.
  • Coordinated with other teams (source systems, DBA, job scheduling, production) in setting up the environments.
  • Developed JSP pages, servlets, and HTML pages per requirements.
  • Responsible for the maintenance of code in different environments.
  • Responsible for fixing several change requests and issues.
  • Used object-oriented methodology for Java application design.
  • Involved in developing unit test cases, test plans, and strategies, as well as their design, development, and execution.
  • Used the FindBugs tool to detect bugs in Java code.
  • Initiated, reviewed, facilitated, and recorded several reviews for different issues.
  • Developed the necessary JavaBeans and PL/SQL procedures to implement business rules.
  • Used JDBC to provide connectivity to database tables in Oracle.
  • Responsible for the design and development of Income using Java, Servlets, JSP, JavaBeans, MPattern, JDBC, Oracle PL/SQL, connection pooling using data sources, the Tomcat web server, JavaScript, DHTML, and HTML on a UNIX server, with Dreamweaver.
  • Developed portal screens and code designs for different company portal modules.
  • Actively participated in unit testing, integration testing, and acceptance testing for the application.
  • Handled configuration and source code changes with CVS.
  • Developed SQL, stored procedures, and subqueries.
  • Used JDBC connections to access the database.
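
A compact sketch of the Type 2 slowly-changing-dimension logic referenced in the star schema bullet above; the key, tracked attributes, and audit columns are illustrative assumptions.

    # Illustrative Type 2 SCD sketch: expire the current dimension row when a tracked
    # attribute changes and insert a new current row. Column names are assumptions.
    from datetime import date
    import pandas as pd

    KEY = "customer_id"
    TRACKED = ["city", "status"]

    def apply_scd2(dim: pd.DataFrame, incoming: pd.DataFrame, load_dt: date) -> pd.DataFrame:
        current = dim[dim["current_flag"] == "Y"]
        merged = incoming.merge(current, on=KEY, how="left", suffixes=("", "_dim"))
        changed = merged[
            merged["current_flag"].notna()
            & merged[TRACKED].ne(merged[[c + "_dim" for c in TRACKED]].values).any(axis=1)
        ]
        # Expire the existing current versions of the changed keys.
        expire_keys = set(changed[KEY])
        dim.loc[dim[KEY].isin(expire_keys) & (dim["current_flag"] == "Y"),
                ["current_flag", "end_date"]] = ["N", load_dt]
        # Insert new current rows for changed keys and brand-new keys.
        new_keys = set(merged.loc[merged["current_flag"].isna(), KEY]) | expire_keys
        new_rows = incoming[incoming[KEY].isin(new_keys)].copy()
        new_rows["start_date"], new_rows["end_date"], new_rows["current_flag"] = load_dt, None, "Y"
        return pd.concat([dim, new_rows], ignore_index=True)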
