
Senior Informatica Developer Resume

Indianapolis, Indiana

PROFESSIONAL SUMMARY:

  • Around 7 years of experience developing various projects using data warehousing tools such as Informatica and databases including Oracle, SQL Server, Teradata, and DB2 UDB.
  • Created ETL mappings using Informatica Power Center to move data from multiple sources such as XML, DB2, Teradata, MS SQL Server, flat files, and Oracle into common target areas such as data marts and the data warehouse.
  • Expertise in building Enterprise Data Warehouses (EDW), Operational Data Stores (ODS), Data Marts, and Decision Support Systems (DSS) using the Erwin data modeling tool and dimensional modeling techniques (Kimball and Inmon), including Star and Snowflake schemas and Slowly Changing Dimensions (SCDs).
  • Highly motivated and goal-oriented individual with a strong background in SDLC project management and resource planning using Agile methodologies.
  • Exported data from UNIX to the mainframe for backup.
  • Over 2 years of hands-on experience in Informatica Data Quality (IDQ).
  • Extensive knowledge of the development life cycle, from requirements gathering through deployment of code to production, and good knowledge of dimensional modeling.
  • Implemented performance tuning methods to optimize developed mappings.
  • Extensive experience in Production support management involving Informatica, Oracle, UNIX and Mainframes.
  • Significant experience in reporting using MS Excel, Easytrieve, and Business Objects.
  • Hands-on experience with scheduling tools such as Control-M and AutoSys.
  • Good understanding of and knowledge about business intelligence tools such as Cognos.
  • Experienced in Erwin data modeling, star-schema design, logical data modeling, normalization, physical design, data analysis, and report design.
  • Ability to communicate requirements effectively to team members and manage applications. Extensive experience in providing 24/7 on-call support.
  • Used JDF, a job messaging format based on XML over HTTP.
  • Expertise in tuning user queries and frequently executed SQL operations to improve performance.
  • Worked with tools and utilities such as MultiLoad, FastLoad, TPump, TOAD, crontab, SQL, VSS, IDE, Putty, WinSCP, BTEQ, and SAP PowerConnect.
  • Used BI techniques for strategic decision making.
  • Used the BI Server, BI Publisher, and BI interactive dashboards, components of Oracle Business Intelligence Enterprise Edition (OBIEE), for a common enterprise business model and enterprise reporting.

TECHNICAL SKILLS:

Operating Systems: Windows NT/98/2000/2003/2005/2008, SUSE Linux 9.0/10.0/11.0, UNIX, AIX 5.3/6.1/7.0.

Languages: PL/SQL, SQL, UNIX Scripting, Excel VB Scripting, XML, WSDL, EasyTrieve

Scripting: Unix shell, Excel VB

Databases: Oracle 11g/10g/9i/8i/7.x, DB2 UDB 8.1, Teradata, SQL Server (T-SQL, SSIS), VSAM, and MS Access

ETL Tools: Informatica Power Center 9.6/9.5.1/8.x/7.x/6.x, IDQ 9.6.1/9.5.1

Database Utilities: Oracle PL/SQL Developer, FTP client, Business Objects Supervisor, MQ Series, BTEQ, FastLoad, MultiLoad, FastExport, TPump, SQL*Plus, BI with OBIEE, Business Objects.

Testing Area: GUI Testing, Integration Testing, Stress Testing and Load Testing

Other Tools: Tivoli Maestro, SQL Navigator, TOAD, Teradata SQL Assistant, Quality Center, MS Office suite, SharePoint

PROFESSIONAL EXPERIENCE:

Confidential, Indianapolis, Indiana

Senior Informatica developer

Responsibilities:

  • Worked on different source systems (IMS, AS400, Echos, and Pure) to load data into the SQL Server database, and created test cases for data validation.
  • Dealt with huge volumes of data and thousands of tables from the source system, systematically validating and loading them into the different staging layers (S0, S1, and S2).
  • Wrote Python code to create the mappings, sessions, and parameter files for the Informatica jobs and to generate the corresponding XML (see the import sketch after this list).
  • Imported the sources, targets, mappings, sessions, and workflows into the repository from the XML created by the Python script.
  • Worked on a batch script to connect to the repository, automate the import process, and create the list files.
  • Validated the data using SQL queries for basic count validation from source to stage 0, stage 1, and stage 2 (see the count-check sketch after this list).
  • Because the data is fed to the MDM application, worked on the views provided, converted them into Informatica mappings, and validated the results accordingly.
  • Identified mapping bottlenecks and improved session performance through error handling.
  • Worked on Mapplets and created parameter files wherever necessary to facilitate reusability.
  • Tuned and optimized mappings to reduce ETL run time, ensuring they ran within the designated load window.
  • Prepared the Standard Operating Procedure (Knowledge Transfer) document, which provides necessary information, required for the Maintenance and Operation of the application.
  • Performed code migration of mappings and workflows from development to test and production servers through deployment groups for the DEV, TEST, and PROD repositories, retaining shortcuts, dependencies, and versioning.
  • Worked with UNIX commands and used UNIX shell scripting to automate jobs.
  • Used pmrep to connect to the repository and import the objects generated by the Python code.
  • Worked in an Agile environment to meet deliverables and provide quality work to the client, communicating effectively and taking the initiative to troubleshoot when faced with challenges.
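
A minimal sketch of the import automation described above, assuming hypothetical repository/domain names and file paths; pmrep's connect and objectimport commands are standard, but the control file contents are project-specific:

```python
# Minimal sketch: automate an Informatica repository import with pmrep.
# Repository name, domain, user, and file paths are placeholders; the
# password is read from the PMPASS environment variable via -X.
import subprocess

REPO, DOMAIN, USER = "DEV_REP", "Domain_Dev", "etl_admin"

def run(cmd):
    """Run a pmrep command, echoing it and failing on a non-zero exit."""
    print(">", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Connect once; pmrep reuses the connection for subsequent commands.
run(["pmrep", "connect", "-r", REPO, "-d", DOMAIN, "-n", USER, "-X", "PMPASS"])

# Import the XML generated by the Python mapping generator, driven by a
# control file that maps source folders/objects to target folders.
run(["pmrep", "objectimport",
     "-i", "generated/mappings.xml",        # generated object definitions
     "-c", "generated/import_control.xml"]) # import control file
```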
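
And a minimal sketch of the source-to-stage count validation, assuming pyodbc, placeholder DSNs, and illustrative table names for the S0/S1/S2 layers:

```python
# Minimal sketch: source-to-stage row-count validation. The pyodbc DSNs
# and the table names for the staging layers are illustrative only.
import pyodbc

def row_count(conn, table):
    """Return SELECT COUNT(*) for a table on the given connection."""
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]

src = pyodbc.connect("DSN=SourceDB")  # source system
stg = pyodbc.connect("DSN=StageDB")   # staging database

# Each check compares a layer against the layer it feeds.
checks = [
    (src, "dbo.customer", stg, "s0.customer"),
    (stg, "s0.customer",  stg, "s1.customer"),
    (stg, "s1.customer",  stg, "s2.customer"),
]

for conn_a, tbl_a, conn_b, tbl_b in checks:
    a, b = row_count(conn_a, tbl_a), row_count(conn_b, tbl_b)
    print(f"{tbl_a} -> {tbl_b}: {a} vs {b} [{'OK' if a == b else 'MISMATCH'}]")
```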

Environment: Informatica Power Center 10.2, Oracle 12, Microsoft SQL Server Management Studio, Flat Files, UNIX Shell Scripting, TOAD, SQL, MDM 10, IDE, IDQ, Putty, Spyder (Python 3.7)

Confidential, Phoenix, AZ

IT Business Intelligence Analyst/Informatica Developer

Responsibilities:

  • Designed source-to-target mapping documentation according to the business requirements.
  • Extensively developed various mappings to perform extraction, transformation, and loading of source data according to the business logic.
  • Implemented Slowly Changing Dimensions Type 1 and Type 2 according to the business requirements and the nature of the source data (see the SCD Type 2 sketch after this list).
  • Extensively implemented reusable objects such as transformations and mapplets for use in other mappings, and created reusable Informatica user-defined functions.
  • Implemented PL/SQL stored procedures to process and purge data according to the business rules.
  • Used error handling to capture error data into tables for handling and analyzing bad data.
  • Used the Web Service transformation to retrieve deals data and solar data and to create XML files.
  • Worked on the HTTP transformation to retrieve data from REST services.
  • Generated XSDs from XML files and used them to process XML files and load data into stage tables.
  • Created workflow looping logic to pull all the data from the source system.
  • Provided production support and closed priority-one tickets within the SLA.
  • Performed administration tasks: code deployment from one environment to another, adding users and granting access to Informatica PowerCenter tools, and syncing LDAP in the Informatica Admin Console.
  • Involved in writing queries on Informatica metadata to fetch workflow names, sessions, and connection information (see the metadata query sketch after this list).
  • Used the Debugger for troubleshooting mappings and fixed bugs during testing.
  • Extensively created batch scripts to create list files and move the source files to an archive directory after a successful load.
  • Analyzed old BI reports that were pulled using SQL and converted them into Informatica data extraction mappings; worked with connected and unconnected Lookup transformations and configured them to implement complex logic.
  • Worked with various lookup caches, such as dynamic cache, static cache, persistent cache, re-cache from database, and shared cache.
  • Created custom Denodo views by joining tables from multiple data sources.
  • Fine-tuned and optimized the performance of complex views in Denodo.
  • Ensured quality standards and improved the overall quality of deliverables.
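
A minimal sketch of the SCD Type 2 pattern referenced above, expressed as SQL issued from Python; the pyodbc DSN and the dim_customer/stg_customer tables and columns are illustrative assumptions, not the project's actual schema:

```python
# Minimal sketch of the SCD Type 2 pattern: expire the current dimension
# row when a tracked attribute changes, then insert a new current version.
# DSN, dim_customer, stg_customer, and their columns are illustrative.
import pyodbc

conn = pyodbc.connect("DSN=WarehouseDB")
cur = conn.cursor()

# Step 1: close out current rows whose tracked attributes have changed.
cur.execute("""
    UPDATE d
    SET d.eff_end_date = GETDATE(), d.current_flag = 'N'
    FROM dim_customer d
    JOIN stg_customer s ON s.customer_id = d.customer_id
    WHERE d.current_flag = 'Y'
      AND (s.name <> d.name OR s.address <> d.address)
""")

# Step 2: insert a new current version for changed keys (expired above)
# and for brand-new keys, i.e. any key with no current row remaining.
cur.execute("""
    INSERT INTO dim_customer (customer_id, name, address,
                              eff_start_date, eff_end_date, current_flag)
    SELECT s.customer_id, s.name, s.address, GETDATE(), NULL, 'Y'
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.current_flag = 'Y'
    WHERE d.customer_id IS NULL
""")

conn.commit()
```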
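
And a minimal sketch of an Informatica metadata query, assuming the repository's MX views (REP_WORKFLOWS here) are exposed in the repository database; view and column names can vary by PowerCenter version, and the DSN is a placeholder:

```python
# Minimal sketch: query repository metadata through an MX view. The DSN
# is a placeholder, and MX view/column names can vary by version.
import pyodbc

repo = pyodbc.connect("DSN=InfaRepoDB")
cur = repo.cursor()
cur.execute("""
    SELECT subject_area, workflow_name
    FROM rep_workflows
    ORDER BY subject_area, workflow_name
""")
for folder, workflow in cur.fetchall():
    print(f"{folder}: {workflow}")
```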

Environment: Informatica Power Center 10.1.1 HotFix/9.6.1, SQL, PL/SQL, SQL Server 2008, Oracle 11g, Oracle 12, Flat Files, Microsoft Excel, Cognos 10, ServiceNow, GIS, PI, Control-M, PostgreSQL, Denodo, Rapid SQL.

Confidential

Informatica Developer

Responsibilities:

  • Implemented a standardized process to archive flat files and load indicator files.
  • Created proper Primary Indexes (PIs), taking into consideration both planned access of the data and even distribution of the data across all the available AMPs (see the skew-check sketch after this list).
  • Implemented Slowly Changing Dimensions methodology to keep track of historical data.
  • Tuned user queries and frequently executed SQL operations to improve the performance of the system.
  • Interacted with people from various teams in the project, such as Oracle/Teradata DBAs, Maestro schedulers, spec creators, reporting, and UNIX, to aid in the smooth functioning of the project flow.
  • Developed data marts using the Cognos PowerPlay and Transformer OLAP tools.
  • Exported data from UNIX to the mainframe for backup.
  • Developed triggers to handle history and audit data for the production environment.
  • Created understanding documentation and release testing documentation for the tickets handled, and documented the issues found in a central repository.
  • Involved in performance testing of all the workflows/mappings, which helped in analyzing their performance before moving into production.
  • Extracted, transformed, and loaded OLTP data into the staging area and data warehouse using Informatica mappings and complex transformations (Aggregator, Joiner, Lookup, Normalizer).
  • Involved in code reviews of the work prepared by the teams.
  • Analyzed production support documents and found feasible times to run the jobs.
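
A minimal sketch of the kind of AMP-distribution check that informs PI design, using Teradata's HASHROW/HASHBUCKET/HASHAMP functions; the teradatasql driver, connection details, and the sales.orders/order_id names are assumptions for illustration:

```python
# Minimal sketch: check row distribution across AMPs for a candidate PI
# column using Teradata's hashing functions. Credentials and the
# table/column names are placeholders.
import teradatasql

with teradatasql.connect(host="tdhost", user="etl", password="...") as con:
    cur = con.cursor()
    cur.execute("""
        SELECT HASHAMP(HASHBUCKET(HASHROW(order_id))) AS amp_no,
               COUNT(*) AS row_cnt
        FROM sales.orders
        GROUP BY 1
        ORDER BY row_cnt DESC
    """)
    rows = cur.fetchall()
    total = sum(r[1] for r in rows)
    # A badly skewed PI shows one AMP holding far more than its share.
    for amp_no, row_cnt in rows[:5]:
        print(f"AMP {amp_no}: {row_cnt} rows ({100.0 * row_cnt / total:.1f}%)")
```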

Environment: Informatica 7.1, SQL tools, Teradata, BTEQ, PL/SQL, Business Objects, UNIX, Windows XP.

Confidential

Informatica Developer

Responsibilities:

  • Reviewed technical specs, determined the design, and recommended options and approaches.
  • Translated high-level requirements into efficient ETL processes.
  • Mapped the source and target databases by studying the specifications and analyzing the required transformations.
  • Built reports according to user requirements.
  • Extracted data from Oracle and SQL Server for data warehousing.
  • Worked extensively with the connected lookup Transformations using dynamic cache.
  • Analyzed the data based on requirements, wrote the techno-functional documentation, and developed complex mappings using Informatica Data Quality (IDQ) 9.5.1 Developer to remove noise from the data using Parser, Labeler, Standardization, Merge, Match, Case Conversion, Consolidation, Address Validation, Key Generator, Lookup, Decision, and other transformations, then performed unit testing for accuracy of the data.
  • Used Visual Basic 6.0 to create databases and to develop quick, simple utility tools.
  • Wrote and used UNIX shell scripts extensively for scheduling and pre-/post-session management.
  • Employed Informatica Data Quality (IDQ) to help identify data issues and implemented cleansing procedures for existing interfaces/migrations (see the illustrative cleansing sketch after this list).
  • Used Informatica B2B Data Transformation to read unstructured and semi-structured data and load it to the target.
  • Worked with the QlikView Governance Dashboard (QVGD) to scan QlikView deployments and display various metrics about them.
  • Involved in peer-to-peer reviews.
  • Used Proactive Monitoring, which alerts when it finds any data quality issues.
  • Troubleshot issues by checking session and workflow logs.
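
IDQ cleansing rules are configured in the Developer tool rather than hand-coded, so the following is only an illustrative plain-Python equivalent of one simple standardization rule (trim, upper-case, and expand street-type abbreviations), with all names and sample data hypothetical:

```python
# Illustrative plain-Python stand-in for a simple IDQ standardization
# rule: trim, upper-case, and expand common street-type abbreviations.
# The abbreviation map and sample addresses are hypothetical.
STREET_TYPES = {"ST": "STREET", "AVE": "AVENUE", "RD": "ROAD", "BLVD": "BOULEVARD"}

def standardize_address(raw: str) -> str:
    """Return a trimmed, upper-cased address with street types expanded."""
    tokens = raw.strip().upper().split()
    return " ".join(STREET_TYPES.get(t.rstrip("."), t.rstrip(".")) for t in tokens)

if __name__ == "__main__":
    for addr in ["12 main st.", "  99 Ocean Blvd", "7 Elm Ave."]:
        print(standardize_address(addr))
    # -> 12 MAIN STREET / 99 OCEAN BOULEVARD / 7 ELM AVENUE
```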

Environment: Power Center 7.1, UNIX scripting, MS SQL Server
