
Senior Informatica Developer/Analyst Resume


San Jose, CA

SUMMARY

  • 9+ years of experience with various databases and file systems, including VSAM, IMS DB, DATACOM, Oracle 10g, DB2 UDB 8.0, and MySQL 5.2. Strong applied knowledge of SQL and PL/SQL.
  • 6+ years of experience in data warehousing, data integration, and application Information Lifecycle Management (ILM) projects involving Informatica, IBM Optim, and IBM InfoSphere DataStage.
  • Strong experience in Extraction/Transformation/Loading (ETL) design and implementation, including requirements gathering, business rules analysis, source-to-target mapping design, and various performance tuning techniques.
  • Worked on complex parsing of semi-structured and unstructured data using Informatica B2B parsing techniques.
  • Involved in data modeling; generated physical and logical data models for complex legacy de-normalized structures to facilitate data integration.
  • Involved in CONTROL-M job development for newly created Informatica workflows.
  • Extensive experience supporting and maintaining mainframe applications in an IBM mainframe environment.
  • 8+ years of experience in the American healthcare industry, with involvement in various industry-level initiatives and assessments. Certified FAHM from AHIP; completed the following certifications to achieve the fellowship: AHM250 - Basic Concepts in Healthcare, AHM510 - Governance and Regulation, AHM520 - Risk Management, AHM530 - Network Management, and AHM540 - Medical Management in Health Plans.
  • 2+ years of experience working with J2EE middleware/backend technologies (Hibernate, Java, Servlets, JSP, Spring MVC) supporting client portals and extended connectivity using web services.
  • Exposure to and experience in architecting operational data stores (ODS), data modeling, and data engineering.
  • Ability to build and maintain strong working relationships with all levels of management, along with cross-functional teams of engineers, financial specialists, and sales support.
  • Strong leadership skills coupled with excellent verbal, written, interpersonal, and people-management skills.
  • Strong analytical and problem-solving skills; deadline- and team-oriented, comfortable working in a fast-paced environment.
  • Experience in mainframe technologies and exposure to CICS TS 4 SOA and mainframe application modernization technologies and concepts.

TECHNICAL SKILLS

Databases, DBMS/RDBMS: Oracle 10g/11g, DB2 UDB 8.0, VSAM, MySQL 5.2, CA-DATACOM, IMS-DB; SQL, PL/SQL

ETL/ILM/DM tools: IBM Optim (Data Growth Solution), Oracle HS ODBC, DB2 Express-C, IBM Optim Connect (Attunity), Informatica PowerCenter 8.6/9.1, IBM InfoSphere DataStage 8.5, Erwin 9.2

Programming Languages: SQL, Oracle PL/SQL, COBOL, CICS, CA-IDEAL, JCL, MQ Series, Assembler, CLIST

OLAP/Reporting: Cognos 7 Framework Manager, Cognos Report.net, OBIEE 10.1.3.3.

J2EE Technologies: JSP/Servlets, Hibernate, Spring, JAVA 5/6

Servers: CICS Transaction Server 4. Apache Tomcat 6.

Web services: MQ server, WSDL/ SOAP, REST

Development Tools: SQL*Plus, SQL Developer, Oracle TOAD 8.6; change management: Endevor; debugging: Abend-AID, DumpMaster; file analysis: InSync, File Master

Scheduler: AUTOSYS, Control-M, CA-7.

Project Management: Rational Portfolio Management, CA Clarity, Microsoft Project, OPAL for CMMI

Requirements Management: Rational Composer, Rational Unified Process, Microsoft Visio

BA/Requirements Mgmt: FP, COCOMO, Rational Method Composer, Rational Unified Process, business modeling, data analysis, Microsoft Visio, Microsoft Office (Word, Excel, PowerPoint)

PROFESSIONAL EXPERIENCE

Confidential, San Jose, CA

Senior Informatica Developer/Analyst

Responsibilities:

  • Participate in requirements analysis; validate and verify business/functional requirements with end users based on the set guidelines. Take part in requirements workshops, elicitation, and brainstorming sessions for selected work requests based on size and importance, ensuring quality at the source.
  • Participate in sprint planning and grooming sessions, estimate work effort in terms of Agile user stories, and provide technical solutions/proposals.
  • Discover and analyze structured, semi-structured, and unstructured legacy (mainframe) data sources from various platforms and perform initial data analysis for early identification of issues, risks, and dependencies.
  • Perform data modeling and create schemas and data models for complex legacy de-normalized data structures and MicroStrategy reporting/analytics needs, using Embarcadero ER/Studio Data Architect 9.0.
  • Streamline existing multi-layered client architectures and frameworks for data migration and provide enhancement proposals for better performance, looser code coupling, and dollar savings.
  • Re-engineer the Informatica code base to accommodate the system enhancements performed as part of the project, and develop automated batch frameworks to facilitate data movement.
  • Work with niche technologies (Informatica B2B DT, Informatica DVO, PowerExchange) and the latest Informatica releases to configure, architect, and provide technical design solutions for legacy (mainframe) data migration and data integration.
  • Use B2B DT to parse unstructured and semi-structured legacy data (reports).
  • Analyze legacy data sources (mainframe IMS, DB2, and VSAM), identify data elements, perform data modeling, and create data maps to facilitate data movement from legacy platforms to modernized, scalable relational data platforms with no information loss.
  • Create Informatica PowerCenter mappings and workflows that move data from legacy systems to modernized relational platforms, and provide design solutions for transformations from complex legacy data formats to modern relational data stores.
  • Perform validations using Informatica DVO and create scripts that automate the validation process for quality assurance (a simplified SQL sketch of the kind of checks automated appears after this list).
  • Conduct performance tuning of the workflows to reduce resource consumption and generate savings for the client.
  • Perform workflow dependency analysis and design/develop/validate Informatica jobs using CONTROL-M Desktop 7.0 / CONTROL-M EM.
  • Develop metrics/attributes for user-requested reports using MicroStrategy 9.3.
  • As senior programmer/analyst, provide technical guidance to the team on the niche technologies (Informatica B2B) used day to day, help technically groom the team for better productivity and quality, and deliver demo sessions on the technical processes to be followed.
  • Work with shell scripts and UNIX servers hosting multiple Informatica nodes.
  • Participate in release planning and the implementation approach, including implementation checkouts and process sequencing, and coordinate with cross-functional teams to implement solutions.
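
A minimal SQL sketch of the kind of source-to-target checks that the DVO test cases and validation scripts automate, assuming hypothetical LEGACY_STG_CLAIMS (staged legacy extract) and DW_CLAIMS (migrated target) tables:

    -- Illustrative only: hypothetical table and column names.
    -- 1. Row-count reconciliation between the staged legacy extract and the target.
    SELECT (SELECT COUNT(*) FROM legacy_stg_claims) AS src_rows,
           (SELECT COUNT(*) FROM dw_claims)         AS tgt_rows
    FROM   dual;

    -- 2. Content comparison: each MINUS should return zero rows when the
    --    migrated data matches the staged source exactly.
    SELECT claim_id, member_id, claim_amt FROM legacy_stg_claims
    MINUS
    SELECT claim_id, member_id, claim_amt FROM dw_claims;

    SELECT claim_id, member_id, claim_amt FROM dw_claims
    MINUS
    SELECT claim_id, member_id, claim_amt FROM legacy_stg_claims;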

Environment: Informatica PowerCenter 9.1, PowerExchange, Oracle 11g, Embarcadero ER/Studio DA 9.0, PL/SQL scripts, DB2 UDB 8.0, Oracle TOAD 9.7, MicroStrategy 9.3, CONTROL-M Desktop/EM 7.0, SQL Developer, Teradata

Confidential, St Louis, MO

ETL Developer/Analyst

Responsibilities:

  • Involved in gathering business requirements in liaison with business users and technical teams. Created requirement specification documents and identified data sources and targets.
  • Performed legacy mainframe data analysis and identified optimized solutions for data cleansing, including normalizing the legacy data structures.
  • Responsible for managing, monitoring, and validating the data extraction, transformation, movement, loading, cleansing, and update processes into the DW environment.
  • Responsible for testing and implementing innovative technical design solutions based on observed pain points within and outside Informatica domain boundaries.
  • Responsible for data modeling covering conceptual, logical, and physical database design and implementation for the staging and data mart databases, using Erwin 9.2 forward/reverse engineering with Oracle 10g.
  • Designed mappings, mapplets, and transformations using Informatica PowerCenter Designer. Utilized the Java, Normalizer, SQL, XML, and other available transformations to ensure data cleansing and data quality with performance enhancement.
  • Utilized the PowerCenter Source Analyzer to import XML sources, COBOL sources, flat files, and relational tables residing across multiple platforms (Sybase, Teradata, DB2, VSAM).
  • Responsible for performance tuning of the Informatica mappings using various components and techniques like parameter files, variables, dynamic cache, and other set procedures.
  • Involved in testing the development environment and performing data validation using SQL queries, ERwin, and SQL scripts.
  • Defined development and coding standards and procedures in Informatica, implemented technology improvements, and defined standards and procedures to ensure that the client's business goals and objectives are met by the proposed data warehouse architecture and design.
  • Assisted the team in design and development and performed peer reviews per the checklist.
  • Created users and folders and defined roles using Informatica Repository Manager.
  • Tuned the reporting tables by creating table partitioning, materialized views and logs, function-based indexes, and bit-mapped indexes, and by using hints and other Oracle 11g features (a simplified sketch follows this list).
  • Designed and developed efficient error handling methods and implemented them throughout the mappings.
  • Involved in data discovery and data analysis using Informatica Developer to identify the relevant data.
  • Provided strategic thinking and consulted the organization on ETL best practices based on detailed knowledge of industry trends.
  • Designed and developed tasks (sessions, commands) and workflows using the Workflow Manager tools, involving email generation, the command interface, and flagging options based on file availability.
  • Led the warehouse design and integration effort, including data sourcing and transformation.
  • Handled a team of 6 members; developed, mentored, and trained other ETL staff, providing day-to-day guidance and pointers for issue resolution.
  • Wrote UNIX shell scripts for pre/post-session commands and command tasks.
  • Worked in the Agile Scrum and RUP methodologies.
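
A minimal sketch (Oracle SQL, hypothetical RPT_CLAIM_FACT table and column names) of the style of 11g tuning applied to the reporting tables, covering partitioning, a materialized view with its log, function-based/bitmap indexes, and a hint:

    -- Illustrative only: hypothetical reporting table, range-partitioned so
    -- monthly loads and queries can prune partitions.
    CREATE TABLE rpt_claim_fact (
        claim_id     NUMBER,
        member_id    NUMBER,
        claim_status VARCHAR2(1),
        service_date DATE,
        paid_amt     NUMBER(12,2)
    )
    PARTITION BY RANGE (service_date) (
        PARTITION p_2011 VALUES LESS THAN (DATE '2012-01-01'),
        PARTITION p_2012 VALUES LESS THAN (DATE '2013-01-01'),
        PARTITION p_max  VALUES LESS THAN (MAXVALUE)
    );

    -- Function-based index for month-truncated filters; local bitmap index on
    -- the low-cardinality status column.
    CREATE INDEX rpt_claim_fact_fbi ON rpt_claim_fact (TRUNC(service_date, 'MM'));
    CREATE BITMAP INDEX rpt_claim_fact_bix ON rpt_claim_fact (claim_status) LOCAL;

    -- Materialized view log plus a pre-aggregated monthly summary for reports.
    CREATE MATERIALIZED VIEW LOG ON rpt_claim_fact
        WITH SEQUENCE, ROWID (service_date, paid_amt) INCLUDING NEW VALUES;

    CREATE MATERIALIZED VIEW mv_claim_monthly
        BUILD IMMEDIATE
        REFRESH FORCE ON DEMAND
    AS
    SELECT TRUNC(service_date, 'MM') AS claim_month,
           COUNT(*)                  AS claim_cnt,
           COUNT(paid_amt)           AS paid_cnt,
           SUM(paid_amt)             AS total_paid
    FROM   rpt_claim_fact
    GROUP  BY TRUNC(service_date, 'MM');

    -- Optimizer hint example: parallelize a full scan of the detail table.
    SELECT /*+ PARALLEL(f 4) */ COUNT(*) FROM rpt_claim_fact f;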

Environment: Informatica PowerCenter 9.1, PowerCenter Data Analyzer, Erwin 4.x, Oracle 11g, PL/SQL scripts, DB2 UDB 8.0, Oracle TOAD 9.7, Cognos 7 Framework Manager, Impromptu 7.x/6.0, PowerShell 1.0, Cognos 8 ReportNet, mainframe technologies (JCL, COBOL, MVS, CA-7/9, CICS TS 4.0, etc.), SQL Developer, MySQL 5.2.

Confidential, Salt Lake City, UT

Senior Application Developer/ TTL

Responsibilities:

  • Served as Senior Application Developer/TTL for the analysis, design, development, testing, and implementation of mainframe/J2EE batch applications for Confidential Blue Cross Blue Shield of Utah.
  • Analyzed the mainframe data components, inter- and intra-application interfaces, data structures, composition, supported business, and critical processes, and performed data profiling to derive data models for archival/migration: the conceptual model using MS Visio ER diagrams, and the logical and physical models using Attunity.
  • Developed and maintained the product architecture for application data archival/retrieval; the IBM data archival stack comprised IBM Optim, Oracle HS (Heterogeneous Services) ODBC, the IBM Optim Connect server, mainframe daemons, and extended mainframe data sources.
  • Mapped COBOL copybook data structures to the Attunity server data types, and extracted and transformed data from legacy, loosely organized file structures (IMS DB, DATACOM, VSAM, flat files) into third normal form (3NF) RDBMS schemas.
  • Transformed VSAM/IMS files into relational tables using Optim Connect and normalized the file structures (a simplified sketch of this normalization follows this list).
  • Created archival requests combining various data source models, scheduled jobs for regular archival, and maintained job request flows with performance tuning using the Optim tool.
  • Supported end-user connectivity with the Optim ODBC connection assistant and provided an end-user query management facility through the Oracle TOAD application.
  • Responsible for architecting the ODS for Optim data retrieval using ERwin, functionally integrating de-normalized legacy mainframe sources (CA-DATACOM tables) and flat files into an Oracle 10g database store.
  • Utilized Informatica PowerCenter 8.6 to ETL mainframe sources into the architected operational data store.
  • Planned validation techniques, generated use cases, and executed them using the Optim query facility to obtain business sign-off.
  • Coordinated daily with the client's cross-functional teams as part of project governance, information, and communication management; these teams (scheduling, contract management, project sponsors, mainframe system management) operated during US daytime, requiring thorough system and technical knowledge for effective communication and information flow.
  • Assisted the client analysts with project cost-benefit analysis and decision making.
  • Handled a team of 7 members working toward three different system plans undergoing decommissioning and assisted them with technical and project issues.
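
A minimal sketch (hypothetical copybook and table names) of how a de-normalized legacy record with a repeating group is mapped into the kind of normalized relational tables targeted by the Optim Connect and ETL work above:

    -- Illustrative only. Assumed legacy layout (COBOL copybook, shown as a comment):
    --   01  MEMBER-REC.
    --       05  MBR-ID            PIC 9(9).
    --       05  MBR-NAME          PIC X(30).
    --       05  MBR-PLAN-CD       PIC X(3).
    --       05  MBR-PHONES OCCURS 3 TIMES.
    --           10  PHONE-TYPE    PIC X(1).
    --           10  PHONE-NUM     PIC X(10).

    -- Parent table: one row per member; the OCCURS group is split out below.
    CREATE TABLE member (
        mbr_id   NUMBER(9)    PRIMARY KEY,
        mbr_name VARCHAR2(30) NOT NULL,
        plan_cd  VARCHAR2(3)
    );

    -- Child table: each occurrence of the repeating phone group becomes a row.
    CREATE TABLE member_phone (
        mbr_id     NUMBER(9)    NOT NULL REFERENCES member (mbr_id),
        phone_type VARCHAR2(1),
        phone_num  VARCHAR2(10),
        CONSTRAINT member_phone_pk PRIMARY KEY (mbr_id, phone_type, phone_num)
    );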

Environment: IBM Optim, Optim Connect, Oracle HS-ODBC, IBM mainframes, DB2, CA-Datacom, VSAM, flat files, Oracle 10g, Oracle TOAD 8.6.

Confidential, LA, CA

Senior ETL Developer

Responsibilities:

  • Involved in gathering business requirements in liaison with business users and technical teams. Created requirement specification documents and identified data sources and targets.
  • Used relational sources and flat files to populate the data mart. Translated the business processes into Informatica mappings for building the data mart.
  • Coordinated with the offshore team for reviewing the ETL code and provided development support.
  • Worked with various Informatica Server and Client tools like Designer, Workflow Manager, Workflow Monitor and Repository manager.
  • Created complex Informatica mappings to load the data mart and monitored them. The mappings involved extensive use of transformations like Aggregator, Filter, Router, Expression, Joiner and Sequence generator.
  • Used Normalizer transformation for importing and working with COBOL files.
  • Used Lookup transformations to access data from tables that are not sources for the mapping, and used unconnected Lookups to improve performance.
  • Used the Update Strategy transformation and configured the mappings to handle updates while preserving existing records.
  • Created tasks, sessions, worklets, and workflows in the Workflow Manager and gathered status updates using the Workflow Monitor.
  • Involved in dealing with performance issues at various levels such as target, sessions, mappings and sources.
  • Involved in fixing invalid Mappings and performed Unit testing and Integration Testing.
  • Created PL/SQL stored procedures to perform data validation on the business data (a simplified sketch follows this list).
  • Developed UNIX shell scripts to automate the data warehouse loading.
  • Responsible for performance tuning of the Informatica mappings using various components and techniques like parameter files, variables, dynamic cache, and other set procedures.
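
A minimal PL/SQL sketch of the general shape of those validation procedures, assuming hypothetical STG_ORDERS, DM_ORDERS, and ETL_AUDIT_LOG tables:

    -- Illustrative only: hypothetical tables and columns.
    CREATE OR REPLACE PROCEDURE validate_order_load (p_batch_id IN NUMBER) AS
        v_src_cnt   NUMBER;
        v_tgt_cnt   NUMBER;
        v_null_keys NUMBER;
    BEGIN
        -- Reconcile row counts between the staged source and the loaded data mart.
        SELECT COUNT(*) INTO v_src_cnt FROM stg_orders WHERE batch_id = p_batch_id;
        SELECT COUNT(*) INTO v_tgt_cnt FROM dm_orders  WHERE batch_id = p_batch_id;

        -- Business-rule check: no loaded row may be missing its natural key.
        SELECT COUNT(*) INTO v_null_keys
        FROM   dm_orders
        WHERE  batch_id = p_batch_id AND order_nbr IS NULL;

        -- Record the outcome, then fail the load if any check did not pass.
        INSERT INTO etl_audit_log (batch_id, src_rows, tgt_rows, null_keys, run_ts)
        VALUES (p_batch_id, v_src_cnt, v_tgt_cnt, v_null_keys, SYSDATE);
        COMMIT;

        IF v_src_cnt <> v_tgt_cnt OR v_null_keys > 0 THEN
            RAISE_APPLICATION_ERROR(-20001,
                'Validation failed for batch ' || p_batch_id);
        END IF;
    END validate_order_load;
    /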

Environment: Informatica PowerCenter 8.1/8.5, OBIEE 10.1.3.3, Oracle 9i/10g, PL/SQL, TOAD 8.6, Embarcadero Data Modeler, SQL*Plus, UNIX, Windows XP, Mainframes, Java 6.

Confidential, Salt Lake City, UT

Application Developer

Responsibilities:

  • Responsible for work item status reporting and monitoring, and for providing timely updates to the customer; generated status reports.
  • Performed as onsite coordinator in an onshore/offshore model and was directly involved in early issue identification/mitigation, business value-adds, and thought leadership.
  • Involved in ongoing healthcare projects for ITS and HIPAA Alt-ID, with extensive exposure to HIPAA and EDI transactions for remittance, claims, membership inquiries, etc.
  • Developed ETL stored procedures using PL/SQL, SQL, and shell scripts to load data from legacy mainframes into the information hub for customer portals, and implemented performance tuning (a simplified sketch follows this list).
  • Implemented UNIX shell scripts for EDI transaction management, ETL loads with PL/SQL, and Java backend workflow automation.
  • Set up ad hoc reporting in InfoSphere DataStage 8.6.
  • Performed requirements analysis and documented requirement specifications.
  • Followed the waterfall SDLC model to design, develop, and unit/system test mainframe applications using COBOL, JCL, CICS, DB2, Datacom, and various other tools.
  • Assisted the team with technical issues and guided team members throughout the build.
  • Followed release management principles and planned implementations using the Remedy Action Request System.
  • Provided 24/7 production support to ensure smooth business operations according to the agreed-upon SLA and MTM measures.
  • Analyzed mainframe-to-web connect applications and supported problem identification and resolution using available tools: TraceMaster, Eclipse IDE, Oracle 10g, SQL*Plus, Oracle TOAD.
  • Unit tested enhanced Java applications in the test environment, documented results, and obtained sign-off for implementation: Java (Servlets/JSP), jQuery, Apache Tomcat 6, Spring MVC, Hibernate, Spring, Oracle 10g.
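
A minimal PL/SQL sketch of the load pattern used by those ETL stored procedures, assuming hypothetical STG_MBR_EXTRACT (mainframe-staged data) and HUB_MEMBER (information hub) tables:

    -- Illustrative only: hypothetical tables and columns.
    CREATE OR REPLACE PROCEDURE load_hub_member AS
    BEGIN
        -- Upsert the latest mainframe extract into the hub table; a single
        -- set-based MERGE is used instead of row-by-row processing for speed.
        MERGE INTO hub_member h
        USING (SELECT mbr_id, mbr_name, plan_cd, eff_date
               FROM   stg_mbr_extract) s
        ON (h.mbr_id = s.mbr_id)
        WHEN MATCHED THEN UPDATE SET
             h.mbr_name = s.mbr_name,
             h.plan_cd  = s.plan_cd,
             h.eff_date = s.eff_date
        WHEN NOT MATCHED THEN INSERT (mbr_id, mbr_name, plan_cd, eff_date)
             VALUES (s.mbr_id, s.mbr_name, s.plan_cd, s.eff_date);
        COMMIT;
    EXCEPTION
        WHEN OTHERS THEN
            ROLLBACK;
            RAISE;
    END load_hub_member;
    /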

Confidential

Application Developer

Responsibilities:

  • Responsible for work item status reporting and monitoring, and for providing timely updates to the customer; generated status reports.
  • Performed requirements analysis and documented requirement specifications.
  • Followed the waterfall SDLC model to design, develop, and unit/system test mainframe applications using COBOL, JCL, CICS, DB2, Datacom, and various other tools.
  • Assisted the team with technical issues and guided team members throughout the build.
  • Followed release management principles and planned implementations using the Remedy Action Request System.
  • Analyzed mainframe-to-web connect applications and supported problem identification and resolution using available tools: TraceMaster, Eclipse IDE, Oracle 10g, SQL*Plus.
  • Designed and developed Java middleware application enhancements supporting DB2 UDB 8.0 database access services using the Spring Framework, Hibernate, and Oracle 10g.
  • Unit tested enhanced Java applications in the test environment, documented results, and obtained sign-off for implementation: Java (Servlets/JSP), jQuery, Apache Tomcat 6, Spring MVC, Hibernate, Spring, Oracle 10g.

Environment: COBOL II, TSO/ISPF, CA-DATACOM, JCL, CICS, Super BMS, Easytrieve, Macro4 tools, Top Down, Endevor, CA-IDEAL, IMS-DB. J2EE technologies: Hibernate, Apache Tomcat, JSP/Servlets, Spring, Oracle 10g, DB2 UDB 8.0, HTML, jQuery.
