Programmer/Support Analyst Resume
DE
SUMMARY
- 15 years of experience as an IT professional, working with various clients across India and North America.
- Over 11 years of experience in developing and supporting Data Warehousing solutions.
- Extensive working experience in the design and development of ETL and Data Migration solutions using Informatica PowerCenter, and of database applications in Client/Server environments.
- Providing 24/7 Operational support for EDW (Enterprise Data Warehouse) Applications.
- Support EDL (Enterprise Data Lake) applications, including monitoring of jobs, ingestions, scheduling and executions.
- Working experience with Informatica PowerCenter (9.6.1/8.x/7.x/6.x), Oracle 10g/9i/8i/8/7.3, Oracle Internet Developer Suite (Forms 6i and Reports 6i), Oracle Developer/2000 (Forms 6i/5.0/4.5 and Reports 6i/3.0/2.5), PL/SQL, SQL, SQL*Loader and Oracle Export/Import on UNIX and Windows platforms. Extensive experience in writing stored procedures, functions, triggers and packages, and in application tuning.
- Experience in performance tuning of Informatica objects (sources, mappings, targets and sessions) and in tuning SQL queries.
- Good understanding and working knowledge of Apache Hadoop ecosystem components such as HDFS, MapReduce, Pig, Hive, Impala, HBase, Hue, Sqoop, Flume, Oozie, Spark and Kafka.
- Working experience in data ingestion using Hive and Sqoop, and in creating and working with HBase-integrated Hive tables.
- Good knowledge and understanding of Java and Scala programming languages.
- Good knowledge of file formats such as SequenceFile, RCFile, ORC and Parquet, and of compression codecs such as gzip, Snappy and LZO (a sample Hive DDL follows this summary).
- Extensively worked on Hive.
- Experience in designing Star and Snowflake schemas.
- Understanding of BI tools such as Cognos 7.0 and Business Objects 5.1; created multidimensional cubes using Cognos PowerPlay and reports using Cognos Impromptu Administrator.
- Technically competent and result-oriented, with strong problem-solving skills.
- Ability to work independently as well as in a team environment.
- Excellent written, verbal and inter-personal communication skills.
- Enhancing skills through the Foundations of Data Science course offered by the University of Waterloo, upgrading programming skills in Python, sharpening analytical skills, plotting graphs and working on prediction models.
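As an illustration of the file-format and compression experience above, a minimal sketch of a Hive DDL pairing Parquet storage with Snappy compression; the table and column names are hypothetical:

    # Hypothetical example: create a Snappy-compressed Parquet table in Hive.
    hive -e "
      CREATE TABLE IF NOT EXISTS sales_stg (
        order_id BIGINT,
        amount   DECIMAL(10,2)
      )
      STORED AS PARQUET
      TBLPROPERTIES ('parquet.compression'='SNAPPY');
    "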
TECHNICAL SKILLS
ETL Tools: Informatica PowerCenter 9.6.1/8.x/7.x/6.2
Big data/Hadoop: Python, HDFS, MapReduce, YARN, Hive, Impala, Pig, Sqoop, Oozie, Flume, ZooKeeper, HBase, Kafka, Hue
Apache Spark: Spark Core, Spark SQL, Spark Streaming
Hadoop Distributions: Cloudera and Hortonworks
Data Modeling Tool: Erwin 4.1.2/7.x
RDBMS: DB2 v9.7.x, Oracle 10g/9i/8i/8/7.3, SQL Server, MS Access, Netezza, Big SQL
Scheduling tools: Tivoli Workload Scheduler (TWS) 8.4, Cisco Tidal Enterprise Scheduler 6.2.1
Languages and Tools: Java/JDBC, Scala, Python, C, C++, Linux shell scripts, SQL, PL/SQL, Oracle Enterprise Manager, ADI, TOAD, SQL Navigator, SQL*Loader, Forms Builder, Reports Builder, Schema Builder, Procedure Builder, Graphics Builder, Export, Import
SAS Tools: Base SAS 9.4/9.3, EG 5.1, DI 5.1
Reporting Tools: Cognos 7.0, Business Objects 5.1
GUI: Developer/2000 (Forms 6i/5.0/4.5 & Reports 10g/6i/3.0/2.5), Visual Basic
Web Technologies: HTML, XML, ASP, JavaScript, VBScript, Servlets, JSP
Operating Systems: Windows NT/2000/98/95, Windows XP, DOS 6.0/3.0, UNIX-AIX, SCO UNIX
PROFESSIONAL EXPERIENCE
Confidential
Programmer/Support Analyst
Responsibilities:
- Involved in all phases of software development, such as modeling, system analysis and design, code generation and testing, using Agile methodology.
- Participated in daily stand up meetings.
- Providing 24/7 Operational support for EDW (Enterprise Data Warehouse) Applications.
- Support EDL (Enterprise Data Lake) applications, including monitoring of jobs, ingestions, scheduling and executions.
- Work on data ingestion through Hive, HQL and Next Pathway ingestion frameworks.
- Process data arriving in different formats.
- Support Sqoop jobs that export/import data between Hive, HBase and RDBMS tables (a sample command appears after this section).
- Navigate and run queries in the Hue interface.
- Analyze job abends and work toward resolution.
- Update queries in loads, take backups and tweak data on the fly to handle load abends and delta loads.
- Run SQL queries to check data integrity and correctness per user requirements.
- Check and analyze UNIX logs and scripts for abends, load status and functionality.
- Work on file transfers and updates of files (process files and actual data files) in UNIX directories per user requests.
- Coordinate and assist in handling problem, incident and service tickets.
- Make code changes for fixes, performance tuning, enhancements and development initiatives.
- Played a key role in Change Management, taking lead responsibility for CM activities: working with project teams on code promotions, impact analysis, change scheduling, resource allocation/management and deployments.
- Work with audit teams to provide necessary inputs and obtain sign-off.
- Collaborate with the business on requirements and participate in load improvements and maintenance.
- Coordinate with offshore and onsite teams to accomplish tasks.
- As administrator for ICEM (Integrated Central Environment Manager) and the Diyotta tool, handle access creation and tool maintenance.
- Worked round the clock during critical loads.
- Created Informatica mappings, sessions and workflows to meet client requirements.
- Played a major role in execution of the EDL application: preparing instructions for upcoming loads, performing ingestion and proposing solutions for continuous improvement and efficiency.
- Mentor the team by providing architectural solutions, application knowledge and approaches for continuous environment improvement.
- Work on data quality issues and implement data fixes.
- Deploy objects: Informatica, UNIX scripts, DB2, SQL Server, SAS, SAS DI, Diyotta, Python scripts to the cloud, TWS jobs and so on.
- Support applications across Informatica, UNIX, SAS, DataFlux, DB2, SQL Server, Hadoop (Hive, Sqoop), Diyotta and Python.
Environment: Informatica PowerCenter 9.6.1, DB2 v9.7.x, Hadoop 2.7.x, Reflection for Secure IT Client 7.2, Tivoli Workload Scheduler (TWS) 8.4, Hive 1.2.x, Sqoop 1.4.6.2, HBase 1.1.2.x, ICEM 3.26.20, Diyotta DI Suite 3.4.2, Cisco Tidal Enterprise Scheduler 6.2.1, Big SQL, Hue
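As an illustration of the Sqoop support work above, a minimal sketch of an import from an RDBMS table into Hive, assuming a DB2 source; the JDBC URL, credentials and table names are placeholders:

    # Hypothetical Sqoop import of a DB2 table into a Hive table.
    sqoop import \
      --connect jdbc:db2://dbhost:50000/EDWDB \
      --username etl_user --password-file /user/etl/.dbpass \
      --table CUSTOMER \
      --hive-import --hive-table edl.customer \
      --num-mappers 4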
Confidential
ETL Consultant
Responsibilities:
- Analyzed requirements and developed source to target matrix.
- Designed the technical specifications to describe the staging and Data Mart load.
- Walked the architect and project manager through the entire ETL design.
- Developed Type 2 dimension mappings to load data into the users dimension.
- Created mapplets to fetch the reference keys used to load data into fact and factless fact tables.
- Used Source Qualifier, Expression, Lookup, Sorter, Router, Aggregator and Update Strategy transformations.
- Created incremental load mappings using mapping variables.
- Created parameter files and used them in sessions.
- Used SQL overrides to improve performance while fetching data.
- Created sessions and workflows in the Workflow Manager.
- Created test data and unit tested the mappings.
- Developed a shell script to automate the load process (a sample wrapper appears after this section).
- Supported UAT and resolved the issues.
Environment: Informatica PowerCenter 8.6, Oracle 10g, SQL, PL/SQL, TOAD, AIX 6.1
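A minimal sketch of the load-automation wrapper mentioned above, assuming the workflow is started via pmcmd with a parameter file; the service, domain, folder, workflow and file names are placeholders:

    # Hypothetical wrapper: start an Informatica workflow and wait for it.
    pmcmd startworkflow \
      -sv INT_SVC -d DOMAIN_DEV -u etl_user -p "$PM_PASS" \
      -f DM_FOLDER -paramfile /opt/infa/param/wf_load_dm.par \
      -wait wf_load_dm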
Confidential
ETL Consultant
Responsibilities:
- Worked within the data migration and data warehouse team to extract and load information into a relational database.
- Involved in the initial design of dimension and fact tables using the Erwin data modeling tool.
- Actively participated in functional and technical meetings for designing the architecture of ETL load process.
- Created mapping and technical specifications.
- Created mappings to Extract, Transform and Load (ETL) data from source to the staging and Data Warehouse by using Informatica ETL tool.
- Developed functions, procedures and packages to implement the business logic in the DB side.
- Worked with different sources like Oracle and Flat files.
- Worked with several transformations: Source Qualifier, Expression, Sorter, Filter, Aggregator, Joiner, Lookup, Router, Update Strategy and Sequence Generator.
- Designed and developed mapplets in PowerCenter Designer.
- Developed Type 2 mappings to load data into dimension tables.
- Developed incremental load mappings using mapping variables.
- Created parameter files to define the business constant values and database connection information.
- Created workflows in the workflow manager. Used session, command, decision tasks in the workflow.
- Created UNIX ksh scripts to pre-check sources, trigger the start of the load process and monitor it (a sample pre-check appears after this section).
- Promoted objects to the SIT and UAT environment.
- Actively involved in testing strategy creation and implementation of the testing scenarios for SIT and UAT.
- Documented the Informatica Load processes and submitted to the client.
- Improved mapping and session performance while loading data into the DW using SQL overrides.
- Tuned SQL scripts for better performance and backed up objects as required for testing.
- Created the catalog in Impromptu Administrator.
- Provided security using Access Manager.
- Published the canned reports to the Impromptu Web Reports (IWR) server.
- Involved in creating the PowerPlay Transformer model; created IQD files to be accessed in Transformer.
- Involved in pre- and post-production support.
Environment: Informatica PowerCenter 8.1/7.1, Oracle 10g, SQL, PL/SQL, Erwin 4.1, Cognos 7.1(Impromptu Administrator, PowerPlay Transformer), AIX 5.3, TOAD, IBM Rational ClearCase, Windows XP
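A minimal sketch of the ksh pre-check scripting mentioned above; the directory names, trigger-file convention and mail address are assumptions:

    #!/bin/ksh
    # Hypothetical pre-check: confirm the source file landed, then signal
    # the load to start by creating a trigger file the scheduler watches.
    SRC=/data/inbound/orders_$(date +%Y%m%d).dat
    if [[ -s $SRC ]]; then
        touch /data/triggers/start_load.trg
    else
        echo "source file missing: $SRC" |
            mailx -s "load pre-check failed" etl-support@example.com
        exit 1
    fi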
Confidential, DE
Data Warehouse/ ETL Consultant
Responsibilities:
- Went through the business requirements and developed the source-to-target matrix.
- Designed and developed Mappings as per ETL Specifications.
- Created mappings to read data from files and load into the staging.
- Worked with Source Qualifier, Expression, Lookup, Joiner, Filter, Sorter, Aggregator, Router and Update Strategy transformations.
- Created parameter files and developed incremental load mappings using mapping variables (a sample parameter file appears after this section).
- Implemented hash partitioning in sessions.
- Used UNIX shell scripts for file archiving.
- Developed workflows to load the data into DW DB.
- Developed PL/SQL procedure to build business rules to share some of the common logic across different applications.
- Created unit test cases, developed test data, performed unit testing and provided results to the team lead.
- Reviewed the mappings of other developers and provided the feedback to improve the process.
- Involved in the pre-production support.
Environment: Informatica PowerCenter 7.1, Oracle 9i, SQL, PL/SQL, TOAD, Windows XP, AIX.
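To illustrate the parameter-file and mapping-variable approach above, a minimal sketch that writes an Informatica parameter file from shell; the folder, workflow, session, connection and variable names are assumptions, not the actual project objects:

    # Hypothetical: seed the $$LAST_EXTRACT_DATE mapping variable for an
    # incremental load; the session reads this file at run time.
    LOAD_DATE=$(date +%Y-%m-%d)
    cat > /opt/infa/param/wf_incr_load.par <<EOF
    [DM_FOLDER.WF:wf_incr_load.ST:s_m_incr_load]
    \$\$LAST_EXTRACT_DATE=$LOAD_DATE
    \$DBConnection_SRC=ORA_SRC_DEV
    EOF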
Confidential, Mount Laurel, NJ
ETL Consultant
Responsibilities:
- Went through the DW functional requirement documents and developed technical specifications, including the source-to-target matrix.
- Designed the overall flow of ETL mappings and explained to the team.
- Designed and Developed mappings. Used Target load order to load data into the staging tables.
- Loaded data from various source systems (SAMBA, FIN DOMESTIC, SQL Server leasing DB) into Staging.
- Developed mappings to load from staging area into the Leasing Data Warehouse system.
- Developed mappings to read data from flat files and load into staging for the insurance payment processing data mart.
- Developed mappings to load data into slowly changing dimensions.
- Used Source Qualifier, Aggregator, Expression, Lookup, Update Strategy and Router transformations.
- Created mapplets to reuse the transformations.
- Used constraint-based loading (CBL) to load reference and transaction tables in the same mapping while loading data into staging tables.
- Created shell scripts to move files from the FTP directory to the source processing directory (a sample script appears after this section).
- Involved in production support.
- Tuned the load process to improve the performance.
Environment: Informatica PowerCenter 7.0, MS SQL Server 2000, Oracle 8i (8.1.7), Windows NT 4.0, UNIX Solaris 2.8, Erwin 4.0, PVCS 6.7
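A minimal sketch of the FTP-to-processing file mover described above; the directory paths and the .dat extension are placeholders:

    #!/bin/ksh
    # Hypothetical mover: stage landed files from the FTP directory
    # into the source processing directory picked up by the load.
    FTP_DIR=/ftp/inbound
    PROC_DIR=/data/src_process
    for f in "$FTP_DIR"/*.dat; do
        [ -e "$f" ] || continue          # no files matched the pattern
        mv "$f" "$PROC_DIR"/ && echo "moved $(basename "$f")"
    done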
Confidential, Atlanta
Programmer/Analyst
Responsibilities:
- Created data structures (tables and views) and applied referential integrity using SQL.
- Wrote database triggers, procedures and functions using PL/SQL (a sample trigger appears after this section).
- Analyzed client requirements; designed and generated reports.
- Developed forms using Forms 6i.
- Designed and developed data-entry screens.
- Created form-level objects, i.e., libraries and property classes.
- Implemented security by creating different users and roles and assigning roles to users.
Environment: Oracle 9i, SQL, PL/SQL, Erwin 4.0, Developer 2000 (Forms 6i, Reports 10g/6i), SCO UNIX, Java and Windows XP.
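As an illustration of the PL/SQL trigger work above, a minimal sketch run through SQL*Plus from the shell; the schema, table, column and trigger names are hypothetical:

    # Hypothetical trigger: stamp every insert/update with the change date.
    sqlplus -s app_owner/"$DB_PASS" <<'EOF'
    CREATE OR REPLACE TRIGGER trg_orders_audit
      BEFORE INSERT OR UPDATE ON orders
      FOR EACH ROW
    BEGIN
      :NEW.last_updated := SYSDATE;
    END;
    /
    EOF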
Confidential, Columbus
Oracle Programmer
Responsibilities:
- Developed PL/SQL code for various levels of front-end validations.
- Designed and developed various front end screens for User Interface using Forms 6i.
- Created various forms and data blocks and managed the passing of data/parameters between forms.
- Developed PL/SQL Functions, Procedures and Packages using PL/SQL tables in Oracle 9i.
- Created various Database triggers using PL/SQL.
- Created various Control and Data Blocks as per the requirements of the application.
- Developed standard and graphical reports per business requirements using Reports 6i.
- Analyzed various SQRs and shell scripts to understand the existing job processes.
- Developed UNIX shell scripts for the process.
- Imported data from flat files into database tables using SQL*Loader (a sample control file appears after this section).
- Developed various LOVs, triggers and program units in Forms.
- Checked and tuned performance using Explain Plan in TOAD.
- Interacted with JSP to test the functionality of stored procedures.
- Used Java servlets to interact with databases.
Environment: Oracle 9i, SQL, PL/SQL, SQL Navigator 4.0, TOAD, Reports 6i, Forms 6i, SQL*Loader, Java, Sun Solaris and Windows XP.
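A minimal sketch of the SQL*Loader flow described above, writing a control file inline and then invoking sqlldr; the file, table and column names are placeholders:

    # Hypothetical SQL*Loader run: comma-delimited flat file into a table.
    cat > employees.ctl <<'EOF'
    LOAD DATA
    INFILE 'employees.dat'
    APPEND INTO TABLE employees
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (emp_id, emp_name, hire_date DATE "YYYY-MM-DD", salary)
    EOF
    sqlldr userid=app_owner/"$DB_PASS" control=employees.ctl log=employees.log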
Confidential
Oracle Programmer/Analyst
Responsibilities:
- Worked on complete SDLC (system development life cycle) including system analysis, high level design, detailed design, coding and testing.
- Worked on various backend Procedures and Functions using PL/SQL.
- Created various SQL and PL/SQL scripts for verification of the required functionalities.
- Created numerous web-based forms and reports using Forms 6i and Reports 10g/6i.
- Created various Database triggers using PL/SQL.
- Developed UNIX shell scripts and PL/SQL procedures to extract and load data for month-end batch processing.
- Loaded data into the Oracle database from flat files using Korn shell scripts.
- Created cron shell scripts to automate some production tasks (a sample crontab entry appears after this section).
- Generated a number of reports for management to review system functionality against the old legacy system.
- Worked with various functional experts to implement their functional knowledge into working Procedures.
- Worked on optimizing existing procedures and functions using PL/SQL.
- Created user interfaces with Forms.
- Developed various LOVs, triggers and packages in Forms.
- Developed standard and graphical reports per business requirements.
- Automated backup and recovery operations using shell scripts and stored scripts.
Environment: Oracle 9i, SQL, PL/SQL, Developer 2000(Forms 6i, Reports 10g/6i), TOAD, Java, Sun Solaris, Windows XP/2000.
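A minimal sketch of the cron automation mentioned above, as a crontab entry for a month-end batch; the script path and schedule are assumptions:

    # Hypothetical crontab entry: run the month-end load at 02:00 on the
    # 1st of every month and append all output to a log.
    0 2 1 * * /home/etl/bin/month_end_load.ksh >> /home/etl/logs/month_end.log 2>&1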
Confidential
Programmer Analyst
Responsibilities:
- Prepared functional and design specifications and test cases.
- Involved in coding and testing.
- Involved in database design; generated reports using Crystal Reports 4.0.
- Involved in both interface and functional testing using the automated tool WinRunner.
Environment: Oracle 7, JDBC, Crystal Reports 4.0, DAO, WinRunner and Windows 95.