Principal Administrator Resume Profile
Summary:
- Over 17 years of experience in the IT industry, with a knowledge base in data warehouse and Big Data technology solutions. Experienced in management, development, and administration support, and involved in the full lifecycle, including requirement gathering, data modeling, development, administration, and documentation.
- Experience in administration of Hadoop, Informatica, Talend, DataStage, and AWS.
- Experience in development of ETL applications for Data Warehousing / Business Intelligence using Informatica, Talend, and IBM DataStage/QualityStage.
- Experience in development of GoldenGate replication in heterogeneous environments.
Technical Skills:
- Tools: Informatica 9.1.0, Talend, Hadoop 1.1.2/2.2, IBM InfoSphere DataStage/QualityStage 8.1/7.5/6.2, GoldenGate
- NoSQL: Cassandra, MongoDB
- Cloud Computing: AWS EC2 virtual servers, Amazon Linux, Amazon S3 storage
- Scripting Languages: Python 3.4, shell scripting
- Databases: Oracle 11g, Teradata, SQL Server, Informix 7.x/9.x, Sybase, Progress
- Operating Systems: Linux (RHEL, CentOS, OEL), UNIX (HP-UX, AIX), Windows
Professional Experience:
Confidential
Principal Administrator/Lead Developer
Confidential, through its subsidiaries including the University of Phoenix, has established itself as a leading provider of higher-education programs for working adults by focusing on serving the needs of the working adult.
Hadoop Administrator
Administration:
- Provided operational support services relating to Hadoop infrastructure and application installation.
- Supported technical team members in the management and review of Hadoop log files and data backups.
- Supported technical team members with automation, installation, and configuration tasks.
- Provided technical assistance for configuration, administration, and monitoring of Hadoop clusters.
- Commissioned and decommissioned DataNodes, including adding new DataNodes to the cluster (a decommission sketch follows).
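A minimal sketch of the DataNode decommissioning step above, assuming the NameNode's excludes file lives at /etc/hadoop/conf/dfs.exclude; the path and host name are illustrative, and on Hadoop 1.x the equivalent command is hadoop dfsadmin.

#!/bin/bash
# Decommission a DataNode: add the host to the HDFS excludes file and
# ask the NameNode to re-read its host lists. Paths and hosts are examples.
set -euo pipefail

NODE="$1"                                  # e.g. datanode07.example.com
EXCLUDES=/etc/hadoop/conf/dfs.exclude      # file named by dfs.hosts.exclude

# Record the node in the excludes file if it is not already listed
grep -qx "$NODE" "$EXCLUDES" || echo "$NODE" >> "$EXCLUDES"

# Tell the NameNode to refresh; the node then drains its blocks
hdfs dfsadmin -refreshNodes

# Check progress until the node reports "Decommissioned"
hdfs dfsadmin -report | grep -A 3 "$NODE"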
Senior Informatica Administrator/Developer
Administration:
- Involved in administration of the ETL tool, including setting up the security model.
- Applied Informatica HotFix 3 and HotFix 4 and installed the Salesforce and Hadoop plug-ins.
- Involved in automating and tuning Informatica session queries.
- Installed, configured, and implemented the Proactive Monitor and JasperSoft tools.
- Set up the security model for the Proactive Monitor tool in the QA and PROD data warehouse environments so that PROD job failures can be addressed through third-level support.
- Wrote UNIX shell scripts to automate processes in the data warehouse environment (see the workflow wrapper sketch after this list).
- Set up user groups, roles, and users.
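A minimal sketch of the kind of shell automation referenced above, assuming the Informatica pmcmd client is on the PATH and credentials are supplied through the environment; the service, domain, folder, and workflow names are placeholders, not the actual configuration.

#!/bin/bash
# Start an Informatica workflow with pmcmd and wait for it to finish.
# Service, domain, folder, and workflow names are placeholders.
set -euo pipefail

: "${INFA_USER:?set INFA_USER in the environment}"
: "${INFA_PASS:?set INFA_PASS in the environment}"

INFA_SERVICE="IS_DW_PROD"     # Integration Service name (example)
INFA_DOMAIN="Domain_DW"       # Informatica domain (example)
INFA_FOLDER="DW_LOADS"        # repository folder (example)
WORKFLOW="${1:?usage: $0 workflow_name}"

# -wait blocks until the workflow completes and reflects its final status
if pmcmd startworkflow \
     -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
     -u "$INFA_USER" -p "$INFA_PASS" \
     -f "$INFA_FOLDER" -wait "$WORKFLOW"; then
    echo "Workflow $WORKFLOW completed successfully"
else
    echo "Workflow $WORKFLOW failed" >&2
    exit 1
fi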
Developer:
- Designed and developed dimension and fact tables for the University of Phoenix data warehouse.
- Designed and developed Informatica jobs for third-party applications.
- Worked on several enhancement projects, such as forecasting.
- Involved in enhancing the existing data warehouse by redesigning the load process to use the GoldenGate tool more efficiently.
Confidential
- Developed and implemented data replication between heterogeneous databases.
- Supported real-time data replication using GoldenGate for several databases (a setup sketch follows).
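A minimal sketch of how a classic GoldenGate Extract might be registered and started from the shell, assuming GoldenGate is installed under /u01/app/goldengate and the Extract's parameter file (dirprm/ext1.prm) is already in place; the group name and trail path are illustrative, and the matching Replicat is added the same way on the target system.

#!/bin/bash
# Register and start a classic GoldenGate Extract via GGSCI.
# Install path, group name, and trail location are illustrative only.
set -euo pipefail

GG_HOME=/u01/app/goldengate     # example install location
cd "$GG_HOME"

./ggsci <<'EOF'
ADD EXTRACT EXT1, TRANLOG, BEGIN NOW
ADD EXTTRAIL ./dirdat/aa, EXTRACT EXT1
START EXTRACT EXT1
INFO ALL
EOF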
Talend Administrator
- Installed Talend on AWS EC2 instances.
- Mounted an S3 bucket via s3fs on all Talend instances to be added to the virtual job server cluster (see the mount sketch after this list).
- Configured SVN for source code management.
- Installed and managed the Admin console to manage resources.
- Installed and configured plug-ins such as MongoDB and MySQL.
- Upgraded from Talend 5.0.1 to 5.1.1.
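A minimal sketch of the s3fs mount step above, assuming the s3fs-fuse package is installed and the access keys are kept in a protected password file; the bucket name, mount point, and cache path are illustrative.

#!/bin/bash
# Mount an S3 bucket on a Talend job server instance with s3fs-fuse so the
# clustered instances share one working directory. Names are examples.
set -euo pipefail

BUCKET=talend-shared-workspace        # example bucket name
MOUNT=/mnt/talend_shared              # example mount point
PASSWD=/etc/passwd-s3fs               # contains ACCESS_KEY_ID:SECRET_ACCESS_KEY

sudo mkdir -p "$MOUNT"
sudo s3fs "$BUCKET" "$MOUNT" \
     -o passwd_file="$PASSWD" \
     -o allow_other \
     -o use_cache=/tmp/s3fs_cache

# Verify the mount before the instance joins the virtual job server cluster
df -h "$MOUNT"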
Software: Informatica 9.1, Informatica Metadata Manager, Informatica IDQ, Informatica Proactive Monitor 2.5, GoldenGate, Talend 5.1, Oracle 11g, SQL Server, MongoDB, MySQL, Hadoop 1.2/2.2 (Hive, HBase, Sqoop, Oozie, MapReduce, HDFS, YARN)
Confidential
ETL Architect/Developer
HP is a technology company that operates in more than 170 countries around the world and is a Fortune 10 company with revenue totaling 126 billion. Involved in designing, developing, and supporting domain-appropriate applications for the US Army. ETL applications were developed using DataStage 8.1 server and parallel versions; also involved in rewriting some of the DataStage 7.1 ETL applications for DataStage 8.1 using some of the new stages available in the latest version.
- As an architect, was involved in managing and working on upgrades from DataStage 7.1 to 8.1.
- Replaced Sybase stored procedures with DataStage jobs using the CDC (change data capture) plug-in available in DataStage 8.1 parallel jobs.
- Used some of the new stages available in the latest version while rewriting applications from DataStage 7.1 to 8.1.
- Rewrote DataStage 7.1 routines as UNIX shell scripts in DataStage 8.1 (see the sketch after this list).
- Analyzed DataStage 7.1 server routines for conversion to DataStage 8.1 parallel routines implemented in the C programming language.
- Developed and implemented applications on the DataStage 8.1 parallel engine.
- Developed stored procedures in Sybase.
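A minimal sketch of the kind of shell script that replaced a DataStage 7.1 server routine, assuming it is called from a job's before/after-job ExecSH subroutine; the directory names and retention window are illustrative.

#!/bin/bash
# Post-job housekeeping that a DS 7.1 server routine used to perform:
# archive processed extract files and prune old archives.
# Directory names and the retention window are illustrative.
set -euo pipefail

LANDING=/data/etl/landing       # where source extracts arrive (example)
ARCHIVE=/data/etl/archive       # example archive location
RETAIN_DAYS=30

STAMP=$(date +%Y%m%d%H%M%S)
mkdir -p "$ARCHIVE"

# Move every processed extract into a time-stamped copy in the archive
for f in "$LANDING"/*.dat; do
    [ -e "$f" ] || continue               # nothing to archive
    mv "$f" "$ARCHIVE/$(basename "$f").$STAMP"
done

# Drop archived copies older than the retention window
find "$ARCHIVE" -type f -mtime +"$RETAIN_DAYS" -delete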
Software: IBM InfoSphere DataStage 8.1/7.1, QualityStage, Sun Solaris 10, Sybase, Oracle 11g
Confidential
Data Warehouse Manager/Architect/Administrator/Developer
- As a Service Owner, worked with all the major production IT functions on all aspects of service ownership, including service transition and service roadmaps.
- Managed, developed, and supported multiple projects for the Business Intelligence (BI) and Sales & Marketing domains. The BI domain used Ascential DataStage and Prism as ETL tools with Teradata and Informix as the back-end; the Sales & Marketing domain used Informatica as the ETL tool with Oracle and SQL Server as the back-end. Cognos and Actuate were used as the front-end reporting tools.
- DHL is the global market leader in the international express and logistics industry; its international network links more than 225 countries, with worldwide revenue of 80 billion.
- Experience in design, development, deployment, and support of the data warehouse with Ascential DataStage and Informatica as ETL tools.
- Analyzed business requirements and provided system enhancement recommendations.
- Created transformations such as Expression, Filter, Joiner, and Lookup for data cleansing and data consistency.
- Created mappings to populate target tables efficiently using the Sorter and Aggregator transformations.
- Involved in defining technical and functional specifications for ETL process.
- Used both pipeline and partition parallelism to improve performance.
- Tuned mappings for optimum efficiency.
- Built data transformation mappings from the ODS to the data marts.
- Loaded data files into Teradata using the MultiLoad and FastLoad utilities (see the FastLoad sketch after this list).
- Exported data files using the FastExport utility and developed BTEQ scripts.
- Involved in rewriting Legacy applications.
- Involved in data migration from Legacy applications to DHL Data marts.
- Developed shell scripts to automate file manipulation and data loading procedures.
- Managed the DataStage repository, defined custom routines, and imported and exported jobs between different systems.
- Used hashed files as intermediate storage to extract and write data within a job.
- Used the CoSort utility in shell scripts for sorting ASCII files.
- Scheduled jobs using Control-M and CA-7.
- Responsible for Design Documents reviews, Performance and Technical acceptance review.
- Worked with offshore and onshore teams to implement the application in a timely manner.
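A minimal sketch of a FastLoad-based file load like the one mentioned above, driven from a shell script; the TDPID, credentials, staging table, columns, and delimiter are placeholders, and in practice the logon would come from a protected file rather than being inlined.

#!/bin/bash
# Load a pipe-delimited extract into a Teradata staging table with FastLoad.
# TDPID, credentials, table, columns, and file names are placeholders.
set -euo pipefail

DATA_FILE=/data/extracts/orders.dat     # example input file

fastload <<EOF
LOGON tdprod/etl_user,etl_password;
DATABASE stg_db;
SET RECORD VARTEXT "|";
DEFINE order_id   (VARCHAR(18)),
       order_date (VARCHAR(10)),
       amount     (VARCHAR(20))
       FILE=$DATA_FILE;
BEGIN LOADING stg_db.stg_orders
      ERRORFILES stg_db.stg_orders_e1, stg_db.stg_orders_e2;
INSERT INTO stg_db.stg_orders (order_id, order_date, amount)
VALUES (:order_id, :order_date, :amount);
END LOADING;
LOGOFF;
EOF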
Software: Ascential DataStage 8.x/7.x/6.x, QualityStage, Teradata, Informix, Cognos, Actuate, Prism, PL/SQL, Oracle, SQL Server, Control-M, CA-7
Confidential
Data Warehouse Consultant
- The ETL tool was used as the back-end for data extraction, transformation, and loading, and Crystal Reports was used as the front-end reporting tool.
- The Insight project integrates data from multiple operational platforms (mainframe, SAP, JDA, and PeopleSoft) into a single enterprise warehouse.
- Solar Turbines is a leader in designing, manufacturing, and servicing industrial gas turbine power system solutions for the oil, gas, and power generation industries.
- Involved in defining technical and functional specifications for ETL process.
- Involved in performance tuning of the ETL process.
- Analyzed the existing informational sources and methods to identify problem areas and make recommendations for improvement. This required a detailed understanding of the data sources and research for possible solutions.
- The ETL tool was used to extract data from source systems, aggregate the data, and load it into the database.
- Used various transformations such as Expression, Filter, Joiner, and Lookup for better data quality and consistency.
- Designed Migration Mappings and created data template documents.
- Developed various Informatica mappings, mapplets, and transformations for migrating data from the existing systems to the new system.
- Documented the process flow from source to target.
Software: Informatica 6.2, Crystal Reports, AIX, PL/SQL, Oracle Discoverer, Oracle Reports, Oracle Express, Oracle Designer.
Confidential
Data Warehouse Consultant
- Created complex functions, stored procedures, and packages using PL/SQL.
- Dahlberg Inc. is a wholly owned subsidiary of Bausch & Lomb and also operates under its Miracle-Ear trade name.
- Identified and implemented business rules as constraints, triggers, and views.
- Developed SQL Applications for extracting the data from the Oracle tables.
- Utilized SQL*Plus as the main interface to Oracle 8i.
- Involved in the development of a migration tool to convert data from the existing database to the required database format.
- Wrote shell scripts to automate running the stored procedures, triggers, and functions.
- Developed PL/SQL packages to automate the transactions occurring on the tables.
- Used collections such as varrays and nested tables extensively.
- Used SQL*Loader (sqlldr) to load data from CSV files into the database (see the sketch after this list).
- Created front-end procedures and functions with Forms.
- Evaluated various PL/SQL features such as dynamic SQL and passing PL/SQL tables as parameters.
- Involved in PL/SQL Debugging and support.
- Worked on SQL tuning after analyzing the tables.
- Involved in unit testing of the modules.
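A minimal sketch of the SQL*Loader load referenced above, assuming a comma-delimited file and an existing staging table; the table, columns, file names, and connect string are illustrative, and credentials would normally be externalized.

#!/bin/bash
# Load a CSV extract into an Oracle staging table with SQL*Loader.
# Table, columns, file names, and the connect string are illustrative.
set -euo pipefail

CTL=/tmp/load_customers.ctl
DATA=/data/in/customers.csv

# Control file: comma-delimited, optionally quoted fields
cat > "$CTL" <<'EOF'
LOAD DATA
APPEND
INTO TABLE stg_customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(customer_id, customer_name, city)
EOF

# Credentials shown inline only for illustration
sqlldr userid=scott/tiger@ORCL control="$CTL" data="$DATA" \
       log=/tmp/load_customers.log bad=/tmp/load_customers.bad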
Software: Oracle 8i, PL/SQL, HP-UX 10.x, SQL*Loader, Oracle Forms/Reports 6i, SQL*Plus, Toad.
Confidential
PL/SQL Developer
- PL/SQL was utilized extensively for coding complex database triggers, stored procedures, and packages.
- Patient Analytics Solution provides analysis services to the billing department through interactive client tools.
- SQL*Loader was used to load data from Excel extracts into Oracle tables.
- Wrote stored procedures for efficient inserting and updating of data in the tables while the application is online, reducing network traffic.
- Sub-queries and joins were extensively used in stored procedures.
- Created shell scripts to automate the load of data from text files into Oracle tables.
- Created Packages to automate transactions from the Oracle Forms.
- Created database triggers to automate the transactions that take place online.
- Used Subversion to maintain the source code.
- Used the UTL_FILE package to load data from Excel sheets into Oracle tables.
- Used Oracle Forms to design and develop the interface for the system.
- Created matrix reports using Oracle Reports.
- Created object groups to build screen templates with Forms.
- Used the Import/Export utilities to create dump files for exporting and importing data to the database.
- Created front-end procedures and functions with Forms.
Software: Oracle 8i, PL/SQL, HP-UX 10.x, SQL*Loader, Oracle Forms/Reports 6i, SQL*Plus, Toad.
Confidential
Lead
- Responsible for developing complex back-end database applications using Oracle technologies.
- Also worked on a Human Resources Management System: this package keeps a detailed track record of an organization's employees regarding recruitment, attendance, leave, salary structure, etc.; developed various forms and reports using Forms and Reports.
- Worked on a Sales Order Processing System: a package with modules such as Receiving Orders, Executing Orders, and Dispatching that maintains tables such as Orders, Stock Register, and Customer Details. Developed reports such as Daily Orders, Packaging Slip, and Pending Orders using Reports, and forms such as Order Entry, Pending Orders, and Customer Details.
- Unilever has 270 manufacturing sites across six continents and employs 174,000 people in around 100 countries worldwide, with worldwide turnover of 40 billion.
- Created and developed highly complex applications using Oracle as the back-end, with expertise in the design and development of Oracle PL/SQL packages and procedures; designed and developed Oracle objects such as tables, views, indexes, stored procedures and functions in PL/SQL, PL/SQL packages, materialized views, and dynamic SQL. Performed analysis, design, and development of the Data Entry Subsystem.
- Responsible for setting up development standards for database development within PL/SQL.
- Contributed to a high-performing workgroup by applying interpersonal and collaboration skills to achieve corporate goals; provided a high level of attention to detail while completing multiple tasks, demonstrating a commitment to quality while meeting project deadlines.
- Expertise in designing and developing extraction, transformation, and load scripts using SQL and PL/SQL utilities, and in providing solutions to critical issues that enhanced project performance and productivity.
- Constructed complex SQL queries with sub-queries and inline views per the functional needs in the Business Requirements Document.
- Reviewed business requirements with analysts, business users, and developers to construct SQL queries and implement them in PL/SQL packages.
Software: Oracle 8, Forms/Reports, MS Access 97, Windows 95.
Training:
- ITIL processes: Configuration Management, Service Desk Management, Incident and Problem Management, Change Management, Release Management, Service Level Management, Availability Management, Capacity Management, Continuity Management, Financial Management
- Ascential DataStage Developer
- Informatica Developer/Administrator
- Hadoop Administrator
- Teradata Design and Development
Certification:
Teradata Certified Professional.