Technical Manager Resume
Albany, NY
SUMMARY:
- Data Warehousing: Full life-cycle project leadership, business-driven requirements gathering, capacity planning, feasibility analysis, enterprise and solution architecture, design, construction, data quality profiling and cleansing, source-to-target mapping, gap analysis, data integration/ETL, SOA, ODA, data marts, Kimball methodology, star/snowflake design.
- Business Intelligence: Requirements analysis, key performance indicators (KPIs), metrics development, sourcing and gap analysis, OLAP concepts and methods, aggregates/materialized views and performance, rapid prototyping, tool selection, semantic layers, BI reporting tools (QlikView, Tableau, OBIEE).
- Architecture: TOGAF/Zachman Framework enterprise strategy and architecture, business requirements definition, guidelines and standards, process/data flow, conceptual/logical/physical design, performance, partitioning, optimization, scalability/throughput, data quality, exception handling, metadata, master data design and implementation, auditing, capacity planning, use cases, traceability matrices, etc.
- Worked on MDM (Master Data Management) and Reference Data Management (RDM) integrations.
- Logical and physical database design (tables, constraints, indexes, etc.) using Erwin, DB Artisan, ER Studio, TOAD Data Modeler, and SQL Modeler.
- Experience leading teams of application, ETL, and BI developers as well as testing teams.
- Well versed in all phases of the SDLC and in quality processes and procedures such as business requirements definition, enterprise strategy and architecture, and guidelines and standards.
- Significant experience in data modeling for OLTP, canonical modeling, and dimensional modeling for data warehouses (star schema, snowflake schema) and MDM.
- Responsible for detailed architectural design and source-to-target mapping.
- Worked on Informatica PowerCenter tools: Designer, Repository Manager, and Workflow Manager.
- Actively involved in data wrangling and data profiling to ensure the quality of vendor data.
- Excellent PL/SQL programming skills (triggers, stored procedures, functions, packages, etc.) in application development.
- Experience with leading Hadoop distributions such as Cloudera (CDH) and Hortonworks (HDP) ecosystems and related technologies (Hadoop, HDFS, MapReduce, Hive, Pig, Impala, Splunk).
- Monitored and optimized database performance through application, memory, and disk tuning.
- Worked with enterprise architects, business architects, analysts, and development teams to deliver rapid iterations of complex solutions.
- Involved in designing annual technology business plans.
- Collaborated with new and existing external vendors, supported their objectives, and assisted in creating various data architectures.
- Experience in capacity planning, space management, and storage allocation.
- Tuned SQL statements in Oracle, SQL Server, and Teradata using explain plans, autotrace, statistics gathering, table and index analysis, pipelined functions, parallel query, inline views, analytic functions, query rewrite, hints, etc. (a brief sketch follows this list).
- Performed database administration tasks such as user management, security implementation using roles, privileges, and grants, tablespace management, and capacity planning.
- Experience in table partitioning and in rebuilding and validating indexes in Oracle.
- Experience in transforming business requirements into technical specifications.
- Hands-on experience with integration and conversion projects and with 24x7 support. Solid people-management skills, with demonstrated proficiency in leading and mentoring individuals to maximize productivity while building a cohesive team environment.
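For illustration, a minimal sketch of the Oracle SQL tuning workflow referenced above (explain plan, statistics gathering, hints); the table, column, and index names are hypothetical.

```sql
-- Hypothetical example: capture the optimizer plan for a slow lookup,
-- refresh statistics, then retest with an index hint.
EXPLAIN PLAN FOR
  SELECT o.order_id, o.order_total
  FROM   orders o
  WHERE  o.customer_id = :cust_id
  AND    o.order_date >= ADD_MONTHS(SYSDATE, -3);

-- Display the plan captured above.
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- Refresh optimizer statistics so the plan reflects current data volumes.
EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'ORDERS', cascade => TRUE);

-- If the plan still full-scans, test an explicit index hint.
SELECT /*+ INDEX(o orders_cust_date_ix) */
       o.order_id, o.order_total
FROM   orders o
WHERE  o.customer_id = :cust_id
AND    o.order_date >= ADD_MONTHS(SYSDATE, -3);
```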
TECHNICAL SKILLS
Job Functions: Project Management, Incident and Configuration Management, Application Support, DBA Tasks, Database Handling, Analysis, Design, Coding, Testing, Documentation, Maintenance, Team Management
Big Data Technologies: Hadoop, MapReduce, Pig, Hive, Sqoop, Oozie, Flume, Splunk, HBase
Cloud: AWS (EMR, EC2, S3)
Databases: Oracle (9i, 10g, 11g), Informix, DB2, MS SQL Server, Teradata
BI Reporting Tools: Tableau, QlikView, QlikSense, Business Objects
Data Warehouse Tools: Informatica PowerCenter, Informatica PowerExchange, IBM DataStage 7.5, Erwin, ER Studio, TOAD Data Modeler, SQL Modeler
Languages: PL-SQL, Python, Java, C, C++, CORBA, COBOL, Informix 4GL, Visual Basic 6.0
COTS: Mycom(OSI) NetExpert, Metasolv, Sasktel Martens & Open Switch Gate(OSG)
Tools: ServiceNow, BMC Remedy, SVN, VSS, ClearCase, TOAD, PL/SQL Developer, MS Office, MS Project, Clarity
WORK EXPERIENCE:
Confidential - Albany, NY
Sr. Data Architect/Technical Manager
Environment: Oracle, Sybase, ER Studio/Erwin, TOAD, SQL Developer, MS Visio, OBIEE, and QlikView
- Responsible for detailed architectural design.
- Performed data modeling, dimensional modeling (star/snowflake), and database design using ER Studio, later converting the models to Erwin.
- Developed automated ER Studio scripts using Sax Basic programming.
- Converted business processes, domain-specific knowledge, and information needs into conceptual models, and converted conceptual models into logical models with detailed descriptions of entities and dimensions using the Erwin/ER Studio data modeling tools.
- Developed and maintained fully defined conceptual, logical, and physical dimensional data models to ensure the information models meet end-user and developer needs.
- Developed data models and data migration strategies using sound data modeling concepts, including star and snowflake schemas.
- Analyzed data grain and created dimension and fact tables; modeled the data warehouse data marts in a star-join schema (see the star-schema sketch after this list).
- Responsible for building ETL processes using Informatica in accordance with technical specifications.
- Analyzed legacy database systems (Sybase) and performed reverse engineering with PowerDesigner.
- Developed mappings of all data sources and data movement and analyzed them to ensure appropriate data quality; prepared data lineage and validated the data after each ETL load completed.
- Converted the Sybase model to an Oracle model using Erwin.
- Used Sybase Central and Interactive SQL tools for Sybase database analysis, and analyzed data and business rules.
- Developed ad-hoc reports using OBIEE 11g and Tableau, and validated the reporting model and canned reports.
- Investigated and resolved performance-tuning issues for the Oracle RDBMS using EXPLAIN PLAN and DBMS_PROFILER.
- Implemented enterprise standards and procedures.
- Designed security and capacity planning for the project.
- Reverse-engineered databases for performance and business needs.
- Profiled data and identified data quality rules; performed data mapping and gap analysis.
- Sprint planning, daily scrums, and guiding the team.
- Project plan development and updates using MS Project.
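Below is an illustrative star-schema sketch of the dimension/fact design described above; the table and column names are hypothetical and heavily simplified.

```sql
-- Hypothetical conformed dimension with a surrogate key.
CREATE TABLE dim_customer (
  customer_key   NUMBER        PRIMARY KEY,  -- surrogate key
  customer_id    VARCHAR2(20)  NOT NULL,     -- natural/business key
  customer_name  VARCHAR2(100),
  region         VARCHAR2(50)
);

-- Fact table at the grain of one row per order line,
-- joined to each dimension through its surrogate key.
CREATE TABLE fact_sales (
  date_key       NUMBER        NOT NULL,
  customer_key   NUMBER        NOT NULL REFERENCES dim_customer (customer_key),
  product_key    NUMBER        NOT NULL,
  quantity       NUMBER        NOT NULL,
  sale_amount    NUMBER(12,2)  NOT NULL
);
```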
Confidential, Plymouth Meeting, PA
Sr. Consultant (Lead/Architect level)
Environment: Teradata, Teradata Studio, TOAD, SQL Developer, MS Visio, QlikView
Responsibilities:
- Responsible for detailed architectural design.
- Gathered requirements from the business and implemented new projects.
- Performed data modeling, dimensional modeling, and database design using Erwin.
- Worked on BTEQ scripts and Teradata utilities (FastLoad, MultiLoad, TPump).
- Developed source-to-target mappings for Informatica (a brief SQL sketch follows this list).
- Worked on key performance indicators (KPIs) for reporting purposes.
- Reverse-engineered databases for performance and business needs.
- Profiled data and identified data quality rules; performed data mapping and gap analysis.
- Developed strategies for data acquisition, archive recovery, and database implementation.
- Actively involved in data profiling to ensure the quality of vendor data.
- Monitored and optimized database performance through application, memory, and disk-usage tuning.
- Designed Qlik marts for actual on-hand inventory, the Annual Financial Plan and Update, and the Annual Financial Plan/Update adjustments.
- Involved in integrating data sourced from Confidential with client and third-party information sources, creating a single, trusted view that can in turn be accessed by the Nexxus commercial applications (Confidential One Cloud on AWS).
- Ensured optimal dashboard performance through use of a star schema and adherence to QlikView best practices.
- Designed, built, tested, and debugged QlikView solutions based on specified requirements.
- Developed optimized load strategies, including binary, partial, and incremental loads.
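The sketch below illustrates the kind of source-to-target mapping logic documented for the ETL team; the staging and target table names, columns, and rules are hypothetical, and the actual loads were implemented with Informatica and Teradata utilities.

```sql
-- Illustrative source-to-target load (ANSI/Teradata-style SQL):
-- a staging row is cleansed and conformed before landing in the target.
INSERT INTO edw.dim_account (account_id, account_name, status_cd, load_dt)
SELECT
    stg.acct_nbr,                                    -- direct mapping
    TRIM(stg.acct_nm),                               -- cleanse whitespace
    CASE WHEN stg.status IN ('A', 'ACTIVE')
         THEN 'ACTIVE' ELSE 'INACTIVE' END,          -- conform status codes
    CURRENT_DATE                                     -- load-audit column
FROM  stg.account_extract stg
WHERE stg.acct_nbr IS NOT NULL;                      -- basic data quality rule
```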
Sr. Consultant (Implementation Architect/Lead)
Environment: Oracle, MS SQL Server, TOAD, Informatica 9.1, SQL Developer, Erwin, PVCS, QlikView
- Designed, built, and maintained logical and physical data models to support business requirements using Erwin.
- Performed data modeling, dimensional modeling (star/snowflake), and database design using Erwin.
- Managed tasks and deadlines for the onsite and offshore ETL teams.
- Served as the ETL team's point person for other teams such as Reporting, Testing, and QA, and for updates on project status and issues.
- High-level and low-level ETL flow design.
- Designed and developed SCDs (slowly changing dimensions) and identified golden records (see the SCD sketch after this list).
- Implemented Master Data Management (MDM) integration from the DW system.
- Extracted data from MS SQL Server by understanding the business rules in its procedures.
- Developed PL/SQL stored procedures, functions, triggers, etc.
- Profiled data and identified data quality rules; performed data mapping and gap analysis.
- Shell scripting and system automation.
- Prepared data lineage and validated the data after each ETL load completed.
- Developed key metrics for data tests and ensured their integrity within the data architecture.
- Prepared data architecture documents and maintained knowledge of large data structures.
- Tuned SQL statements in Oracle, SQL Server, and Teradata using explain plans, autotrace, statistics gathering, table and index analysis, pipelined functions, parallel query, inline views, analytic functions, PPI, query rewrite, hints, etc.
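A brief sketch of the SCD Type 2 expire-and-insert pattern referenced above, using a hypothetical customer dimension; the production logic was built in Informatica, so this SQL is illustrative only.

```sql
-- Step 1: expire the current row when a tracked attribute has changed.
UPDATE dim_customer d
SET    d.current_flag     = 'N',
       d.effective_end_dt = TRUNC(SYSDATE) - 1
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND    (s.customer_name <> d.customer_name
                       OR s.region <> d.region));

-- Step 2: insert a new current version for changed and brand-new customers
-- (changed customers no longer have a current row after step 1).
INSERT INTO dim_customer
  (customer_key, customer_id, customer_name, region,
   effective_start_dt, effective_end_dt, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name, s.region,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flag = 'Y');
```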
Confidential, Vancouver, WA
Architect/ Lead
Environment: Oracle, TOAD, SQL Developer, MS Visio, IBM DB2, Unix, Perl, QlikView, Shell Scripting
Responsibilities:
- Worked as portfolio/data warehouse architect and lead.
- Worked on data modeling and dimensional modeling.
- Conceptual, logical, and physical database design.
- Created reusable transformations to load data from operational data sources into the data warehouse; involved in capacity planning and data storage.
- Maintained metadata content (structure, level of detail, and amount of history).
- Created optimized tables and stored procedures for report design.
- Physical and logical data model design using TOAD Data Modeler and MS Visio.
- Created stored procedures and functions to process data by applying business rules to staging tables (a brief PL/SQL sketch follows this list).
- Implemented slowly changing dimensions.
- Integrated with Master Data Management (MDM) systems to master records; implemented record merges.
- Developed complex procedures and functions using TOAD and SQL Developer.
- Worked on PL/SQL stored procedures, functions, triggers, packages, etc.
- Monitored and optimized database performance through application, memory, and disk-usage tuning.
- Supported users by responding to questions and resolving problems.
- Supported the information systems and organizational mission by completing related deliverables as needed.
- Developed key metrics for data tests and ensured their integrity within the data architecture.
- Prepared data architecture documents and maintained knowledge of large data structures.
- Analyzed information flow and recommended appropriate technology to support all business processes.
- Migrated code across development, testing, and production environments.
- Designed, built, tested, and debugged QlikView solutions based on specified requirements.
- Developed optimized load strategies, including binary, partial, and incremental loads.
- Integrated QlikView with SAS and R to predict expected churn.
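A brief PL/SQL sketch of the staging-table processing pattern noted above; the procedure, tables, columns, and business rule are hypothetical placeholders.

```sql
-- Hypothetical procedure: apply a simple business rule to staged rows
-- and move the cleansed result into the reporting table.
CREATE OR REPLACE PROCEDURE process_staged_orders AS
BEGIN
  INSERT INTO orders_clean (order_id, customer_id, order_total, priority_flag)
  SELECT s.order_id,
         s.customer_id,
         NVL(s.order_total, 0),
         -- business rule: orders above 10,000 are flagged as priority
         CASE WHEN s.order_total > 10000 THEN 'Y' ELSE 'N' END
  FROM   orders_stg s
  WHERE  s.order_id IS NOT NULL;       -- reject rows failing the quality rule

  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;
    RAISE;                             -- surface the error to the caller
END process_staged_orders;
/
```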
Architect/Lead
Environment: Oracle, Hadoop, Sqoop, Hive, Pig, Hue, TOAD, SQL Developer, MS Visio, IBM DB2, Unix, Perl, QlikView, Shell Scripting
Responsibilities:
- Led the team of application, ETL, and BI developers.
- Designed the physical data model for data warehouse storage by translating the logical data model into a physical model.
- Defined the partitioning strategy and included process metadata and audit columns as required (see the partitioned-table sketch after this list).
- Defined the processes and procedures required to meet data retention requirements, including archiving all data (landing area, staging area, data warehouse, and data marts), accessing archived data, and restoring archived data.
- Documented the processes and procedures required to meet backup and recovery requirements, including the strategy and approach (planning, processes, and procedures) for recovering data in the event of a system-wide failure.
- Performed data modeling, dimensional modeling, and database design using SQL Modeler.
- Worked with the Application Development team to implement data strategies, build data flows, and develop conceptual data models.
- Worked with Agile methodologies and used Scrum in the process.
- Worked on key performance indicators (KPIs) for reporting purposes.
- Tuned SQL statements using explain plans, autotrace, statistics gathering, table and index analysis, pipelined functions, parallel query, inline views, analytic functions, query rewrite, hints, etc.
- General database administration such as tablespace management.
- Migrated databases from MS Access, Teradata, and SQL Server to Oracle.
- Shell scripting and system automation using cron.
- User management and security implementation using roles, privileges, and grants.
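An illustrative Oracle DDL sketch of the partitioning and audit-column approach described above; the table, columns, and range scheme are assumptions rather than the project's actual model.

```sql
-- Hypothetical fact table: range-partitioned by date, with
-- process-metadata/audit columns to support lineage and reloads.
CREATE TABLE fact_shipments (
  shipment_key   NUMBER        NOT NULL,
  ship_date      DATE          NOT NULL,
  quantity       NUMBER,
  -- audit / process-metadata columns
  etl_batch_id   NUMBER        NOT NULL,   -- ties each row to its ETL run
  source_system  VARCHAR2(30)  NOT NULL,   -- lineage: originating system
  load_ts        TIMESTAMP     DEFAULT SYSTIMESTAMP NOT NULL
)
PARTITION BY RANGE (ship_date) (
  PARTITION p2014q1 VALUES LESS THAN (DATE '2014-04-01'),
  PARTITION p2014q2 VALUES LESS THAN (DATE '2014-07-01'),
  PARTITION pmax    VALUES LESS THAN (MAXVALUE)
);
```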
Confidential, USA
Technical Manager/Lead/Architect
Environment: Informix 4GL, Informix, HP-UX, Shell Scripting, ESQL/C, Perl, Oracle, VC++, Windows, Handheld Devices, MS Visio, Erwin, BO, SQL Developer
Responsibilities:
- Responsible for program and portfolio management activities for small to medium projects.
- Worked with the Business Intelligence team on reporting.
- General database administration for Oracle 11g.
- Worked on PL/SQL; created and maintained stored procedures, functions, triggers, packages, etc.
- Created summary tables and materialized views for query-rewrite purposes (a brief sketch follows this list).
- Created partitioned tables and indexes according to application needs.
- Led the development and maintenance of SAP BusinessObjects reporting solutions with Crystal Reports.
- Led the maintenance of SAP BusinessObjects Performance Management solutions.
- Developed and updated existing best practices for SAP BusinessObjects reporting solutions.
- Data loading using ETL tools, fast batch processing, SQL*Loader, external tables, and the UTL_FILE package.
- Dashboards, KPIs, and daily, weekly, monthly, and custom reports using BO to control dispatch issues (2011 onwards).
- Met SLA requirements and provided application support on a 24x7 basis.
- Guided the team in developing 4GL programs per Help Desk requirements and in preparing various documents per Confidential requirements.
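A brief sketch of the summary-table / materialized-view approach mentioned above; the base table, columns, and grain are hypothetical.

```sql
-- Hypothetical daily summary, enabled for query rewrite so aggregate
-- reports can be answered from the summary instead of the detail table;
-- refreshed on demand (e.g., by a nightly job).
CREATE MATERIALIZED VIEW mv_dispatch_daily
  BUILD IMMEDIATE
  REFRESH COMPLETE ON DEMAND
  ENABLE QUERY REWRITE
AS
SELECT TRUNC(dispatch_ts)  AS dispatch_day,
       COUNT(*)            AS dispatch_cnt,
       SUM(duration_min)   AS total_duration_min
FROM   fact_dispatch
GROUP  BY TRUNC(dispatch_ts);
```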
ETL Lead/Architect
Environment: Informatica v8.x/9.x, Oracle 11g/10g, Informix, SQL Server, PL/SQL, CDC, XML, Flat Files.
Roles & Responsibilities:
- Responsible for developing, supporting, and maintaining the ETL (extract, transform, load) processes using Informatica PowerCenter.
- Translated high-level design specifications into simple ETL coding and mapping standards; designed the mapping document that serves as the guideline for ETL coding.
- Analyzed and created fact and dimension tables; modeled the data warehouse data marts in a star-join schema.
- Developed a framework to migrate data from different source RDBMSs using ETL tools, packages, procedures, and scripts.
- Experienced in SQL in Oracle and Informix environments, with complex SQL (multi-table joins, sub-queries, unions) and creation of database objects (tables, views, materialized views, etc.); used performance-tuning utilities (explain plans, etc.) to optimize queries.
- Experience integrating various data sources such as Oracle, SQL Server, and flat files using active/passive stages.
- Experience using third-party schedulers such as Autosys to schedule jobs.
- Excellent knowledge of studying data dependencies using Informatica metadata and preparing job sequences for existing jobs to facilitate scheduling of multiple jobs.
- Coordinated between business and IT teams on both sides.
- Involved infrastructure teams as required to address technical needs.
- Participated in business discussions to identify functional gaps between the source and destination applications.
- Monitored traffic and estimated capabilities during system and UAT testing.
- Performed analysis of the current WFM load versus the anticipated load after conversion.
- Escalated overall project risks, not limited to WFM, Confidential, and TAP.
- Addressed throughput issues and performed cross-application stress testing.
- Provided functional and technical designs for the conversion programs and the data scrubs.
- Created summary tables and materialized views for query-rewrite purposes.
- Created partitioned tables and indexes according to application needs; logical backups (Import/Export) and hot/cold backups.
Confidential - Confidential
Technical Manager/Lead/Architect
Environment: Mycom (OSI) NetExpert, Oracle 11g, Unix, Perl, Shell Scripting, Java, Spring
Responsibilities:
- Requirements gathering and design of fault network management systems using the NetExpert OSI development suite.
- Handled projects and portfolio report management, and owned complete delivery responsibility for a few small to medium projects.
- Handled client calls and project risks.
- Developed architecture and integration diagrams using MS Visio.
- Designed dashboards, user-driven reports, and comparison reports for fault management.
- Designed and guided the team in developing procedures and triggers across various phases to integrate with other systems (a brief trigger sketch follows this list).
- Wrote Perl/shell scripts to automate manual processes.
- Closely monitored SLAs and application support.
- Provided high-level architecture/design for NetExpert.
- Provided users with tools to diagnose the different operational and provisioning changes in the network.
- Liaised with other groups to set up the testing environment.
- Integrated UNE/POTS, ADTRAN, and Remedy ITSM with NetExpert.
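For illustration, a minimal PL/SQL trigger in the spirit of the integration procedures and triggers mentioned above; the alarm and queue tables are hypothetical stand-ins, not the actual NetExpert schema.

```sql
-- Hypothetical trigger: when a new fault alarm is inserted, queue an
-- outbound record for a downstream system to pick up.
CREATE OR REPLACE TRIGGER trg_alarm_outbound
AFTER INSERT ON fault_alarm
FOR EACH ROW
BEGIN
  INSERT INTO outbound_alarm_queue (alarm_id, severity, created_ts, status)
  VALUES (:NEW.alarm_id, :NEW.severity, SYSTIMESTAMP, 'PENDING');
END;
/
```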