
Sr. Big Data Developer Netezza Resume

Bellevue, WA

SUMMARY

  • 9+ years of industry experience as a Data Analyst, with a solid understanding of Data Modeling, evaluating data sources, Data Warehouse/Data Mart design, ETL, BI, OLAP, and Client/Server applications.
  • Expert in writing SQL queries and optimizing the queries in Oracle, SQL Server 2008, Netezza.
  • Excellent knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle.
  • Strong working experience in Informatica Data Integration Tools - Repository Manager, Designer, Workflow Manager and Workflow Monitor.
  • Strong experience using Excel and MS Access to extract data and analyze it based on business needs.
  • Worked on databases Oracle 11g/10g/9i/8i, SQL Server 2000/2005/2008, DB2.
  • Performed data cleansing and data manipulation using the NZSQL utility.
  • Handled data loading from flat files into tables using the NZLOAD utility.
  • Expertise in Developing Mappings, Mapplets and Transformations between Source and Target using Informatica Designer and writing Stored Procedures.
  • Involved in full life cycle development of building a Data Warehouse.
  • Extensive Experience in Extraction, Transformation, Loading (ETL) data from various sources into Data Warehouses and Data Marts using Informatica Power Center Tools.
  • Extensive design and development of transformations: Expression, Router, Filter, Normalizer, LookUp, Sequence Generator, Joiner, Update Strategy and Union.
  • Expertise in implementing complex business rules by creating re-usable transformations, and robust mappings/mapplets.
  • Experience in implementing update strategies, incremental loads and change data capture.
  • Experience in developing slowly changing dimensions.
  • Experience in performance tuning techniques on Targets, Sources, Mappings and Sessions.
  • Expertise in Data Repairs and Data Cleanup.
  • Extensively used SQL, Transact SQL and PL/SQL to write stored procedures, functions, packages and triggers.
  • Experience in Unix Environment developing shell scripts and perl scripts.
  • Excellent knowledge and experience in Technical Design and Documentation.
  • Strong leadership and excellent communication, analytical, design, and development skills when working with customers, end users, and colleagues.
  • Provided support and assistance to onshore and offshore Team Members.
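The NZLOAD-based flat-file loads mentioned above can be sketched as a short shell wrapper. Everything here is a hypothetical illustration (database, table, file, and log names are invented), and the command is echoed rather than executed, since nzload requires a live Netezza appliance:

```shell
#!/bin/sh
# Hypothetical sketch of a flat-file load into a Netezza table via nzload.
# DB, table, file, and log names are placeholders, not from a real system.
DB="EDW_DEV"
TABLE="SALES_STG"
DATAFILE="/data/inbound/sales_daily.dat"
LOGDIR="/tmp"

# Build the nzload command: -delim sets the field separator and
# -maxErrors lets the load tolerate a few bad rows before aborting.
NZLOAD_CMD="nzload -db ${DB} -t ${TABLE} -df ${DATAFILE} \
 -delim '|' -maxErrors 10 -lf ${LOGDIR}/${TABLE}.log"

# Dry run: print the command instead of executing it.
echo "${NZLOAD_CMD}"
```

A production wrapper would typically also check the nzload exit status and the -lf log file before marking the batch step complete.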

TECHNICAL SKILLS

Languages: C, Java, SQL, PL/SQL, Shell Scripting, HTML, CSS

ETL Tools: Informatica Power Center (Designer, Workflow Manager, Workflow Monitor, Repository Manager) 9.0.1, 8.6/8.1/7.1/6.2

Databases: Oracle 11g/10g/9i, MSSQL Server 2000/2005/2008, IBM DB2 v 8.0/9.1/9.5, MS Access 2003/2007/2010, Sybase 12.5/15.0

Operating Systems: UNIX (Solaris, HP-UX, IBM-AIX), Windows XP/2000/NT/98/95.

Data Modeling Tools: Erwin 7.2/4.1.2/3.5

PROFESSIONAL EXPERIENCE

Confidential - Bellevue, WA

Sr. Big Data Developer Netezza

Responsibilities:

  • Involved in full SDLC from gathering requirements, analysis, design, modeling, development, testing and implementation needed for loading and updating the warehouse.
  • Troubleshot and optimized SQL code for better performance, both on the Netezza appliance and at the SQL level.
  • Generated extracts in NET PROPHET to meet business requirements.
  • Developed 4DB scripts in NET PROPHET.
  • Upgraded NET PROPHET from version 2.5.0 to 2.5.1.
  • Interacted with business customers to gather their inputs and provided corresponding solutions.
  • Analyzed and transformed complex business requirements into appropriate technical requirements.
  • Developed code for complex business requirements and delivered output on time.
  • Wrote the SQL queries, Stored Procedures, Functions, and Views.
  • Maintain positive communications and working relationships at all business levels.
  • Responsible for managing the business dashboards, which are the primary source for campaign marketing and sales.
  • Ensure best practices are applied and integrity of data is maintained through security, documentation, and change management.
  • Using Excel and VBA in the development of robust and flexible reporting systems.
  • Performed and conducted complex custom analytics as needed by clients.
  • Performed extensive data validation by writing complex SQL queries; involved in back-end testing and resolving data quality issues.
  • Led the team using the JIRA Service Desk ticketing system.
  • Responsible for on-time delivery of all extracts to every business team in the organization.
  • Actively involved in ETL migration from NET PROPHET to Pentaho PDI/BI.
  • Scheduled and monitored daily, weekly, and monthly jobs through the Control-M scheduling tool.
  • Performed data integrity tests and deployed pre-checks to ensure high-quality data across the platform.
  • Loaded bulk data into the Hadoop environment from Netezza using Sqoop.
  • Performed node-level analytics to forecast data volumes.
  • Actively involved in parallelizing existing production processes to increase capacity and performance and to withstand outages.
  • Key member in the TF120 FDT (2.6 to 4.1.1.1) and RHEL (5.5 to 5.8) firmware upgrades.
  • Performed maintenance jobs such as grooming tables, eliminating data skew, keeping statistics up to date, and monitoring the health of the TF120 and TF96 appliances.

Environment: Netezza 7.0.2.15-P1 (TF120, TF96), Aginity Workbench 4.5, NET PROPHET 2.5.0/2.5.1, Control-M 8, Windows 8, Linux
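The Netezza-to-Hadoop bulk copies in this role could be sketched roughly as below. The host, database, table, and path names are hypothetical, and the Sqoop command is echoed as a dry run because it needs a live cluster and appliance:

```shell
#!/bin/sh
# Hypothetical dry-run sketch of bulk-copying a Netezza table into HDFS
# with Sqoop. All connection details and object names are placeholders.
NZ_HOST="nzhost.example.com"
NZ_DB="EDW"
TABLE="USAGE_FACT"

# Sqoop pulls over JDBC; --num-mappers splits the copy across parallel
# map tasks and --target-dir is the HDFS landing directory.
SQOOP_CMD="sqoop import \
 --connect jdbc:netezza://${NZ_HOST}:5480/${NZ_DB} \
 --username etl_user --password-file /user/etl/.nzpass \
 --table ${TABLE} \
 --num-mappers 8 \
 --target-dir /data/raw/${TABLE}"

echo "${SQOOP_CMD}"
```

Keeping the password in an HDFS password file rather than on the command line is the usual practice for scheduled Sqoop jobs.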

Confidential - Philadelphia, PA

Sr. Netezza Database Administrator

Responsibilities:

  • Participating in User meetings, gathering requirements and translating user inputs into Technical Specification documents.
  • Involved in full SDLC from gathering requirements, analysis, design, modeling, development, testing and implementation needed for loading and updating the warehouse.
  • Worked on optimizing and tuning the Netezza SQLs to improve the performance of batch.
  • Troubleshot issues related to optimization in Netezza, including but not limited to data integrity and additional queries.
  • Created the necessary queries and Extract, Transform, Load (ETL) jobs, and scheduled tasks for automatic data extraction and manipulation from different data sources.
  • Executed scripts to create objects and populate data into them.
  • Created new users and groups and granted or revoked the necessary privileges for nz user or group IDs based on application requirements.
  • Redesigned mappings to handle multiple updates and deletes.
  • Created and monitored nzevents.
  • Migrated databases from one host to another.
  • Managed storage and capacity planning.
  • Wrote shell scripts and automated tasks.
  • Created user tables so that other users' information is masked from unauthorized users.
  • Extensively used Aginity Netezza Workbench to perform various DML, DDL, and other operations on the Netezza database.
  • Created new objects from data models and altered or dropped physical databases, tables, external tables, views, and materialized views.
  • Checked the pg and dbos logs, etc., for errors the application encountered.
  • Created clustered base tables (CBTs) and materialized views based on application performance needs after analysis.
  • Created Netezza utility scripts.
  • Tuned queries to improve SQL performance using EXPLAIN and analyzed query cost.
  • Performed extensive Unit Testing and System Integration Testing on the developed Mappings and was also involved in the documentation of Test Plans and testing with the users (UAT).
  • Involved in the preparation of Best Practices documentation.
  • Created Release Manuals (RM) to move code through QA to production.
  • Used Netezza groom to reclaim the space for tables, databases.
  • Extensively managed data skew across all Netezza database tables.
  • Involved in UAT support to resolve the issues encountered during the catch up loads and regular data loads.
  • Tracking tasks for their timely completion.
  • Reporting status to the client on a daily basis.
  • Coordinated deliverables from the offshore development team.

Environment: Netezza 7.0.4 (TF3, TF6), Informatica 9.5, Aginity Workbench 2.1, Oracle 11g
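The groom and statistics maintenance described in this role comes down to two recurring SQL statements. The sketch below only generates them (the table name is hypothetical); on the appliance they would run through nzsql:

```shell
#!/bin/sh
# Hypothetical sketch of routine Netezza table maintenance: GROOM reclaims
# space held by logically deleted rows, and GENERATE STATISTICS refreshes
# the optimizer's statistics. The table name is a placeholder.
TABLE="ORDER_FACT"

GROOM_SQL="GROOM TABLE ${TABLE} RECORDS ALL;"
STATS_SQL="GENERATE STATISTICS ON ${TABLE};"

# On the appliance these would run as, for example:
#   nzsql -d EDW -c "${GROOM_SQL}"
echo "${GROOM_SQL}"
echo "${STATS_SQL}"
```

Scheduling both after large delete/update batches keeps space usage and query plans healthy.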

Confidential - Warren, NJ

Sr. Netezza/Informatica Developer

Responsibilities:

  • Developed LINUX Shell scripts by using NZSQL/NZLOAD utilities to load data from flat files to Netezza database.
  • Performed Unit Testing and User Acceptance Testing to verify that the data loaded into the target was accurate and met user requirements.
  • Implemented Linux functions using Korn shell scripts for code optimization.
  • Responsible for maintaining all the sessions that are under execution (running), scheduled, completed and failed.
  • Modified the existing ETL code (mappings, sessions and workflows) and the shell scripts as per the user requirements. Monitoring workflows/mappings in Informatica for successful execution.
  • Extensively worked on developing Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads.
  • Landed files from one server to another using FTP.
  • Involved in creating reusable transformation and mapplets to use in multiple mappings.
  • Involved in creating and monitoring workflows and tasks using Informatica PowerCenter Workflow Manager.
  • Involved in creating sessions, worklets, and workflows for carrying out test loads.
  • Involved in fixing invalid mappings, testing of Stored Procedures and Functions, Unit and Integrating testing of Informatica Sessions.
  • Used SQL tools like TOAD to run SQL queries and validate the data.
  • Resolved the loading failures by verifying log files.
  • Loaded data directly from Oracle to Netezza without any intermediate files.
  • Performed Incremental loading and involved in standardizing Database connection parameters for source and target tables.
  • Wrote several Netezza SQL scripts to load data between Netezza tables.
  • Designed and created several mapping, mapplets and reusable transformations like Lookup, Filter, Expression, Stored Procedure, Aggregator, Update Strategy etc.
  • Took backups of developed Informatica mappings in XML format for restore purposes.
  • Used Rapid SQL to test SQL statements and to view data coming from different sources.
  • Interacting with the off-shore development team to fix any release related issues.
  • Extensively migrated data from different source systems to the ODS, data marts, and the data warehouse.
  • Performed Unit testing, Integration testing, System testing, and Performance testing.

Environment: Informatica 9.1, Netezza 6.0.8, Oracle 11g, Aginity Workbench
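A table-to-table load of the kind described above (Netezza SQL scripts moving data between Netezza tables) might look like this minimal sketch. All object names are invented, and the nzsql call is left as a comment since it needs a live appliance:

```shell
#!/bin/sh
# Hypothetical sketch of a table-to-table load inside Netezza, driven
# from a shell script. Database and table names are placeholders.
DB="EDW"
SRC_TABLE="SALES_STG"
TGT_TABLE="SALES_HIST"

# An insert-select keeps the data movement entirely on the appliance,
# avoiding intermediate flat files.
LOAD_SQL="INSERT INTO ${TGT_TABLE} SELECT * FROM ${SRC_TABLE} WHERE LOAD_DATE = CURRENT_DATE;"

# Would execute as: nzsql -d "${DB}" -c "${LOAD_SQL}"
echo "${LOAD_SQL}"
```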

Confidential - New York

ETL Developer (Informatica)

Responsibilities:

  • Performed analysis on different data models for various source systems to implement mapping transformations.
  • Implemented various database schema objects such as indexes, packages, procedures, functions, triggers using SQL and PL/SQL.
  • Designed a database for implementing data loading in the Staging areas.
  • Performed transfer of data from various data sources like SQL Server, Oracle, Flat Files etc into Staging areas.
  • Prepared specification documents for different mappings between Source Systems and Data Warehouse.
  • Widely used various Informatica client tools like Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager.
  • Loaded data from various sources such as Oracle, SQL Server, and flat files into a single Data Warehouse.
  • Implemented several Mappings, Mapplets and Transformations.
  • Implemented Performance tuning of the ETL mappings and different Workflows by utilizing partitioning, external loaders, lookup cache etc.
  • Wrote several UNIX Korn shell scripts for batch processing.
  • Designed and created several mapping, mapplets and reusable transformations like Lookup, Filter, Expression, Stored Procedure, Aggregator, Update Strategy etc.
  • Took backups of developed Informatica mappings in XML format for restore purposes.
  • Developed various Database objects like Oracle PL/SQL Stored Procedures, Functions, Packages and Triggers.
  • Performed Data Modeling using Erwin.
  • Wrote several UNIX shell scripts to move Staging files into the Netezza database using bulk loads.

Environment: Informatica Power Center 8.6.0/8.1.1, Oracle 9i/10g, SQL, PL/SQL, TOAD, Erwin, SQL Server 2005, Windows XP

Confidential

ETL Developer

Responsibilities:

  • Interacted with Business Analyst to understand the business requirements.
  • Involved in the preparation of technical specifications documentation.
  • Designed, developed, and implemented Extract, Transform, Load (ETL) processes.
  • Extracted data from various sources like Oracle, flat files, XML and DB2.
  • Created mappings for dimensions and facts.
  • Used various transformations like Normalizer, Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Router, Filter, Sorter and Joiner.
  • Created connected, unconnected and dynamic lookup transformation for better performance.
  • Implemented Slowly Changing Dimensions to update the dimensional schema.
  • Involved in creating reusable transformation and mapplets to use in multiple mappings.
  • Involved in creating and monitoring workflows and tasks using Informatica PowerCenter Workflow Manager.
  • Involved in creating sessions, worklets, and workflows for carrying out test loads.
  • Involved in fixing invalid mappings, testing of Stored Procedures and Functions, Unit and Integrating testing of Informatica Sessions.
  • Used SQL tools like TOAD to run SQL queries and validate the data.
  • Resolved the loading failures by verifying log files.
  • Wrote UNIX shell scripts in ksh to transfer files between multiple systems and to load data from files into the database.
  • Involved in tuning the database for better performance by analyzing the table, adding Hints and by Query Tuning methods.
  • Involved in Tuning the Informatica objects to increase the Performance of the Informatica loads.
  • Worked on database connections, SQL joins, aliases, views, aggregate conditions and also wrote various PL/SQL procedures, Functions and Triggers for processing business logic in the database.

Environment: Informatica Power Center 7.1, Metadata Manager, UNIX, Oracle 9i, DB2, Flat Files, XML Files, SQL and PL/SQL, ERWIN 4.0, Toad
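The ksh file-transfer scripts mentioned above typically drive a non-interactive ftp session. The sketch below only assembles the command batch (host, user, and paths are hypothetical placeholders); in practice it would be piped into ftp -n:

```shell
#!/bin/sh
# Hypothetical sketch of a non-interactive FTP file landing, as used in
# the shell scripts described above. Host, user, and paths are invented;
# a real script would keep credentials in .netrc, never inline.
REMOTE_HOST="src.example.com"
REMOTE_FILE="/outbound/customers.dat"
LOCAL_DIR="/data/staging"

FTP_BATCH="open ${REMOTE_HOST}
user etl_user XXXXXX
binary
lcd ${LOCAL_DIR}
get ${REMOTE_FILE}
bye"

# Would run as: printf '%s\n' "${FTP_BATCH}" | ftp -n
echo "${FTP_BATCH}"
```

Setting binary mode before the get avoids newline translation corrupting fixed-width staging files.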

Confidential

Oracle PL/SQL Developer

Responsibilities:

  • Created numerous triggers for validation.
  • Created numerous procedures, packages, and functions.
  • Worked on performance tuning of the SQL queries to improve data conversion process.
  • Created indexes to increase the performance of the data conversion process.
  • Performed Unit Testing, Integration Testing and Performance Testing.
  • Loaded data from Excel files into the Oracle database.
  • Performed configuration management using VSS.
  • Tuned SQL queries and performed code debugging using TOAD.
  • Conducted Oracle database and SQL code tuning to improve performance of the application, used Bulk binds, in-line queries, Dynamic SQL, Analytics and Sub-query factoring etc.
  • Prepared and updated design specifications.
  • Developed complex SQL queries including Sub queries, correlated queries, and Nested queries, Unions, Intersect and Aliases.
  • Performed data loads using SQL*Loader, imports, and UTL_FILE, depending on the file format.
  • Created numerous SQL scripts for the ETL process.
  • Used analytic functions, DECODE, and CASE statements while writing complex SQL queries.
  • Managed performance and tuning of SQL queries and fixed the slow running queries in production with utilities like Explain, Trace, and Stored Outlines.
  • Exported the database and imported the same into development and test environment whenever required.
  • Created Oracle Database Users, Roles and managing Security. Implement effective database security practices and procedures.
  • Created Oracle Tables, Views, Constrains, Synonyms, Sequences.
  • Created Oracle Materialized Views.
  • Used inner, outer, and cross joins while writing complex SQL queries.
  • Used collection objects, REF CURSORs, BULK COLLECT, and aggregate functions while writing SQL queries.
  • Created numerous UNIX shell scripts to automate processes.
  • Experienced in writing shell scripts for various ETL needs.
  • Designed and developed UNIX shell scripts to handle pre- and post-session processes.

Environment: Oracle 9i, MS Access 2003, MS Excel, TOAD.
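The SQL*Loader loads mentioned above are driven by a control file. The sketch below writes a hypothetical one (table, column, and file names are invented) and prints it; the actual sqlldr invocation appears only as a comment since it needs an Oracle instance:

```shell
#!/bin/sh
# Hypothetical sketch of generating a SQL*Loader control file for a
# CSV load. All object and file names are placeholders.
CTL_FILE="load_emp.ctl"

cat > "${CTL_FILE}" <<'EOF'
LOAD DATA
INFILE '/data/in/emp.csv'
APPEND INTO TABLE EMP_STG
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(EMP_ID, EMP_NAME, HIRE_DATE DATE 'YYYY-MM-DD', SALARY)
EOF

# Would run as: sqlldr userid=etl_user control=load_emp.ctl log=load_emp.log
cat "${CTL_FILE}"
```

APPEND adds rows to a non-empty table; TRUNCATE or REPLACE would be used for full refreshes instead.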
