
Senior Etl Consultant Resume


OBJECTIVE:

  • Seeking a position as a Senior Data Warehouse ETL Developer/SQL Developer to utilize 18 years of work experience with SSIS, Ab Initio, Informatica, DataStage and SQL gained at Confidential, BMO, Citibank, RBC, IBM, CIHI and others.

SUMMARY:

  • 18 years of IT experience at Confidential, Confidential, Citibank, Royal Bank of Canada, IBM, CIHI and others (15 years in the financial industry, including banking, insurance and capital markets).
  • 15 years of experience with ETL tools such as SSIS, Ab Initio, Informatica, DataStage and Talend, and with BI tools such as SSAS/SSRS, COGNOS (PowerCube, Transformer), Tabular Reports and BusinessObjects.
  • 12 years of data warehousing experience in ETL, data architecture and modeling, including star, multidimensional and snowflake schema design, as well as Hadoop development.
  • 15 years of data modeling, data analysis and development experience in SQL, T-SQL, PL/SQL, HiveQL and MDX, including 10 years of Oracle, 7 years of DB2, 7 years of MS SQL Server and 5 years of Sybase experience.
  • 10 years of intensive coding experience in Unix scripting, C/C++/C#, Java and Perl, as well as 5 years of mainframe experience in TSO/JCL/COBOL/SAS.
  • In-depth knowledge of Agile development, Big Data/Hadoop, cloud, Tableau, distributed storage and processing, Jenkins, Anti-Money Laundering (AML), web services and SDLC/PMLC.
  • Positive and responsible attitude and willingness to work hard in a fast-paced, multi-tasking and collaborative team environment.

TECHNICAL SKILLS:

D/W Technologies and Tools: D/W Lifecycle, Normalization, Denormalization, Hierarchy, Star schema, Snowflake schema, Dimension, Multi-dimension, Data Modeling, SCD, COGNOS Impromptu, IWR, PowerPlay, Transformer, Catalog, Ad-hoc reports, PowerCube, MDX queries

Unix Technologies: GNU, Unix Libc, Unix kernel, Crontab, Berkeley socket, POSIX threads, Solaris threads, TCP Client/Server, IPC, Message Queue.

Database: ORACLE 8/9i/10g/11g/12c, ORACLE Application Server, DB2, Informix, Microsoft SQL Server 2008/2010/2012/2014/2016

Database Tools: ERwin, Coolgen, ORACLE Developer, Designer, ORACLE SQL*Loader, Materialized View, Data Dictionary, ORACLE Reports, Toad

Languages: Unix C, Perl, ORACLE PRO*C, PL/SQL, B/C/Korn Shell Script, COBOL, JCL, SAS, Java, J2EE, Python, Scala, Stored Procedures, Kafka, ASP.NET, .NET

Platforms: Solaris, HP UNIX, AIX, Linux, Mainframe/OS/390, WIN NT, XP, Lotus Notes, Hadoop Distributed File System (HDFS), WINDOWS

Networks: TCP/IP, FTP, MQ-Series, SNMP, SNA, VPN

Development Tools: Ab Initio, InfoSphere DataStage, Informatica, Microsoft DTS/DQS/MDS, Power BI, VB Script, QualityStage, Talend, Hadoop, SourceSafe, UltraEdit, MS Project and Office, MS Visio, Mercury Test Director, Lotus Notes, AQT, ESP, Autosys, SharePoint, Confidential Compare, Notepad++, Web API, WinSCP, Eclipse, Spark, Aginity 4.3, COGNOS Impromptu, SAP, SQL Server Profiler, Kafka, JSON, Cloudera

PROFESSIONAL EXPERIENCE:

Confidential

Senior ETL Consultant

Environment: Oracle, SQL Server, SSIS/SSAS/SSRS, C#, Tabular, Anaconda, Python 2/3, Jenkins, UrbanCode Deploy, Tableau, Visio, Tibco, GEMS, Jira, Agile development, Bloomberg

Responsibilities:

  • Write SSIS packages and C# to extract, transform and load data from the EDW to the CSAD data mart using Execute SQL Task, Script Task, Script Component, joins, Derived Column, Lookup and other components.
  • Use SSAS to create OLAP and Tabular cubes, and SSRS/Tableau to create reports and dashboards for data discovery.
  • Work with Business Analysts to identify data sourcing requirements and draw up the ETL mapping and design documents.
  • Use EXPLAIN PLAN to tune SQL performance, identify the bottlenecks in slow queries and improve performance (see the sketch after this list).
  • Use Python to implement complex business logic that computes slippage and trade efficiency for Equity, Futures and Options products.
  • Use Jenkins and UrbanCode Deploy to deploy code to SIT/UAT/Production and automate the development and testing.
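
A minimal Oracle-flavoured sketch of the EXPLAIN PLAN tuning workflow mentioned above; the trade_fact table, its columns and the index name are hypothetical rather than actual CSAD objects.

  -- Hypothetical slow query; table and column names are illustrative only.
  EXPLAIN PLAN FOR
  SELECT t.trade_id, t.symbol, t.exec_price
  FROM   trade_fact t
  WHERE  t.trade_dt >= DATE '2018-01-01';

  -- Display the optimizer's plan to spot full scans and missing indexes.
  SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

  -- One possible fix once a full table scan on trade_dt is confirmed.
  CREATE INDEX ix_trade_fact_dt ON trade_fact (trade_dt);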

Confidential

Senior Data Warehouse ETL Consultant

Environment: SQL Server 2012, T-SQL, SSIS/SSAS/SSRS, IBM Netezza 7.0, Ab Initio, Microsoft Visual Studio 2010/2012, TFS, C#/C++, DevOps, Jira, SharePoint, Cognos, Hortonworks HDP, Bitbucket, GitHub, Java 8, Core Java, Web Service, Hive, HQL, HBase, NoSQL, Spark, Scala, Python, Kafka.

Responsibilities:

  • Gathered business requirements, defined and designed the data sourcing and data flows, performed data quality analysis, and worked with the data architect on the development of logical and physical data models.
  • Identified and fixed a hidden critical issue in the Central Banking System (MECH), saving 6 million transactions across 6 regions where originally only one region was being taken in.
  • Extensively designed and created mappings using SSIS transformations such as Lookup, Derived Column, Data Conversion and Aggregate, plus Execute SQL, Script and Send Mail tasks.
  • Used a SQL Server framework database to store metadata such as parameters and logs, and ran SSIS packages to load the source files into the Netezza staging DB and then into the different layers.
  • Used C# and Java to preprocess, read, verify, split and merge source flat files, EBCDIC files and XML files, then used SSIS to load them into IDP for use by downstream systems and web services.
  • Wrote complex SQL to transform data using inner and outer joins, subqueries and WITH (CTE) queries, as sketched after this list.
  • Wrote scripts that use environment variables and parameters to dynamically deploy the packages to DEV/SIT/UAT/PROD.
  • Moved source data into Hadoop HDFS and wrote HiveQL to load from HDFS into Hive Level 1/2/3 and HBase; converted Netezza SQL into HiveQL and migrated the data warehouse into Hive.
  • Designed and developed SSAS databases and OLAP cubes, and created matrix, chart, list, subreport and tabular reports in SSRS for financial reporting and data visualization.
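
A minimal T-SQL sketch of the WITH/JOIN style of transformation described above; the staging and dimension tables (stg_transaction, dim_account, dim_region) and their columns are hypothetical.

  -- Hypothetical staging-to-mart transformation; all object names are illustrative.
  WITH latest_txn AS (
      SELECT account_id,
             txn_amount,
             txn_date,
             ROW_NUMBER() OVER (PARTITION BY account_id
                                ORDER BY txn_date DESC) AS rn
      FROM   stg_transaction
  )
  SELECT a.account_no,
         r.region_name,
         t.txn_amount,
         t.txn_date
  FROM   latest_txn t
         INNER JOIN dim_account a ON a.account_id  = t.account_id
         LEFT  JOIN dim_region  r ON r.region_code = a.region_code
  WHERE  t.rn = 1;   -- keep only the most recent transaction per account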

Confidential

Senior ETL developer

Environment: Ab Initio 3.1.5, EME, AIX 7.1, Oracle 11G, KSH scripting, Jira, blueprint, SOAP, SAS 9.4, PL/SQL, Java, C/C++/C#

Responsibilities:

  • Completed the client linkage project, which identifies newborn babies sharing health cards with their mothers, with zero UAT defects and deployed it to PROD within one month, 30% ahead of schedule.
  • Took a key ETL design and development role in the client linkage and HSMR projects, developing Ab Initio graphs with transformations such as lookup, rollup, filter, scan and normalize.
  • Developed an Ab Initio graph that uses a SOAP component to call SAS code to calculate performance trending values, passing 100 organizations per record and cutting run time by 95%.
  • Composed complex queries using UNION, subqueries and outer joins to extract data from source systems (see the sketch after this list), and used the execution plan to optimize queries and improve performance.
  • Proactively clarified requirements with the BA, identifying 3 potential requirement issues and fixing them ahead of time rather than letting them become defects later in QA.
  • Attended daily issue review meetings with the technical lead, QA, users, data modeler and project lead; developed code and supported QA/UAT/stress testing in an Agile development.
  • Wrote Unix wrappers and set parameters as needed to run Ab Initio jobs via the air command.
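
A minimal Oracle-style sketch of the UNION/outer-join/subquery pattern mentioned above, in the spirit of the client linkage work; the tables (newborn_registration, mother_registry, secondary_linkage_feed) and their columns are hypothetical.

  -- Hypothetical linkage query; object names are illustrative only.
  -- Newborns matched to a mother record sharing the same health card number,
  -- combined with newborns arriving from a secondary feed.
  SELECT n.newborn_id,
         n.health_card_no,
         m.mother_id                -- NULL when no mother record shares the card
  FROM   newborn_registration n
         LEFT OUTER JOIN mother_registry m
                ON m.health_card_no = n.health_card_no
  UNION
  SELECT s.newborn_id,
         s.health_card_no,
         s.mother_id
  FROM   secondary_linkage_feed s
  WHERE  s.health_card_no IN (SELECT health_card_no FROM mother_registry);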

Confidential

Senior Data Warehouse ETL Consultant

Environment: SSIS/SSAS/SSRS, Ab Initio 3.1.7/3.0.3, EME, Solaris, Sybase IQ, DBVisualizer 8.0.2, C/C++/C#, Java, QTP, LoadRunner, Autosys 11, SAS Enterprise Guide 4.3, ITIL, MicroStrategy, ASP.NET, SQL, mainframe.

Responsibilities:

  • Took the ETL design and development role in the MACH2 and CAD projects, which convert BMO credit card consumer data to TSYS and create a central data repository.
  • Urgently took over the coaching tool migration and completed the development in 1 month after it had been stalled for 4 months due to a lack of requirements communication.
  • Worked on ETL transformations for different projects such as the Confidential Credit Risk project and MECH mortgage transactions, and created financial and regulatory reports.
  • Designed SSIS packages to transfer data from flat files and mainframe EBCDIC files to the Sybase IQ data warehouse using Business Intelligence Development Studio.
  • Used package configurations and created deployment scripts and environment variables to reuse the same SSIS package across Dev/Testing/Prod environments.
  • Extensively used SSIS transformations such as Lookup, Derived Column, Data Conversion, Aggregate and Conditional Split, plus Execute SQL, Script and Send Mail tasks.
  • Used Ab Initio to develop various transformations including join, rollup, scan and normalize to fulfill business rules, and created EME tags to promote code to QA/Production.
  • Used KSH/C/C++ to run Unix wrappers and air commands, and to perform data cleansing and data standardization.

Confidential

Senior Data Warehouse Developer

Environment: SSIS/SSAS/SSRS, Ab Initio 3.0/2.15, EME, Talend, SQL Server, Sybase, HDP Hadoop, Hive, Java 7, Solaris, Linux, Hermes, IssueView, Unix Script, Agile, ITIL, .NET, C/C++/C#, BCP.

Responsibilities:

  • Reviewed QA test plans, provided QA support, delivered code to PROD weekly in an Agile environment and resolved any production issue within half an hour.
  • Developed SSIS packages using data flow tasks, script tasks and Lookup transformations to extract, transform and load data into the data warehouse.
  • Created SSIS packages to transfer files from one location to another using the FTP task.
  • Used Ab Initio and stored procedures to convert Prime Finance and Trade OLTP data into the OLAP data warehouse, and used SSAS/SSRS to create cubes and various reports to monitor market risk.
  • Extensively wrote Talend ETL logic using tMap, tAggregate, tNormalize, tJavaRow, tHiveConnection, tHiveRow and Java, and loaded the results into OLAP and the Hadoop Hive database (see the HiveQL sketch after this list).
  • Tuned SQL and stored procedures using the explain plan, identified bottlenecks, created proper indexes and used an efficient archival strategy to improve performance.
  • Led production releases and checkout, and provided 24x7 production support on rotation.
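
A minimal HiveQL sketch of the kind of load into the Hive warehouse referenced above; the table names, columns and HDFS path are hypothetical.

  -- Hypothetical HiveQL load; table names, columns and paths are illustrative only.
  CREATE EXTERNAL TABLE IF NOT EXISTS stg_trade_raw (
      trade_id     STRING,
      account_no   STRING,
      trade_amount DECIMAL(18,2),
      trade_dt     STRING
  )
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
  LOCATION '/data/staging/trades';

  CREATE TABLE IF NOT EXISTS dw_trade (
      trade_id     STRING,
      account_no   STRING,
      trade_amount DECIMAL(18,2)
  )
  PARTITIONED BY (trade_dt STRING)
  STORED AS ORC;

  -- Allow dynamic partitioning, then promote cleansed rows into the warehouse table.
  SET hive.exec.dynamic.partition=true;
  SET hive.exec.dynamic.partition.mode=nonstrict;

  INSERT OVERWRITE TABLE dw_trade PARTITION (trade_dt)
  SELECT trade_id, account_no, trade_amount, trade_dt
  FROM   stg_trade_raw
  WHERE  trade_amount IS NOT NULL;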

Confidential

Senior Data Warehouse ETL Developer

Environment: Ab Initio, BRE, InfoSphere Information Server, Cognos, DataStage, QualityStage, SSIS/SSAS/SSRS, IBM Rational IDE, AIX, DB2, Oracle, MQ, IssueView, SOAP, NDM, Mainframe.

Responsibilities:

  • Reviewed the functional spec and mapping spreadsheet with BA/BI/QA, provided feedback and recommendations, and drew up the technical design based on the requirements.
  • Led the GR resources from Europe and India, monitored their progress, and reviewed the code and test results to deliver to PROD on time with good quality.
  • Used Ab Initio to write ETL transformation logic based on mapping rules, and implemented continuous flows using MQ Subscribe/Publish and continuous components.
  • Replaced the “Join with DB” component with an unload and took advantage of the “Partition with Load Balance” component, reducing the total time to process 12 billion records from 12 hours to 3 hours.
  • Wrote Korn shell scripts and C/C++ programs, and used sed and awk to pre-format the source files and perform data cleansing.
  • Used Prism to create COBOL ETL code and uploaded it to the mainframe; the output data files are pushed to AIX via NDM jobs once the COBOL programs are compiled and run through ISPW.

Confidential

Data Conversion Consultant

Environment: DB2, SQL Server, Erwin, DataStage, Mainframe, GladStone, COBOL, SAS, Focus.

Responsibilities:

  • Designed and created logical and physical data models based on business requirements using Erwin, and generated DDL via forward engineering (a sketch of this kind of DDL follows this list).
  • Used GladStone to generate COBOL code and submitted JCL jobs on the mainframe to unload data from DB2 and download it to the PC using Universal Command.
  • Transformed and converted data from 3rd-party Unum Insurance into IASP (policy) and FINEOS (claim) data using DataStage, and scheduled the jobs in Director.
  • Extensively wrote SAS programs to perform data analysis, merge files and generate reports to fulfill user requirements.
  • Worked collaboratively and effectively with US colleagues, providing guidance and reviews to contribute to high-quality code delivery.
  • Proactively worked with the BSA to receive and review the conversion requirements, raised questions to ensure the requirements were fully understood, and suggested improvements where necessary.
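
A minimal sketch of the kind of DDL that Erwin forward engineering might produce for a policy/claim pair; the table and column names are hypothetical and do not reflect the actual IASP or FINEOS schemas.

  -- Hypothetical forward-engineered DDL; names do not reflect the real conversion models.
  CREATE TABLE policy (
      policy_id      INTEGER      NOT NULL,
      policy_no      VARCHAR(20)  NOT NULL,
      effective_date DATE         NOT NULL,
      CONSTRAINT pk_policy PRIMARY KEY (policy_id)
  );

  CREATE TABLE claim (
      claim_id     INTEGER       NOT NULL,
      policy_id    INTEGER       NOT NULL,
      claim_date   DATE          NOT NULL,
      claim_amount DECIMAL(12,2),
      CONSTRAINT pk_claim PRIMARY KEY (claim_id),
      CONSTRAINT fk_claim_policy FOREIGN KEY (policy_id)
          REFERENCES policy (policy_id)
  );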

Confidential

Team Leader

Environment: Solaris, AIX, ORACLE, CoolGen, Erwin, B/C/Korn Script, C/C++, Cron jobs, Perl 5, Informatica, Cognos Impromptu/PowerCube, Autosys, J2EE, WebSphere, mainframe COBOL.

Responsibilities:

  • Thanks to deliveries with good quality and high customer satisfaction, won a star award in April 2005 out of 305 candidates and grew the team from 4 to 11 members.
  • Modeled data using CoolGen/Erwin to produce the logical and physical models, and built up the star schema and snowflake schema using normalization and denormalization methodologies.
  • Successfully implemented the ETL process with Informatica, Perl, C shell, C++ and PL/SQL scripts, and used Autosys and crontab to schedule the job flow dependencies.
  • Designed and built Oracle materialized views (snapshots) to implement data extraction from the central EDW data warehouse to the Mutual Fund data mart, cutting ETL effort by 70% (see the sketch after this list).
  • Used CoolGen to generate COBOL code to implement complex business logic.
  • Used Cognos Transformer and PowerPlay to build multidimensional cubes, alternate drill-downs and cube groups, and created crosstabs, pie graphs, bar charts and more.
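
A minimal Oracle sketch of a materialized view used to pull data from a central warehouse into a mart, as described above; the schema, table and view names are hypothetical.

  -- Hypothetical materialized view; schema and object names are illustrative only.
  CREATE MATERIALIZED VIEW fund_balance_mv
      BUILD IMMEDIATE
      REFRESH COMPLETE ON DEMAND
  AS
  SELECT fund_id, account_id, balance_amt, as_of_dt
  FROM   edw.fund_balance;

  -- Refreshed by a scheduled Autosys/cron job instead of a hand-coded extract;
  -- with a materialized view log on the source, REFRESH FAST could be used instead.
  EXEC DBMS_MVIEW.REFRESH('FUND_BALANCE_MV', 'C');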

Confidential

Data Warehouse Designer/Developer

Environment: AIX, ORACLE DBMS (8/9i), Ascential DataStage, Mainframe Z/OS, PL/SQL, Stored Procedures, SQL*Loader, Erwin, CTRL-M, CTRL-D, C/C++, Crontab, Cognos, TCP/IP, TSO, JCL.

Responsibilities:

  • Analyzed OLTP system data, defined business requirements, and produced solutions with data warehouse architectural specifications as well as star schema and dimensional modeling.
  • Designed and successfully implemented the ETL process using DataStage to build the PCDW data warehouse as well as the marketing and customer relationship data marts.
  • Captured data changes (CDC) from Oracle, DB2 and SQL Server tables and replicated them to the data warehouse staging area.
  • Wrote Unix Korn shell scripts to perform various batch jobs, including file system backups, Oracle stored procedure execution, flat file formatting and data cleansing.
  • Developed ETL programs using PL/SQL functions, stored procedures, triggers and DataStage transformers to execute the initial and incremental data loads (a PL/SQL sketch follows this list).
  • Managed the Cognos Impromptu catalog and created crosstab, list, summary, ad hoc and sub-query reports for business users.
  • Constructed COBOL code to read data from and write into the IDMS database.
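
A minimal PL/SQL sketch of an incremental load step of the kind mentioned above; the procedure, staging table and dimension table names are hypothetical.

  -- Hypothetical incremental load procedure; all object names are illustrative.
  CREATE OR REPLACE PROCEDURE load_customer_dim AS
  BEGIN
      MERGE INTO dim_customer d
      USING stg_customer s
      ON (d.customer_id = s.customer_id)
      WHEN MATCHED THEN
          UPDATE SET d.customer_name = s.customer_name,
                     d.segment_code  = s.segment_code,
                     d.updated_dt    = SYSDATE
      WHEN NOT MATCHED THEN
          INSERT (customer_id, customer_name, segment_code, updated_dt)
          VALUES (s.customer_id, s.customer_name, s.segment_code, SYSDATE);

      COMMIT;
  END load_customer_dim;
  /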

Confidential

ETL Developer

Environment: Solaris, AIX, ORACLE, Toad, Informatica, Pro*C, SQL*Loader, Cognos, TCP/IP, SNA.

Responsibilities:

  • Designed star and snowflake schemas for the GLCA data warehouse, and implemented data extraction, transformation and loading (ETL) using PL/SQL and C/Korn shell scripts.
  • Indexed database tables and tuned SQL through the execution plan.
  • Built a client-server platform using socket C, and created the bank's daily journal and reports using Unix C and SQL.

Confidential

Application/Database Developer

Environment: HP-UX, SCO-Unix, Oracle 7, Informix, Korn Shell, PL/SQL, Esql-C, Socket C, C/C++.

Responsibilities:

  • Wrote Unix C/C++, PL/SQL, Pro*C and Esql-C programs to manage data in the Oracle/Informix servers and generated various business reports, including journal, balance sheet and GL checking reports.
  • Developed a bank-securities fund transfer system between stock accounts and banking accounts on the Unix platform using socket C, PL/SQL and stored procedures (a sketch follows).
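
A minimal PL/SQL sketch of the kind of stored procedure such a fund transfer might use; the account tables, columns and error codes are hypothetical.

  -- Hypothetical transfer procedure; object names and error handling are illustrative.
  CREATE OR REPLACE PROCEDURE transfer_funds (
      p_bank_acct  IN NUMBER,
      p_stock_acct IN NUMBER,
      p_amount     IN NUMBER
  ) AS
  BEGIN
      -- Debit the banking account only if funds are sufficient.
      UPDATE bank_account
      SET    balance = balance - p_amount
      WHERE  account_id = p_bank_acct
      AND    balance >= p_amount;

      IF SQL%ROWCOUNT = 0 THEN
          RAISE_APPLICATION_ERROR(-20001, 'Insufficient funds or unknown account');
      END IF;

      -- Credit the stock trading account.
      UPDATE stock_account
      SET    balance = balance + p_amount
      WHERE  account_id = p_stock_acct;

      COMMIT;
  EXCEPTION
      WHEN OTHERS THEN
          ROLLBACK;
          RAISE;
  END transfer_funds;
  /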
