
Lead Consultant Resume


Chicago, IL

SUMMARY

  • Seasoned IT software professional with experience in Analysis, Design, Development, Testing, and Implementation of business application systems for the Financial Banking, Healthcare, Telecommunications, and Retail industries.
  • Extensive hands-on work in traditional technologies such as Databases, Data Warehousing, ETL, Unix, and Linux, with expertise gained in each.
  • Looking for an innovative company with growth opportunities in ETL, Databases, Data Warehousing, Big Data Hadoop on the AWS platform, Hive, Impala, Sqoop, and Unix/Linux technologies.
  • Involved in full life-cycle development including System Analysis, Design, Data Modeling, Implementation, and Support of various OLTP, Data Warehousing, and OLAP applications.
  • Extensive experience with Informatica PowerCenter 10.x/9.x/8.x/7.x and PowerMart 6.x for carrying out the Extraction, Transformation, and Loading (ETL) process.
  • Experience in Agile methodologies - Sprint Planning, Retrospectives, and Scrum.
  • Worked on various projects involving Relational Databases (RDBMS), Data Warehouses, and ETL tools.
  • Experience in Dimensional Modeling, Data Migration, Data Cleansing, and Data Staging of operational sources using ETL processes, and in providing data mining features for data warehouses. Worked on Data Modeling, including creation of Logical and Physical Models using PowerDesigner and Erwin.
  • Good knowledge of administration tasks including configuration of repositories, importing/exporting mappings, copying folders across the Dev/QA/Prod environments, managing users, groups, and associated privileges, and performing repository backups and restores.
  • Good experience in client/server, OLAP, and ETL application development using Informatica, Unix shell scripting, Teradata, GreenPlum, Oracle PL/SQL and SQL, PostgreSQL, HTML/DHTML, SQL Server 2000, DB2 (UDB), and MS Access, covering Data Extraction, Transformation and Loading, Data Movement Strategies, and Analysis.
  • Good knowledge of Big Data, Hadoop, MPP systems, and the Big Data ecosystem - Hadoop, Hive, Impala, Sqoop, HDFS, MapReduce, Linux, etc.
  • Good knowledge of Big Data querying tools - Pig, Hive, Impala, Sqoop, and MapReduce.
  • Knowledge of Spark.
  • Good experience working with Star, Snowflake, and Third Normal Form data warehouse schemas.
  • Expertise in writing scripts using Teradata parallel load/unload utilities such as BTEQ, Teradata SQL Assistant, TPump, FastLoad, MultiLoad, OLE Load, and FastExport.
  • Experience in Database Programming and SQL Programming (Stored Procedures, Functions, Macros, and Triggers) for faster transformations and automation.
  • Excellent experience working with Linux/Unix and shell scripting.
  • Expertise in handling high data volumes on a daily basis and loading millions of records into the data warehouse within a given load window.
  • Good working knowledge of GreenPlum and PostgreSQL; proficient in SQL tuning and troubleshooting.
  • Trained in and good working knowledge of Talend.
  • Good knowledge of Perl, as well as data transformation, conversion and interfacing, data loading, and performance tuning.
  • Good experience in writing SQL queries in multi-database environments.
  • Ability to meet deadlines and handle pressure while coordinating multiple tasks in a work/project environment. Strong communication, organizational, and interpersonal competencies, along with detail-oriented and problem-solving skills in the technology arena.
  • Ability to work individually as well as in a team, with excellent problem-solving and troubleshooting capabilities.
  • Self-motivated and committed, with excellent interpersonal communication skills and a willingness to learn new tools, concepts, and environments.

TECHNICAL SKILLS

Languages: HTML, SQL, PostgreSQL, PL/SQL, Unix Shell Scripting, Batch Scripting, Perl, Embedded SQL, ANSI SQL, NoSQL

Databases: Oracle 7.x/8.x/9.x/10g, Teradata V2R5.0/V2R6.0/12, DB2 (UDB) 9.7, GreenPlum, SQL Server 2000.

Big Data Tools: Cloudera CDH 5.x, Hadoop, Hive, Impala, Pig, Sqoop, Spark, HDFS, MapReduce, Linux

Data Warehousing Tools: Informatica PowerCenter 10.x/9.x/8.x/7.x/6.x, Informatica PowerMart 5.1.2, PowerExchange 5.x/8.1, PowerConnect, PowerCenter Data Analyzer, Mapping Designer, OLAP, OLTP, ETL, Data Marts, Data Mining, Talend.

Data Modeling: Erwin 7.3/5.2, Star Schema Modeling, Snowflake Modeling, Fact & Dimension Tables

Operating Systems: Unix, Windows 95/98 and Windows NT 4.0/2000/XP

Tools/Utilities: Rapid SQL, Toad, Teradata Manager, Erwin, ER-Studio, Queryman/SQL Assistant, Teradata MultiLoad, FastLoad, BTEQ, FastExport, TPump, and SQL Navigator.

Others: MS Office (Word, Access, Excel, PowerPoint, Outlook, FrontPage), Client/Server Applications, skilled in the use of Macintosh computers

PROFESSIONAL EXPERIENCE

Confidential, Chicago, IL

Lead Consultant

Responsibilities:

  • Worked on multiple projects and coordinated with BMO business users (FSG) to understand requirements and the changes to be performed.
  • Involved in the complete SDLC, including Envisioning, Planning, Design, Development, User Acceptance Testing, and Deployment.
  • Reviewed business requirements and developed detailed ETL estimates and resource allocations for large-scale ETL projects.
  • Involved in detailed data analysis of the different source data elements from each of the interfaces, and then identified the rules to construct the XAM facility in the MRP application.
  • Modeled the dimension and fact tables.
  • Developed technical specifications from business specs.
  • Interviewed, trained, and led a group of offshore developers to effectively supplement the onsite ETL work.
  • Designed and developed any new processes, using Informatica mappings and Unix scripts to integrate the data into the existing data structures.
  • Created and modified PeopleSoft projects when ETL specifications required new DB2 tables or changes to existing DB2 tables.
  • Provided production support on a rotational basis.
  • Performed ETL migrations and coordinated database migrations with DBAs.
  • Performance-tuned long-running queries and ETL maps.

Environment: Informatica PowerCenter 10.1, Oracle, SQL Server, Salesforce, DB2 (UDB), SQL, PeopleSoft Designer, Shell Script, Unix, Rapid SQL, Command Editor, ESP Scheduler.

Big Data Tools: Hadoop, Hive, HiveQL, Sqoop, Impala, AWS, Cloudera, HDFS, Linux.

Confidential, Lincolnshire, IL

Sr. Consultant

Responsibilities:

  • Followed Agile methodology with daily Scrum meetings.
  • Involved in architecting the ETL process and designing the ETL components.
  • Involved in managing and delivering multiple projects.
  • Handled the offshore team, managing their work and technical issues.
  • Designed specs and built the technical requirements document for the offshore team.
  • Met with the offshore team weekly on work and progress.
  • Performed process management.
  • Worked on and coordinated tasks with Operations, BTAs, and other technical teams.
  • Coded Linux scripts and Informatica components on the allocated projects.
  • Performed production deployments.
  • Provided production support and bug fixes on an as-needed basis.
  • Performed data mining of the HDFS files using Hive, as illustrated in the sketch after this list.
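
A minimal sketch of the kind of Hive query used for this HDFS data mining; the external table, HDFS path, and column names are hypothetical and shown only to illustrate the approach.

    -- Hypothetical external table over delimited HDFS files (names and path are illustrative)
    CREATE EXTERNAL TABLE IF NOT EXISTS stg_orders (
        order_id      BIGINT,
        customer_id   BIGINT,
        order_amount  DECIMAL(12,2),
        order_date    STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
    STORED AS TEXTFILE
    LOCATION '/data/staging/orders';

    -- Example mining query: monthly order count and revenue per customer
    SELECT customer_id,
           substr(order_date, 1, 7) AS order_month,
           COUNT(*)                 AS order_cnt,
           SUM(order_amount)        AS total_amount
    FROM   stg_orders
    GROUP  BY customer_id, substr(order_date, 1, 7);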

Environment: SQL Server, TFS, Informatica PowerCenter 9.x/10.x, DB2, SQL, T-SQL, Microsoft Dynamics CRM, Shell Script, Linux, HDFS, Hive, HiveQL, and Sqoop

Confidential, Chicago, IL

Teradata Architect

Responsibilities:

  • Performed analysis of the existing manual reports and converted them into SQL queries.
  • Discussed and analyzed the new design for automation.
  • Implemented physical design changes on database objects.
  • Implemented MVC (multi-value compression) on Accums-related tables, as illustrated in the sketch after this list.
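
A minimal sketch of how multi-value compression might be declared in Teradata DDL; the table, columns, and compressed value lists are hypothetical and only illustrate the technique.

    -- Hypothetical Teradata table with multi-value compression (MVC) on low-cardinality columns
    CREATE TABLE accums_claim_stg
    (
        claim_id        DECIMAL(18,0) NOT NULL,
        member_id       DECIMAL(18,0) NOT NULL,
        accum_type_cd   CHAR(3) COMPRESS ('DED', 'OOP', 'MAX'),  -- compress the dominant accumulator codes
        claim_status_cd CHAR(1) COMPRESS ('A', 'P', 'D'),        -- compress the most common status values
        accum_amount    DECIMAL(12,2)
    )
    PRIMARY INDEX (member_id);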

Environment: Teradata, DB2, Teradata SQL Assistant, Teradata Utilities (MLoad, FastLoad), UNIX Shell Script, MVS.

Confidential

Sr. Consultant

Responsibilities:

  • Worked with the Orbitz BI group on ETL conversion projects from Teradata to GreenPlum; a small conversion example follows this list.
  • Understood the specifications and analyzed data according to client requirements.
  • Involved in analyzing the risks and benefits of each requirement and change request and prioritizing them accordingly.
  • The ETL process consisted of both hand-coded ETL processes and an ETL tool (Informatica).
  • Developed Informatica transformations for complex transformation rules, using transformations such as Union, Router, Mapplet, Filter, Aggregator, and Update Strategy, along with Salesforce PowerExchange plug-ins.
  • Extracted data from multiple sources such as data files, Oracle, Salesforce, DB2 (UDB), GreenPlum, and Teradata.
  • Involved in the design, development, implementation, and testing of the database loading processes for the Salesforce, PivotLink, and CompApp applications.
  • Worked with DBAs to implement the physical design, database objects, and changes (DDL) per specifications.
  • Coded and used Teradata utilities (BTEQ, MLoad, FastLoad, TPump) as needed.
  • Wrote Unix scripts for file manipulation, archival, and FTP processes.
  • Wrote SQL queries extensively and performance-tuned them as needed.
  • Coordinated with different groups to fulfill dependencies.
  • Involved in designing the logical and physical tables for the CompApp process using Erwin.
  • Scheduling and process dependencies were handled in Opswise.
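
A minimal illustration of the kind of SQL rewrite these Teradata-to-GreenPlum conversions involved, using a hypothetical bookings table: Teradata's QUALIFY clause is not available in GreenPlum, so the same logic is expressed with a window function inside a derived table.

    -- Teradata original (hypothetical table): latest booking per customer using QUALIFY
    SELECT customer_id, booking_id, booking_ts
    FROM   bookings
    QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY booking_ts DESC) = 1;

    -- GreenPlum/PostgreSQL rewrite: same logic via a derived table and WHERE filter
    SELECT customer_id, booking_id, booking_ts
    FROM (
        SELECT customer_id, booking_id, booking_ts,
               ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY booking_ts DESC) AS rn
        FROM   bookings
    ) t
    WHERE  rn = 1;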

Environment: Teradata, GreenPlum, Oracle 10g, Informatica PowerCenter 8.6/9.1, SQL, PostgreSQL, Salesforce, Teradata Utilities (MLoad, FastLoad, BTEQ), Teradata SQL Assistant, Toad, UNIX Shell Script, UNIX.

Confidential, Bensenville, IL

ETL Developer

Responsibilities:

  • Worked closely with business analysts to understand the business needs and requirements for the DNC and RingBack Tone projects, and designed ETL processes using Informatica PowerCenter.
  • Created Informatica mappings to extract data from sources such as flat files, Oracle, SQL Server, and DB2 and load it into the CIS Data Mart on DB2 (UDB), using various transformations (Source Qualifier, Aggregator, Filter, Router, Joiner, Sequence, Lookup, Update Strategy) to build business rules.
  • Served as the point of contact for the DNC project and coordinated between different teams. Created the technical ETL document from the functional requirements.
  • Created mappings using various transformations such as Sequence Generator, Lookup, Joiner, and Source Qualifier in Informatica Designer.
  • Created Mapplets and reusable transformations for use in different mappings.
  • Developed ETL constructs to send Third-Party Subscriptions (TPS) data from the CARES source system to the Enterprise Data Hub (Inbox), and implemented business rules to transform the EDH data to the Outbox (Outperform) as flat files delivered to the vendor Synygy.
  • Extensively used Post-Session, Pre-Session, Decision, and Event tasks as part of the SDF methodology in Informatica for data loading.

Environment: Informatica PowerCenter 8.6, Oracle 10g, SQL*Plus, Toad, Flat Files, Unix, Maestro, ClearCase, ClearQuest, and Windows XP Professional.

Confidential

Sr. Teradata Architect/Consultant

Responsibilities:

  • Implemented physical design, database objects, and changes (DDL) per DBA specifications.
  • Worked closely with business analysts on understanding the business requirements.
  • Designed and developed the unit test plans.
  • Implemented a Partitioned Primary Index (PPI) for range queries; a DDL sketch follows this list.
  • Performance-tuned the SQL queries by creating secondary indexes, join indexes, and PPIs, checking the skew factor, and collecting statistics.
  • Monitored tasks and queries using Teradata Manager.
  • Designed and developed the ETL processes to extract data from DB2 and Oracle tables into Teradata.
  • Extensively wrote BTEQ scripts to incorporate the transformation rules.
  • Designed and developed the Teradata tables and performed skew analysis when selecting primary indexes.
  • Performed data conversion using Teradata utilities, SQL, and UNIX shell scripts, and loaded the data into Teradata tables.
  • Extensively coded complex SQL queries for legacy data extraction and HCA reporting.
  • Performed SQL query rewrites and performance-tuned the SQL using derived tables, global temporary tables, and volatile tables to improve performance.
  • Created BTEQ, MLoad, FastLoad, and FastExport scripts as needed for data migration and ongoing data loads.
  • Coordinated with different groups to fulfill dependencies.
  • Used Unix crontab for scheduling the UNIX scripts.
  • Converted Oracle PL/SQL scripts to Teradata BTEQ scripts (invoked via UNIX shell scripts) and stored procedures.
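
A minimal sketch of a Partitioned Primary Index (PPI) definition with statistics collection, plus a volatile table of the kind used in the query rewrites; the table, column names, and date ranges are hypothetical.

    -- Hypothetical Teradata table with a PPI so range queries touch only the relevant partitions
    CREATE TABLE claims_fact
    (
        claim_id   DECIMAL(18,0) NOT NULL,
        member_id  DECIMAL(18,0) NOT NULL,
        service_dt DATE NOT NULL,
        paid_amt   DECIMAL(12,2)
    )
    PRIMARY INDEX (claim_id)
    PARTITION BY RANGE_N (service_dt BETWEEN DATE '2005-01-01' AND DATE '2008-12-31' EACH INTERVAL '1' MONTH);

    -- Collect statistics on the primary index and partitioning column for the optimizer
    COLLECT STATISTICS ON claims_fact COLUMN claim_id;
    COLLECT STATISTICS ON claims_fact COLUMN service_dt;

    -- Volatile table used to stage an intermediate result during a query rewrite
    CREATE VOLATILE TABLE vt_recent_claims AS
    (
        SELECT claim_id, member_id, paid_amt
        FROM   claims_fact
        WHERE  service_dt BETWEEN DATE '2008-01-01' AND DATE '2008-03-31'
    ) WITH DATA
    ON COMMIT PRESERVE ROWS;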

Environment: Teradata V2R6, DB2, Oracle 10g, Teradata Utilities (MLoad, FastLoad, BTEQ), Teradata Manager, Toad, Squirrel, UNIX Shell Script.
