
Data Analyst Resume


Irving, TX

SUMMARY

  • 5 years of professional experience in the design, development, and implementation of Data Warehousing and Business Intelligence solutions.
  • Worked with various industries including Telecom, Mortgage, and Healthcare.
  • Involved in the full SDLC (system development life cycle) of a data warehouse; responsible for designing, coding, testing, implementing, and supporting ETL processes for data warehouse solutions/data marts and implementing reporting requirements.
  • Hands-on experience with Python and libraries such as NumPy, pandas, Matplotlib, Seaborn, NLTK, scikit-learn, SciPy, Collections, and BeautifulSoup.
  • Expertise in web scraping using Python libraries such as Requests, urllib, XlsxWriter, and OpenPyXL.
  • Experience in database programming in PL/SQL (Stored Procedures, Triggers and Packages).
  • Well versed in developing the complex SQL queries and Views.
  • Hands on experience in design and configuration of landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries/custom queries and packages.
  • Experience in project deployment using Jenkins.
  • Experience in creating and managing versions of programs and documents using GitHub.
  • Good knowledge on Kimball/Inmon methodologies.
  • Worked in Waterfall, Agile (Scrum, Kanban) Environment.
  • Working experience in Data Warehousing, Data Migration, Data Integration using ETL Tool Informatica Power Center.
  • Good Knowledge in Dimensional Data modeling, Star/Snowflake schema design, Fact and Dimensional tables, Physical and Logical data modeling.
  • Good exposure to Testing, Debugging, Implementation, documentation, End-user training and Production support.
  • Team player and self-starter with good communication skills and ability to work independently and as part of a team.

TECHNICAL SKILLS

Databases & Tools: Oracle 11g/10g/9i/8i, DB2, Teradata 15, SQL Server 2016/2017, SQL Developer, Teradata SQL Assistant, Aginity Workbench.

Languages: Python, Unix Shell Scripting.

Data Modeling & BI Tools: PyCharm, Visual Studio, Jupyter Notebook, Erwin, Tableau, MS Visio.

Environment: Windows, Unix, Linux.

Informatica Tools: Informatica Power Center 10.1, 9.6.1

PROFESSIONAL EXPERIENCE

Confidential - Irving, TX

Data Analyst

Responsibilities:

  • Involved in AI SOI Accessory personalization recommendation project for Confidential Wireless Products.
  • Responsible for data analysis; actively involved in gathering and analyzing end-user requirements and system specifications, and worked on deriving metadata, the data dictionary, and source-to-target mapping documentation.
  • Created bridge tables to build a unified view, writing SQL queries using joins, subqueries, and windowing functions with partitions.
  • Developed a Python job to perform data munging on the raw data set and load it into an Oracle database.
  • Developed an aggregation script in Python, using libraries such as NumPy and SciPy, to roll up the data-layer tables hourly, daily, monthly, quarterly, and yearly.
  • Created ad-hoc performance dashboards for analysis and performed data visualization using the Matplotlib library.
  • Worked on developing meaningful insights for the leadership team to better understand and improve business performance.
  • Worked in an Agile environment.

Environment: Python 2.x/3.x, NumPy, SciPy, Matplotlib, MS SQL Server 2017, MS Word/Excel/PowerPoint, Agile.
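The hourly/daily/monthly rollup work described above can be sketched roughly as follows. This is an illustrative sketch only: the column names and data are hypothetical, and pandas is used here alongside the NumPy listed in the skills section, not the original production code.

```python
import numpy as np
import pandas as pd

# Hypothetical raw usage data keyed by timestamp (names are illustrative).
raw = pd.DataFrame({
    "event_time": pd.date_range("2024-01-01", periods=48, freq="h"),
    "units_sold": np.arange(48),
})

def rollup(df, freq):
    """Aggregate the data-layer table to a given grain (e.g. 'h', 'D')."""
    return (df.set_index("event_time")
              .resample(freq)["units_sold"]
              .sum()
              .reset_index())

# 48 hourly records collapse into two daily rows.
daily = rollup(raw, "D")
```

The same `rollup` helper handles each grain by changing the frequency alias, which keeps the hourly-through-yearly scripts to one code path.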

Confidential - Plano, TX

Data Analyst

Responsibilities:

  • Involved in the cloud migration of Operational and Analytical Data of Confidential Financial Services using Snowflake DB in AWS Cloud services.
  • Scraped data from various sources; cleaned and processed unstructured data into structured data.
  • Performed data analysis and data profiling using complex SQL and Python libraries to ensure accuracy of the data between Snowflake DB in AWS and the existing legacy source systems.
  • Worked across all areas of SQL Server development, including tables, derived tables, views, and stored procedures.
  • Good experience using aggregates and analytical functions such as RANK, DENSE_RANK, ROW_NUMBER, and sequences in queries.
  • Contributed to streamlining the workflow, including generating the schema and access pools for each business section and initializing query automation with Python.

Environment: AWS Snowflake, AWS S3, MS SQL Server, Python 3.0/2.7, SQL, PL/SQL, Unix Shell Scripting, Visual Studio, SQL Workbench.
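The cross-platform profiling checks described above (comparing Snowflake against the legacy sources) can be sketched with the stdlib sqlite3 module standing in for both database connections; the table and column names are hypothetical.

```python
import sqlite3

# sqlite3 substitutes here for the Snowflake and legacy connections.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE legacy_accounts (id INTEGER, balance REAL);
    CREATE TABLE snowflake_accounts (id INTEGER, balance REAL);
    INSERT INTO legacy_accounts VALUES (1, 100.0), (2, 250.0), (3, 75.5);
    INSERT INTO snowflake_accounts VALUES (1, 100.0), (2, 250.0), (3, 75.5);
""")

def profile(table):
    """Row count, distinct keys, and balance checksum for one side."""
    return con.execute(
        f"SELECT COUNT(*), COUNT(DISTINCT id), ROUND(SUM(balance), 2) "
        f"FROM {table}"
    ).fetchone()

# Any metric that differs between the two sides surfaces as a mismatch.
mismatches = [pair for pair in zip(profile("legacy_accounts"),
                                   profile("snowflake_accounts"))
              if pair[0] != pair[1]]
```

Comparing a small set of aggregate metrics per table, rather than row-by-row diffs, keeps the reconciliation cheap enough to run after every migration batch.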

Confidential - Florence, KY

Informatica Developer

Responsibilities:

  • Loaded data from legacy systems into SQL Server 2016 using Informatica.
  • Gathered requirements for the scope of loading data from various sources to database.
  • Documented the Mapping and Transformation details, user requirements, implementation plan and schedule.
  • Developed SQL queries and Informatica mappings.
  • Responsible for Data quality analysis to determine cleansing requirements.
  • Extracted data from various sources and flat files, then transformed and loaded it into staging.
  • Designed and developed ETL Mappings using Informatica to extract data from flat files and XML, and to load the data into the target database.
  • Developed Workflows, Worklets and Tasks by using Workflow manager.
  • Wrote pre- and post-session SQL commands to load metadata tables.
  • Used various transformations such as Router, Filter, Joiner, Update Strategy, and connected and unconnected Lookups for data massaging and to migrate clean, consistent data.
  • Scheduled and monitored jobs using AutoSys and the Informatica Scheduler.

Environment: Informatica Power Center Designer 9.6.2, Informatica Repository Manager, shell scripting, SQL Assistant, Windows, AutoSys.
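The pre-/post-session SQL pattern mentioned above (metadata rows written around a load) can be sketched as follows, with sqlite3 as a stand-in warehouse and illustrative table names; the real commands run inside Informatica session properties, not Python.

```python
import sqlite3

# Stand-in audit/metadata tables; names are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE etl_audit (batch_id INTEGER, status TEXT, rows_loaded INTEGER);
    CREATE TABLE stg_orders (order_id INTEGER);
""")

def pre_session(batch_id):
    # Pre-session SQL: register the batch before the load starts.
    con.execute("INSERT INTO etl_audit VALUES (?, 'RUNNING', 0)", (batch_id,))

def post_session(batch_id):
    # Post-session SQL: record the row count and mark the batch complete.
    n = con.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
    con.execute(
        "UPDATE etl_audit SET status = 'COMPLETE', rows_loaded = ? "
        "WHERE batch_id = ?", (n, batch_id))

pre_session(1)
con.executemany("INSERT INTO stg_orders VALUES (?)", [(i,) for i in range(5)])
post_session(1)
```

Bracketing the load this way means a batch left in RUNNING status is immediately visible as a failed or interrupted run.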

Confidential - Richmond, VA

Jr. ETL Developer

Responsibilities:

  • Worked on the requirement-analysis phase and prepared the necessary documents.
  • Designed the data model for the roles, rights, applications, user, and organization dimension tables as SCD Type 2.
  • Created User Role Fact table and Application Role Fact Table.
  • Used the concept of bridge tables to maintain one-to-many relationships between factless fact tables.
  • Used target Update Override to update table based on natural key.
  • Worked on designing of all fact and dimension tables ETL load and on Reload Strategy for all mappings.
  • Used bulk load while loading temp tables, with pre-processing scripts to disable indexes and post-processing scripts to re-enable indexes and analyze tables.
  • Designed and executed unit test cases and performed Unit Testing using SQL based approach for various tables.

Environment: Oracle 11g, Informatica Power Center 9.6.1, UNIX, SQL scripts, PL/SQL codes, Shell Script.
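The SCD Type 2 behavior described above (expire the current row, insert a new dated version) can be sketched in Python; the dimension here is an in-memory list with hypothetical attribute names, standing in for the actual Informatica mapping.

```python
from datetime import date

# Conventional "open-ended" high date for the current row version.
HIGH_DATE = date(9999, 12, 31)

def apply_scd2(dim, natural_key, new_attr, load_date):
    """Expire the current row for the key (if changed) and insert a new version."""
    for row in dim:
        if row["natural_key"] == natural_key and row["current"]:
            if row["attr"] == new_attr:
                return dim                    # no change: keep the current row
            row["eff_to"] = load_date         # expire the old version
            row["current"] = False
    dim.append({"natural_key": natural_key, "attr": new_attr,
                "eff_from": load_date, "eff_to": HIGH_DATE, "current": True})
    return dim

dim = [{"natural_key": "U1", "attr": "Analyst",
        "eff_from": date(2020, 1, 1), "eff_to": HIGH_DATE, "current": True}]
dim = apply_scd2(dim, "U1", "Senior Analyst", date(2021, 6, 1))
```

Keeping both versions with effective-date ranges is what lets downstream queries reconstruct the role a user held on any past date.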

Confidential

Program Analyst

Responsibilities:

  • Involved in the analysis of business requirements; kept track of data available from various data sources, and transformed and loaded the data into target tables using Informatica Power Center.
  • Involved in building and supporting the extraction flows.
  • Identification and Documentation of various data sources including the detailed Table/File/Field level mappings between the source and target systems.
  • Created Informatica Mappings to build business rules to load data using transformations like Source Qualifier, Aggregator, Expression, Joiner, Connected and Unconnected lookups, Filters and Sequence, External Procedure, Router and Update strategy.
  • Tested mappings and sessions using various test cases in the test plans.
  • Implemented SCD methodology including Type 1, Type 2 to keep track of historical data.
  • Extensively worked on Mapping Variables, Mapping Parameters and Session Parameters.
  • Created pre- and post-session shell scripts and email notifications.

Environment: Informatica Power Center 8.6.1, Business Objects, AutoSys, Ingres, Shell Scripts, HP UNIX, Windows XP, Erwin.
