
Data Analyst Resume


San Francisco, CA

SUMMARY:

  • A certified SAS programmer with 8+ years of experience in the clinical and financial industries.
  • Worked in the banking sector with strong involvement in all phases of the Software Development Life Cycle.
  • Worked closely with business teams to gather requirements and prepare Business Requirement Documents, and was involved in coding, debugging, application design, integration, documentation, maintenance, and deployment.
  • Adept in data analysis, requirement analysis, data mining, processing, manipulation, integration, and transformation.
  • Experience with object-oriented programming (OOP) concepts using Python, C++, and PHP.
  • Experienced in WAMP (Windows, Apache, MySQL, and Python/PHP) and LAMP (Linux, Apache, MySQL, and Python/PHP) architectures.
  • Good knowledge of Hadoop, Pig, and Hive.
  • Extensive experience with the Teradata data warehousing database, versions 12, 13, and 14.
  • Very good understanding of Teradata RDBMS architecture, including shared-nothing design, nodes, AMPs, BYNET, partitioning, and primary indexes. Extensively used Teradata features such as BTEQ, FastLoad, MultiLoad, SQL Assistant, and DDL and DML commands.
  • Proficient in data warehousing concepts, including data marts and data mining using dimensional modeling (star schema and snowflake schema design).
  • Strong knowledge of Teradata SQL programming using joins, unions, rank functions, aggregate functions, time functions, subqueries, grouping, logical expressions, EXPLAIN, and statistics.
  • Extensively monitored system capacity and growth, and used Teradata SQL data compression techniques to improve application performance.
  • Involved in understanding and translating business requirements into high-level and low-level designs for ETL processes in Teradata.
  • Good experience with NoSQL databases; worked extensively on DBC tables along with aggregates and building efficient views.
  • Strong working experience in planning and carrying out Teradata system loading processes, data warehousing, and large-scale database management.
  • Extensive experience in SAS programming for data extraction and reporting procedures.
  • Very good at SQL, data validation, data exploration, and triggers.
  • Worked with various software development life cycle models: Agile, Waterfall, and Scrum.
  • Strong experience in advanced SAS programming procedures such as PROC SQL, PROC FORMAT, PROC TRANSPOSE, PROC ACCESS, PROC TABULATE, PROC IML, PROC FREQ, PROC APPEND, PROC GPLOT, PROC GCHART, and macros.
  • Generated reports using SAS Add-In for MS Office, browser interfaces such as SAS Web Report Studio, and Windows clients such as SAS Enterprise Guide.
  • In-depth knowledge of macro programming with SAS/BASE, SAS/MACRO, SAS/CONNECT, SAS/ODS, SAS/SQL, SAS/STAT, and SAS/GRAPH in Windows, Oracle, and Teradata environments.
  • Good experience in table analysis using PROC FREQ with the LIST, OUT=, and MISSING options (see the sketch after this list).
  • Hands-on experience in generating ad hoc reports using BTEQ and UNIX.
  • Worked with Mainframe SAS and Mainframe JCL.
  • Extensive experience in bug tracking and verifying the defects in the existing code.
  • Worked on SAS Intelligence platform architecture for accessing large amounts of data efficiently.
  • Performed data migrations and visual analytics using Tableau.
  • Good knowledge of ETL techniques for extracting, transforming, and loading data among different databases.
  • Strong command of UNIX shell scripting.
  • Experience in working with extraction, transformation, and loading (ETL) processes using SQL and PL/SQL.
  • Strong knowledge of tuning PL/SQL code to handle large volumes of data and improve performance.
  • Hands-on experience in using SAS BI web applications to create dashboards.
  • Good experience in Perl scripting.
  • Experience in working with R, Hive, Cassandra, and Pig.
  • Good knowledge of the SAS Migration Utility and SAS migration version analysis.
  • Hands-on experience with SAS Grid Manager within SAS Management Console.
  • A solid grounding in preparing and analyzing Business Requirements Documents (BRDs), technical documents, and Functional Requirements Documents.
  • Good experience in working with both business and development teams.
  • Versatile team player with good communication and problem-solving skills across all management levels.
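
A minimal sketch of the PROC FREQ table analysis noted above; the data set work.claims and the variables region and status are hypothetical placeholders:

    /* Two-way table analysis: LIST prints the crosstab in list form,   */
    /* MISSING treats missing values as a valid level, and OUT= saves   */
    /* the counts and percentages to an output data set.                */
    proc freq data=work.claims;
       tables region*status / list missing out=work.freq_counts;
    run;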

TECHNICAL SKILLS:

Programming Languages: C/C++, SQL, PL/SQL, Oracle, PHP, XML

Operating systems: Windows (95, 98, 2000, XP), Linux, UNIX, Solaris

Databases: MS SQL Server, MS Access, Oracle 9i/10g/11g, DB2

SAS Expertise: SAS/BASE, SAS/SQL, SAS/MACROS, SAS/CONNECT, SAS/STAT, SAS/SERVER, SAS/REPORTS, SAS/ACCESS, SAS/GRAPH, SAS/ODS

SAS Procedures: Print, SQL, Report, Sort, Format, Import, Export, Tabulate, Summary, Compare, Freq, Transpose, Gplot, Gchart, Means, Reg, Corr, Univariate

SAS BI Tools: SAS Enterprise Guide, SAS Data Integration Studio, SAS Management Console, SAS Information Map, SAS Customer Intelligence Studio, SAS Information Delivery Portal, SAS BI Dashboard, SAS Web Report Studio, SAS Add-In for Microsoft Office.

Web Technologies: HTML, Macromedia Flash, Confidential Photoshop, Dreamweaver.

Scripting: Perl scripting, Shell Scripting.

BI Tools: OLAP, Qlikview, Cognos, Tableau

PROFESSIONAL EXPERIENCE:

Confidential, San Francisco, CA

Data Analyst

Responsibilities:

  • Worked in collaboration with the business team to perform data integration with SAS coding.
  • Used dynamic LIBNAME statements, PROC APPEND, PROC FORMAT, PROC SQL, PROC IMPORT, PROC SORT, PROC EXPORT, and PROC COMPARE (see the dynamic LIBNAME sketch after this list).
  • Built VBA macros to pull data for specific requirements in recurring jobs.
  • Applied analytical functions across data from multiple sources (Hadoop, Teradata, Excel).
  • Provided support in SAS programming and technical assistance during the development steps of the SDLC phases.
  • Developed SAS macros extensively to decrease redundancy and automate SAS code.
  • Worked with a team of developers on Python applications for risk management.
  • Developed a Python/Django application for Google Analytics aggregation and reporting.
  • Used Django configuration to manage URLs and application parameters.
  • Worked on Python OpenStack APIs.
  • Used Python scripts to update content in the database and manipulate files.
  • Generated Python Django forms to record data of online users.
  • Used Python and Django for creating graphics, XML processing, data exchange, and business logic implementation.
  • Created dashboard SSRS reports under report server projects and published SSRS reports to the report server.
  • Developed functional and technical documentation to assist in the design, development and/or maintenance of deliverables.
  • Used Hadoop and NoSQL packages to process and store microarray data.
  • Created SSRS data model projects using Microsoft Visual Studio 2005 and used Report Builder on the report server to facilitate ad hoc reporting by business users.
  • Worked on optimizing and tuning Teradata SQL to improve batch performance.
  • Collected multi-column statistics on all non-indexed columns used during join operations and on all columns used in residual conditions (see the COLLECT STATISTICS sketch after this list).
  • Tuned various queries by collecting statistics on columns in WHERE and JOIN expressions.
  • Worked extensively with Teradata utilities like BTEQ, FastExport, and all load utilities.
  • Extensively used complex SQL, Indexes, Joins and Triggers.
  • Converted resource-intensive Teradata tasks to mainframe tasks to save Teradata CPU resources and database space.
  • Used PROC SQL to retrieve data from various data sources, including the data warehouse, SQL Server, and UNIX.
  • Worked with Teradata in creating tables and views according to the requirements.
  • Developed analytical processes, statistical models, and quantitative approaches, leveraging SAS ad hoc reports.
  • Debugged, documented, and maintained the ad hoc reports according to demand.
  • Used the SAS SQL pass-through facility to convert Oracle tables into SAS data sets, and the DBLOAD procedure to upload SAS data sets into Oracle tables (see the pass-through sketch after this list).
  • Used UNIX Scripting for data encryption and file transferring using FTP, SFTP and MFT Processes.
  • Used SQL queries and created SAS data sets to obtain the desired format for metrics.
  • Prepared various metric formats for reporting different business functions like loss prevention and loss mitigation.
  • Performed QA audits to ensure that all the delivered files were validated by matching the output results to the business requirements.
  • Using the SAS Enterprise Intelligence Platform, delivered various intelligence strategies that support the day-to-day changing needs of the enterprise at different levels of departments.
  • Also used the Enterprise Intelligence Platform for data extraction from a variety of operational data sources on multiple platforms, building data warehouses and data marts that integrate the extracted data.
  • Worked on the QlikView platform, creating batch files for automatic running of QlikView scripts; also involved in the design and development of an integrated and expandable database using QlikView.
  • Involved in analyzing the threat reports produced by the SAS Cybersecurity teams.
  • Worked in close relation with the Fraud detection team in understanding the new claim threats, fraud claims and other fraudulent activities.
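
A minimal sketch of the dynamic LIBNAME pattern referenced in the list above; the extract path, libraries, and data set names are hypothetical:

    /* Resolve today's date into the library path, then append the      */
    /* day's extract to a master data set with PROC APPEND.             */
    %let rundate = %sysfunc(today(), yymmddn8.);
    libname daily "/data/extracts/&rundate.";

    proc append base=master.transactions data=daily.transactions;
    run;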
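
A minimal sketch of collecting Teradata statistics through the SAS SQL pass-through facility, as described above; the server, credentials, and table and column names are placeholders:

    /* EXECUTE pushes the COLLECT STATISTICS statement down to Teradata. */
    proc sql;
       connect to teradata (server=tdprod user=&td_user password=&td_pwd);
       execute (
          collect statistics on fin.transactions column (acct_id, txn_dt)
       ) by teradata;
       disconnect from teradata;
    quit;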
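
A minimal sketch of the Oracle-to-SAS pass-through conversion mentioned above; the credentials and table names are placeholders:

    /* Query Oracle directly and land the result as a SAS data set.     */
    proc sql;
       connect to oracle (user=&ora_user password=&ora_pwd path=orcl);
       create table work.accounts as
          select * from connection to oracle
             (select acct_id, open_dt, balance from bank.accounts);
       disconnect from oracle;
    quit;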

Environment: SAS/BASE, SAS/STAT, SAS/MACRO, SAS/ETL, SAS/CONNECT, SAS Enterprise Guide, SAS Enterprise Miner, SAS Web Report Studio, SAS Management Console, SAS OLAP & EBI Server, Teradata 14, R, Python, MS SQL, Hadoop, Hive, Pig, Mahout, Teradata SQL Assistant, MLOAD, TPT, FASTLOAD, BTEQ, TPUMP, Oracle 11g, Informatica Power Center 8.6, UNIX, Teradata 13, UNIX shell scripting, NoSQL, Tivoli, Ab Initio, MS SQL Server, Oracle, DB2, SAP BW, and Business Objects

Confidential, Herndon, VA

Data Analyst

Responsibilities:

  • Worked with the statistical reporting procedures used for regular data collection and data analysis.
  • Extensively used SAS macros to create reusable SAS programs that can be modified according to timely requirements.
  • Utilized SSRS reports from different source formats, satisfying clients with reports and presentations.
  • Used various data set options such as DROP, KEEP, and RENAME to make changes to existing data sets (see the data step sketch after this list).
  • Worked with the Tableau admin in creating users, groups, projects, and data connection settings.
  • Involved in creating dashboards and report schedules on Tableau Server.
  • Used Perl scripting to handle data feeds and implement business logic, communicating with web services through the SOAP::Lite module and WSDL.
  • Worked on migrating the existing campaign execution from SAS to Unica.
  • Prepared various customer lists using SAS for marketing procedures.
  • Involved in developing Ab Initio graphs for loading Oracle tables, and helped redesign existing Ab Initio graphs, documenting changes made according to the requirements.
  • Used macros for integrating the complex business rules.
  • Performed various metadata functions using Information Maps, OLAP, and Enterprise Guide.
  • Worked with the credit risk analysis reporting system based on FICO scores; prepared reports that help shape credit risk management policies.
  • Involved in moving large datasets across Mainframes, Teradata, UNIX and SAS.
  • Used SAS data stores such as SAS OLAP and MOLAP cubes to support the reporting teams and the business intelligence group.
  • Managed and maintained the application reporting database.
  • Responsible for reporting viewership metrics of U-verse and DIRECTV usage.
  • Design and Development of SSAS Cubes to sustain the Application Demands and Needs.
  • Identified, analyzed, and interpreted trends or patterns in complex data sets to the end users.
  • Implemented many SSIS packages and jobs to assist with daily running queries.
  • Created complex stored procedures and developed dynamic SQL queries to adapt to a dynamic environment.
  • Worked with huge databases in Teradata, Vertica, and SQL environments.
  • Successfully created Snapshots Report, Report Subscription, Data Driven Subscription and Report caching using Report Manager.
  • Designed user defined hierarchies in SSAS including parent-child hierarchy.
  • Involved in dimensional modeling to design data warehouse.
  • Developed dimensions, data source views, and SSAS cubes; deployed and processed SSAS objects.
  • Created partitions for faster response of data in SSAS.
  • Tuned SQL queries for performance improvements in Teradata and SQL.
  • Hands-on experience with Teradata, working through huge data sets and overcoming spool space errors.
  • Extensively worked on SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS)
  • Involved in preparing the testing data for the testing jobs in User Acceptance Testing environment.
  • Prepared various summary reports of different models with the help of SAS Factory Miner, which enables detailed data drill-down.
  • Prepared interactive variable distribution graphs using SAS Factory Miner to detect data issues.
  • Responsible for migration from SAS 9.2 to SAS 9.4.
  • Provided scripting support and enabled MapReduce programming within the SAS environment with the help of SAS Hadoop.
  • Supported the change manager during the migration by reporting the progress up to date and analyzing the cost and benefit tracking.
  • Identified the threats and provided SWOT Analysis reports to the change manager to identify and mitigate the possible risks.
  • Designed and developed MI reports to meet the specified business requirements by using Teradata SQL.
  • Designed dashboards and BI reports with the help of SAS Visual Analytics, and created interactive charts for Hadoop data.
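
A minimal sketch of the data set options mentioned in the list above; the data set and variable names are hypothetical:

    /* KEEP limits the variables read in, RENAME relabels one of them,  */
    /* and DROP= removes a variable used only for filtering.            */
    data work.cust_slim (drop=load_flag);
       set work.customers (keep=cust_id sales_amt load_flag
                           rename=(sales_amt=revenue));
       if load_flag = 1;   /* subsetting IF on the retained flag */
    run;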

Environment: Teradata V12, Teradata SQL Assistant, BTEQ, FLoad, FExport, MLoad, TPT, TPump, Erwin 4.1.4, Informatica 8.1, Business Objects XI R2, Quest Toad 9.3, UNIX Shell Scripting, SQL*Loader, Smart Putty, SQL Server, Windows XP, UNIX, SQL Server 2012/2008, Vertica, Teradata 13, SSRS, SSAS, SSIS, Microsoft Visual Studio 2012/2008, Relational DBMS, Client-Server Architecture, Clustering, MS Office 2013, MS SQL 2014/2008 R2, MongoDB, Hadoop, VMware, Citrix Server, IIS, Windows OS, Windows 7, NoSQL, PostgreSQL, Windows Cluster Server, BIDS 2005/2008/R2, SSDT 2012, SQL Server Management Studio, Query Analyzer, SQL Profiler, Performance Monitor, Remedy, LiteSpeed, T-SQL, Stored Procedures, SOX, Triggers, UDFs, Constraints, DTS, BCP, Data Compression, Partitioning, Joins, DDL, DML, Cursors, Server Migration, Batch Files, Centralized Schema Repository, SAN, PowerShell, Active Directory, ODBC

Confidential, DE

Data Analyst

Responsibilities:

  • Created database libraries in both UNIX and Linux environments.
  • Worked with SAS EBI tools for application development and customizing dashboards and portals.
  • Worked with SAS Data Integration Studio on data integration transformations to support Hadoop.
  • Hands-on experience in preparing test cases and test scripts for integration testing; also prepared use cases for client systems.
  • Worked on finding the missing values, and the frequencies of the missing values, in the mortgage data.
  • Involved in requirements gathering, analysis, design, and deployment.
  • Involved in the physical/logical design of the database, creating and managing the database and all database objects in SQL Server 2008.
  • Developed E-R diagrams (logical and physical using Erwin) mapping the data into database objects.
  • Designed and developed various DB Objects like Stored Procedures, User defined functions, triggers and used Indexes for accomplishing various tasks.
  • Created views to facilitate easy viewing and implementation, and for security.
  • Used SQL Server Agent to automate the process of rebuilding indexes at regular interval for better performance.
  • Good understanding of SSRS, including report authoring, report management, report delivery, and report security; created reports with an Analysis Services cube as the data source using SSRS 2005/2008.
  • Wrote Python routines to log into websites and fetch data for selected options.
  • Used Pentaho Kettle as an ETL tool for extracting data from different sources to production servers, and from there to the OLAP data warehouse.
  • Created Logins and Job roles.
  • Responsible for Backup of databases.
  • Worked with a team of developers on Python applications for risk management.
  • Built the SSAS Cubes and optimized them.
  • Loaded data daily into SSAS cubes using Process Incremental, and cached SSAS cubes for even faster response.
  • Created packages using various Data transformations and tasks and scheduled jobs to perform filtering operations and to export the data (reports) on daily, monthly basis on to SFTP server using SSIS.
  • Transformed data from various data sources using OLE DB connection by creating various SSIS packages.
  • Successfully deployed SSIS Package into Production environment and used Package configuration to export various package properties to make package environment independent.
  • Worked with the deployment, maintenance, usability, and portability of SSIS packages.
  • Implemented Event Handlers and Error Handling in SSIS packages and notified process results to various user communities.
  • Used SQL server reporting services (SSRS) delivering enterprise, Web-enabled reporting to create reports that draw content from a variety of data sources.
  • Worked on SSRS 2008 to generate more than 80 reports for various user and developer groups.
  • Developed Reports, Sub Reports, Drilldown reports, Data driven subscriptions, and used various features like Charts, graphs, filters etc.
  • Created stored procedures and used as the datasets for the Report Design.
  • Developed Linked reports in SSRS.
  • Providing the data quality reports to the credit bureau using the updated QA metrics in a timely manner.
  • Leveraged machine learning to create diagnostic classifiers and clustering analyses.
  • Analyzed and designed feature set to maximize model fit with R.
  • Implemented the machine learning algorithm into production software utilizing Python.
  • Applied the SVM machine learning algorithm to non-linear data to fit and predict.
  • Wrote algorithms in R and Python, and occasionally MATLAB.
  • Developed BTEQ scripts and worked on the performance tuning on the same.
  • Data extraction and validation, applying rules transformation in the flow of data from source to target.
  • As per business requirements, validated the source and target data using Teradata.
  • Created the low level design document.
  • Created test case scripts in Teradata to validate different stages.
  • Involved in post implementation support and resolved product related bugs.
  • Extracted raw data using SAS Enterprise Miner.
  • Eliminated duplicate observations in the data sets using the NODUPKEY and NODUPRECS options (see the PROC SORT sketch after this list).
  • Validated the Linux servers with pre-checks: all user IDs created, correct Python version, correct Java version, and correct file directories with the required space.
  • Provided documentation of methodologies, data reports and the model results to the project manager.
  • Processed data using Hive for developing the web reports regarding the insurance claim analysis.
  • Generated customized SAS reports using DATA _NULL_ techniques (see the DATA _NULL_ sketch after this list).
  • Worked with the dynamic data exchange protocol to pull and transfer data from Excel spreadsheets to other applications.
  • Responsible for managing data of various types from sources such as SQL Server, Oracle 11g/10g, MongoDB, and Teradata.
  • Prepared various BTEQ scripts to run complex queries on the Teradata database, breaking complex queries into simpler ones using volatile tables.
  • Participated in home equity loan loss forecasting SAS model implementation and forecast analytics for regulatory stress testing (CCAR/DFAST) as well as BAU forecasts and annual planning.
  • Managed and streamlined the analytical workflow by tracking customer workflows using SAS Model Manager.
  • Developing and designing Oracle PL/SQL scripts for import and export of data.
  • Implemented HDFS and NoSQL techniques in big data for extracting and analyzing the data.
  • Worked with Informatica for extracting the source data, applying required transformations and loading it back into the target source location.
  • Implemented Informatica data warehousing for integrated quality management and for end-to-end metadata management.
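
A minimal sketch of the de-duplication step described in the list above; the data set and key names are hypothetical:

    /* NODUPKEY keeps one row per BY key; DUPOUT= captures the removed  */
    /* rows for review. NODUPRECS would instead drop only fully         */
    /* identical adjacent records.                                      */
    proc sort data=work.mortgage out=work.mortgage_dedup
              nodupkey dupout=work.dups;
       by loan_id;
    run;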
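
A minimal sketch of a customized DATA _NULL_ report, with hypothetical data set and variable names:

    /* DATA _NULL_ writes a report line by line with PUT statements;    */
    /* HEADER= names a labeled section that prints the column titles.   */
    data _null_;
       set work.claim_summary end=eof;
       file print header=hdr;
       put @1 claim_id @15 claim_amt dollar12.2;
       if eof then put / 'End of report';
       return;
    hdr:
       put @1 'Claim ID' @15 'Amount' /;
       return;
    run;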

Environment: SAS/BASE, SAS/STAT, SAS/ODS, SAS/SQL, SAS/MACRO, SAS/GRAPH, Enterprise Guide, Oracle 9i, MS Excel, Python, UNIX, Teradata 14/13.10, R, RStudio, Rattle, Hadoop, Teradata, Oracle 11g/10g, Informatica Power Center 9.1.1, ER Viewer, Windows XP, UNIX Scripting, Linux, Teradata V2R5, Teradata SQL Assistant, Teradata Manager, Teradata Administrator, Oracle 8i, Informatica Power Center 7.1, MicroStrategy 8, MS Access, MS Excel, TOAD, SQL, UNIX and Windows NT, CVS, SQL Server 2008, MySQL 5.2, SQL Server Management Studio (SSMS), Pentaho Kettle, SSRS, SSAS, NoSQL MongoDB, Business Intelligence Development Studio (BIDS), Windows Server 2003

Confidential

SAS Programmer

Responsibilities:

  • Developed SAS Programs for validation, analysis, data cleaning and generating reports.
  • Prepared Case report tabulations.
  • Created reports by identifying the irregular data entries by reviewing the SOPs.
  • Generated reports using SAS Output Delivery System (ODS) destinations such as HTML, PDF, and RTF (see the ODS sketch after this list).
  • Involved in retrospective validation procedures.
  • Preparing the design document.
  • Applied statistical programming languages (R, SAS, Python), as well as Python and Java for APIs.
  • Data modeling with AWS and Hadoop.
  • Data analytics using R, SAS, Python, Spark, and MS Excel; SQL visualization with APIs and Tableau; and application of relational databases such as SQL Server and Oracle and non-relational databases such as HBase, MongoDB, and Redis.
  • Responsible for building a new Services BI database with the respective tables and views required, as given by the business team.
  • Responsible for connecting to the Salesforce cloud using CozyRoc and BIExpress via SSIS.
  • Design and development of SSAS cubes from the created database.
  • Implemented many SSIS packages and jobs to assist with daily running queries.
  • Created complex stored procedures and developed dynamic SQL queries to adapt to a dynamic environment.
  • Responsible for creating calculated measures, partitions, and aggregations for cube performance.
  • Administered SQL Server with client/server tools, including SQL Server Enterprise Manager, Profiler, and Query Analyzer.
  • Created ad hoc reports using Report Builder and maintained Report Manager for SSRS.
  • Involved in development and modification of existing Financial Reports and Marketing Reports, which include Corporate Consolidation Reports, Financial Reports and Planning Reports for management.
  • Created a number of standard and complex reports to analyze data using slice-and-dice, drill-down, and drill-through in SSRS.
  • Was actively involved in the server migration and the fresh QlikView installation process.
  • Created reports using SQL Server Reporting Services (SSRS), which are then exported as PDF; report variables are passed from ASP.NET web pages.
  • Created stored procedures to build the fact tables in the data mart for multidimensional analysis using SSAS, and produced ad hoc, standard, and super-user reports using SSRS.
  • Involved in creating data sets in MS Reporting Services (SSRS) to call the stored procedures, and passed parameters to the data sets.
  • Created multidimensional expressions (MDX) queries for complex calculations and better cube performance.
  • Developed SAS programs to convert Oracle data for a Phase II study into SAS data sets using the SQL pass-through facility, and used the LIBNAME facility to import external files into the SAS environment.
  • Mapped raw data to SDTM data, generating SAS programs to create SDTM domains such as Adverse Events (AE), Demographics (DM), Concomitant Medications (CM), Medical History (MH), and Vital Signs (VS) (see the SDTM sketch after this list).
  • Generate tables, listings and graphs, including Safety and Efficacy profiles for FDA submissions.
  • Customized existing programs and macros as per the SAP and statisticians' requirements.
  • Responsible for providing SAS programming and analysis support for Phase II and Phase III clinical studies.
  • Performed validation programming using SAS macros by defining variables, merging data sets, creating derived data sets, and reviewing code.
  • Created Analysis datasets from raw datasets based on the necessity of new variables.
  • Created PMA documents for submission to the FDA.
  • Extensive use of DATA _NULL_ steps and the SUMMARY, MEANS, FREQ, and CHART procedures, ascertaining quality and standards for the code. Performed statistical work and developed SAS applications using PROC REG, PROC UNIVARIATE, PROC CORR, PROC ANOVA, PROC MEANS, etc.
  • Formatted HTML, RTF, and PDF reports using the SAS Output Delivery System (ODS).
  • Good understanding of creating CRT data sets, MedDRA and WHO drug dictionaries, and IND and ICH guidelines.
  • Generated tables, listings, and graphs, including patient demography and characteristics, adverse events, laboratory data, etc.
  • Developed SAS programs using SAS/BASE, SAS/SQL, SAS/STAT, and SAS/MACROS for statistical analysis and data display.
  • Produced RTF, MS Word, and HTML formatted files using SAS/ODS to produce ad hoc reports for presentation and further analysis.
  • Developed summary reports and graphs using SAS procedures such as PROC FREQ, PROC GPLOT, PROC SUMMARY, and PROC COMPARE.
  • Proactive monitoring of Database performance.
  • Responsible for creating an SSRS report to track all SQL Agent job runs and their status dynamically.
  • Responsible for meeting with the clients and gathering the business requirements.
  • Created user defined data types and added them to model database for future purpose.
  • Involved in Database design and worked with Production DBA.
  • Was the single point of contact between the business users and the technical team to make sure they got the desired results; also responsible for giving technical presentations to demo the applications to users.
  • Setting up database backup and recovery procedures for production, quality and development servers.
  • Monitored and scheduled new jobs on production environment.
  • Experienced in migrating data to Microsoft Parallel Data Warehouse (PDW) servers.
  • Mentored front-end users on how to use the data mart and the reports created from the cube.
  • Responsible for Creating Data Design Document (DDD) which documented everything in the project.
  • Successfully deployed SSIS packages into the production environment and used SSIS 2012 features such as project parameters for better integration.
  • Worked with the deployment, maintenance, usability, and portability of SSIS packages.
  • Implemented Event Handlers and Error Handling in SSIS packages and notified process results to various user communities.
  • Providing the reports according to the requirements of the business team with minimal risk.
  • Generated case report tabulation according to the standards for regulatory submissions.
  • Participated in project meetings for updating the research findings.
  • In the data manipulation and transformation process, extensively used the SQL, TRANSPOSE, COPY, and SORT procedures.
  • Analyzed and tracked test results using Quality Center 9.0.
  • Helped team members with technical issues.
  • Prepared various data models.
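
A minimal sketch of routing one report to the ODS destinations named above; the output file names are placeholders:

    /* Open HTML, PDF, and RTF destinations, run the report once so it  */
    /* is written to each, then close the destinations to finalize.     */
    ods html file='ae_summary.html';
    ods pdf  file='ae_summary.pdf';
    ods rtf  file='ae_summary.rtf';

    proc freq data=work.ae;
       tables aeterm / missing;
    run;

    ods rtf close;
    ods pdf close;
    ods html close;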
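
A minimal sketch of mapping raw demographics to an SDTM-style DM domain; the raw data set raw.demog and its variables are hypothetical, and a real study would follow its SDTM mapping specification:

    /* Derive the required identifiers, then keep and rename variables  */
    /* to their SDTM names (KEEP uses the pre-RENAME names).            */
    data dm;
       set raw.demog;
       length studyid domain usubjid $ 20;
       studyid = 'ABC-123';                  /* placeholder study id */
       domain  = 'DM';
       usubjid = catx('-', studyid, subjid); /* unique subject id    */
       keep studyid domain usubjid age sexcd;
       rename sexcd = sex;
    run;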

Environment: Teradata V2R5, Teradata Administrator, Teradata SQL Assistant, Teradata Manager, BTEQ, MLOAD, FLOAD, FASTEXPORT, ERwin Designer, Quality Center, UNIX, Windows 2000, shell scripts, SQL Server 2005, Python, SQL Server Management Studio, SQL Server Business Intelligence Development Studio, Visual Studio 2005.
