Sr. SAS Programmer Resume
NC
SUMMARY
- Experience in SAS/BASE, SAS/STAT, SAS/SQL, SAS/MACROS, SAS/GRAPH, SAS/ACCESS, SAS/ODS, SAS/EG, SAS/ETL Studio, and PROC SQL on Windows, UNIX, and mainframe platforms for manipulation and statistical analysis of large-scale datasets.
- Experience in data analysis and reporting in the health care, finance, and insurance sectors. SAS experience includes detailed knowledge of statistical analysis of financial, marketing, and health care data and the production of reports, tables, graphs, and listings.
- Experience working with NoSQL databases, including MongoDB and HBase.
- Experience working with the Cassandra NoSQL database.
- Experience in developing NoSQL databases using CRUD operations, sharding, indexing, and replication.
- Experienced in SAS reporting procedures such as PROC REPORT, PROC SQL, PROC FREQ, PROC MEANS, PROC RISK, PROC CHART, PROC PLOT, PROC UNIVARIATE, PROC TABULATE, PROC RANK, PROC TRANSPOSE, PROC IML, and PROC PRINT.
- Firm understanding of SAS Programming and List Processing. Utilize technical skills with SAS, SQL, and Database concepts to develop audits, test and control segmentation, vendor files, customer account updates, and offer tracking. Ability to turn business requirements into Technical Design specifications and documents, SAS Programs, and SQL Queries as appropriate. Capable of working with cross-functional teams to manage projects, determine prioritization, and create appropriate timelines.
- Extensive experience in working with SAS, SQL, R, Hadoop.
- Experience in Python for data analysis and processing.
- Hands on experience in using IBM mainframes for executing SAS programs.
- Used Job Control Language (JCL) to access the mainframe data files. Involved in moving data between production systems and across multiple platforms
- Strong emphasis on shell scripting and Autosys to integrate components into the existing framework.
- Good understanding of Agile and Scrum methodologies and sprint planning to achieve project goals effectively and efficiently.
- Strong skills in Ab Initio graph programming, Parameter Definition Language (PDL), and Data Manipulation Language (DML).
- Worked with DataStage Manager, Designer, and Director.
- Used Enterprise stages with databases such as Oracle, DB2, Teradata, and Netezza for optimal performance while loading or updating data.
- Experienced in integrating various data sources (DB2-UDB, SQL Server, Oracle PL/SQL, and MS Access) into the data staging area.
- Hands-on experience using Hive Query Language (HQL) to extract data from big data platforms.
- Working experience with big data and its architecture.
- Strong hands on experience in transforming data imported from disparate data sources into analysis data structures, using SAS FUNCTIONS, OPTIONS, ODS, ARRAY PROCESSING, MACRO FACILITY, and storing and managing data in SAS data files.
- Proficient in creating and maintaining large datasets using MERGE, MODIFY, UPDATE, SET, APPEND, and PROC SQL (a minimal sketch of this pattern follows this list).
- Experience in building data integration, workflow, and Extract, Transform, and Load (ETL) solutions for data warehousing using SQL Server Integration Services (SSIS).
- Created SSIS packages and was involved in package configurations and deployments between development, QA, and production servers.
- Created a new SSIS package and added a Data Flow and a Script Component configured as a source.
- Business intelligence project work; expert with SQL and PL/SQL; experienced with SQL Server databases.
- Experience in creating SSIS packages to automate the import and export of data to and from SQL Server 2005 using SSIS tools such as the Import and Export Wizard, package installation, and BIDS.
- With BIRT, added a rich variety of reports to our application.
- Source code deployment experience using tools such as CVS and ClearCase.
- Extensive experience in report generation using SQL Server Reporting Services (SSRS) from both relational databases and OLAP cubes.
- Experience in R programming for business analytics, building predictive models using linear and logistic regression, K-means cluster analysis, and time series forecasting methods.
- Implemented predictive modeling to evaluate risk factors using linear and non-linear regression, mixed models, logistic regression, and survival models.
- Expertise in installing, configuring, managing, and testing Hadoop components.
- Involved in managing and reviewing Hadoop log files.
- Used Teradata SQL Assistant and SQL Pass-Through Facility to build SQL queries.
- Good command in importing and exporting complex internal data to and from Microsoft Excel and Microsoft Access using PROC IMPORT and PROC EXPORT.
- Modified existing SAS programs and created new SAS programs using SAS macros to improve ease and speed of modification as well as consistency and efficiency of results.
- Good exposure to the SAS data warehousing (ETL: extraction, transformation, and loading) tool SAS Data Integration Studio 4.2/4.3.
- Gathered data sources such as QVD files, Excel, Oracle and Teradata data warehouses, binary QVW, inline, and .txt files when developing QlikView data models.
- Experience in building data warehouses/data marts by extracting and transforming data from Oracle/Teradata systems using Teradata Utilities.
- Expertise in performance tuning and deployment of QlikView applications.
- Experience in designing QlikView document/user settings and layouts to give clients a consistent, professional, and optimized look.
- Extensive experience using SAS Enterprise Guide to develop stored processes and perform statistical analysis.
- Assessed changes in business and IT risks, organizational risk culture, risk tolerance, and relevant IT-related business initiatives to establish acceptable risk levels.
- Involved in every phase of Software Development Life Cycle (SDLC) process in the areas of Analysis, Design, Development, Implementation and Testing of Software Applications.
- Expertise in extracting and loading data to and from various RDBMS including ORACLE, MS SQL SERVER, TERADATA, FLAT FILES, XML FILES and IBM DB2.
- Handled more than 10,000 transactions, with fraud protection for consumers and guaranteed payment for merchants.
- Very good experience in troubleshooting and debugging SAS programs.
- Coordinating with business to close the issues.
- Innovative, goal-oriented and creative approach to delivering results.
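The dataset-maintenance bullet above can be illustrated with a minimal SAS sketch; the library, dataset, and key names (work.accounts, work.updates, work.new_accounts, acct_id) are hypothetical placeholders, not code from any specific engagement.

```sas
/* Minimal sketch: maintaining a master dataset with DATA step and PROC SQL.
   Dataset and variable names (accounts, updates, acct_id) are hypothetical. */

/* Both inputs must be sorted by the key before MERGE or UPDATE. */
proc sort data=work.accounts; by acct_id; run;
proc sort data=work.updates;  by acct_id; run;

/* UPDATE applies non-missing values from the transaction dataset
   to the master dataset, one observation per key. */
data work.accounts;
    update work.accounts work.updates;
    by acct_id;
run;

/* Equivalent set-based maintenance with PROC SQL: append only the rows
   not already present in the master table. */
proc sql;
    insert into work.accounts
    select *
    from work.new_accounts
    where acct_id not in (select acct_id from work.accounts);
quit;
```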
TECHNICAL SKILLS:
SAS Tools: SAS/BASE, SAS/SQL, SAS/MACROS, SAS/ODS, SAS/STAT, SAS/ACCESS, SAS Enterprise Guide, SAS Data Integration (DI) Studio, SAS Enterprise Intelligence Platform 9.1, Hadoop, Forecast Studio.
Programming: SAS, R, SQL, PL/SQL, UNIX, Perl scripting 5.8, shell scripting, Python.
Databases: Oracle 11g/10g/9i, MS SQL Server 2008/2005, MS Access, DB2.
Package: MS Word, MS Excel, MS PowerPoint, MS Visio, MS Project.
Environment: Windows, UNIX, IBM Mainframes, Teradata, Sybase Management.
Development Tools: Excel, Word, Visio, Publisher, Lotus Notes, Outlook, SharePoint, InfoPath.
BI Tools: Tableau, Cognos, QlikView.
PROFESSIONAL EXPERIENCE
Confidential, NC
Sr. SAS Programmer
Responsibilities:
- Analyzed Wells Fargo Advance Visa Card portfolio data and generated analytical reports using SAS, MS SQL, and Tableau.
- Worked on code migration from PC SAS to SAS Grid (server).
- Upgraded SAS Grid Software from SAS 9.4M2 to 9.4M3 - Metadata, Compute and Web.
- Worked with users to migrate the processes currently running on 9.3 to 9.4 Grid Platform and resolve any issues that they have.
- Prepared model data and built various predictive models in R for business analytics, including time series forecasting, linear and logistic regression, and K-means clustering.
- Interacted with Business users for requirement gathering.
- Developed SAS macros for data cleaning, data mining, and reporting, and to support routine processing.
- Extracted data from different sources such as the Wells Fargo "HUB" system, where information is stored in Oracle tables, Excel, Access, and text/CSV files, using SAS/ACCESS and SAS/SQL in the SAS environment, and created SAS files.
- Gained good experience with NoSQL databases.
- Worked with the Apache Crunch library to write, test, and run Hadoop MapReduce pipeline jobs.
- Used SQL to download data and the SAS PROC SQL pass-through facility to connect to Oracle tables.
- Managed and scheduled batch jobs on a Hadoop cluster using Oozie.
- Involved in performance tuning of the DB2 queries used in batch jobs, which saved MIPS units and mainframe CPU utilization for our clients.
- Wrote new SAS programs from scratch, automated and fine-tuned existing SAS code using SAS macros and shell scripts to reduce manual dependencies, and generated HTML/PDF/RTF reports and graphs/charts.
- Used QlikView scripting and complex SQL coding for loading and transforming data.
- Knowledge and experience in creating different sheet objects like List boxes, Buttons, Multiboxes etc.
- Developed macros to provide custom functionality in the QlikView application.
- Validated the QlikView application at base level against QVDs and Excel test files.
- Implemented security and was involved in deployment of the QlikView application.
- Created user accounts for usage of application and Monitored QVWs.
- Merged SAS datasets using various SQL joins such as LEFT JOIN, RIGHT JOIN, INNER JOIN, and FULL JOIN, as well as SAS techniques such as SET, MERGE, and PROC APPEND.
- Extensively used Informatica PowerCenter for extracting, transforming, and loading data into different databases.
- Wrote PL/SQL stored procedures and triggers for implementing business rules and transformation.
- Created source and target definitions in the repository using Informatica Designer (Source Analyzer and Warehouse Designer).
- Implemented collaborative planning and forecasting.
- Developed new tools and reports to improve the forecasting method used and optimize the campaigns.
- Worked with my team to develop and build a new Long Range Forecast for business planning.
- Met with executives on a daily basis to review daily forecasts and long-range staffing requirements.
- Involved in requirement analysis and ETL design and development for extracting data from heterogeneous source systems such as Oracle, flat files, and XML files and loading it into the staging area and Enterprise Data Vault.
- Created technical specifications for the development of Informatica extraction, transformation, and loading (ETL) mappings to load data into classified tables.
- Worked in a highly structured division across Informatica developers, Data Architects (data modeling team), DBAs, Change Management, etc.
- Wrote Python scripts to parse XML documents and load the data into the database.
- Created analytical reports (periodic and ad-hoc) for different credit card portfolios using PROC REPORT, PROC TABULATE, PROC SUMMARY, and ODS statements, generating outputs in HTML, Excel, and RTF formats.
- Installed and configured Oozie on the Hadoop cluster.
- Compared the performance of previous mailings to the forecasts to build new models that help generate more effective and accurate forecasts.
- Developed SAS macro programs using macro facilities (%LET, CALL SYMPUT, %NRSTR, SYMGET, etc.) to automate business-as-usual (BAU) reporting processes and improve process efficiency (see the sketch after this list).
- Interacting with business stakeholders to understand their decision making parameters, analyzing the available data to build Tableau Dashboards and managing the delivery.
- Prepared summary reports across products (credit cards and loans such as home loans, vehicle loans, etc.).
- Used SAS Visual Analytics which renders reports and visualizations that can be easily shared with others using iPad and Android mobile devices.
- Worked on Mobile BI to provide various levels of details by using filtering, data brushing or section linking.
- Involved in generating proper data in excel for Tableau Dashboard.
- Designing and developing Tableau visualizations which include preparing Dashboards using calculations, parameters, calculated fields, groups, sets and hierarchies.
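A minimal sketch of the macro-driven BAU reporting pattern referenced above is shown below; the library, dataset, variable, and path names (rpt.portfolio, balance, /reports/bau) are hypothetical placeholders rather than the actual production code.

```sas
/* Minimal sketch of macro-driven BAU reporting.
   Library, dataset, and variable names (rpt.portfolio, balance) are hypothetical. */

%let outdir = /reports/bau;                      /* target directory for output */

/* Derive the reporting month once and store it in a macro variable. */
data _null_;
    call symputx('rptmon', put(intnx('month', today(), -1), yymmn6.));
run;

%macro bau_report(ds=, class=);
    ods html path="&outdir" file="bau_&class._&rptmon..html";
    title "BAU summary for &class - &rptmon";
    proc means data=&ds n sum mean maxdec=2;
        class &class;
        var balance;
    run;
    ods html close;
%mend bau_report;

/* One call per portfolio segment. */
%bau_report(ds=rpt.portfolio, class=region);
%bau_report(ds=rpt.portfolio, class=product);
```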
Environment: SAS 9.3, SAS Visual Analytics, SAS Mobile BI, R Programming, Python 2.7, Tableau, Hadoop MapReduce, Oozie, MS SQL Server, MS Word/Excel, NoSQL, Oracle, UNIX.
Confidential - Philadelphia, PA
Sr. SAS Developer/Data Analyst
Responsibilities:
- Developed PD scorecards using Credit Scoring nodes in Enterprise Miner; ran various Interactive Grouping nodes and inspected results.
- Built and published Tableau reports utilizing complex calculated fields, table calculations, filters, and parameters.
- Used Excel sheets, flat files, and CSV files to generate Tableau ad-hoc reports.
- Involved in creating and visualizing dashboards using Tableau Desktop
- Generated Tableau combination charts using Tableau visualization software
- Generated Tableau dashboards showing campaign effectiveness, comparing between campaigns, with quick filters and sets for ICD9 codes, connecting to an Oracle database.
- Generated ad-hoc Tableau reports on demand for providers with trend lines and options to select various geographic areas, segments, etc., to look at the performance of a particular segment.
- Generated Tableau Dashboards with Oracle Essbase multidimensional database.
- Used filters, quick filters, sets, parameters and calculated fields on Tableau reports.
- Performed incremental extracts for OLAP database.
- Created actual sales and target sales view with Trend lines, table calculations to see the difference between each quarter.
- Generated dashboards to compare expected weekly sales vs. projected sales; used dual axes for comparison.
- Generated Dashboards by joining multiple complex tables, generated dashboard for Finance team, provided security by using user filters.
- Preprocessed data for LGD modeling using linear regression and performed predictive modeling using various drivers like debt type, seniority, industry impact, firm impact, economy impact.
- Analyzed Cumulative accuracy profiles (CAP) for PD modeling, Exposure at Default (EAD) modeling for revolving credit.
- Accessed and analyzed the data based on the requirement using SQL, SAS and MS Excel.
- Maintained and enhanced existing SAS reporting programs for marketing campaigns. Involved in the code review to make sure the output is as we expected and the efficiency is met.
- Performed ETL tasks in extracting the data from databases using SAS procedures like SAS/SQL, SAS/Access and creation of datasets.
- Used proprietary statistics tool to test how the marketing campaigns fared.
- Performing data analysis, data migration, data preparation, graphical presentation, Statistical analysis, reporting, validation and Documentation.
- Analyzed large data sets consisting of millions of records using PROC SQL, PROC SORT, PROC PRINT, and PROC TABULATE.
- Actively involved in analyzing credit card campaign data using SAS Enterprise Guide, and analyzed competitors' data to forecast results for new credit card offers by region.
- Retrieved the original data and converted it into SAS-readable format; created SAS data sets and analyzed the data per the given requirements.
- Maintained and enhanced existing SAS reporting programs for marketing campaigns.
- Met with teams interactively to obtain the data required to generate reports per business requirements.
- Created UNIX Korn shell scripts to export environment variables and deleted old files.
- Developed and analyzed key performance indicators relevant to emerging business needs using PROC FREQ, PROC TABULATE, PROC MEANS, PROC UNIVARIATE, PROC GLM, PROC TTEST, PROC TRANSPOSE, and DATA _NULL_ (see the sketch after this list).
- Developed SAS programs using SAS/BASE, SAS/SQL, SAS/STAT, SAS/ACCESS, SAS/MACROS, etc., for statistical analysis and data displays.
- Generated reports to communicate key findings and articulate strategic implications using PROC PRINT, PROC REPORT, PROC TABULATE, PROC MEANS, PROC GPLOT, and PROC GCHART.
- Built cubes using Teradata and Netezza tables and created the required intermediate datasets, which were joined using Cube Studio.
- Built summary reports after identifying customers, occupancy period and revenue generated using PROC SUMMARY, MEANS and FREQ. Updated Revenue item table on a daily basis.
- Generated listings and reports from SAS programs using MACROS, ODS and PROC TEMPLATE/REPORT/TABULATE and using Word, Excel and PowerPoint as well
- Used the JDBC for data retrieval from the database for various inquiries. Performed purification of the application database entries using Oracle 10g.
- Documented purpose and methods used in each SAS program.
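A minimal sketch of the KPI-style summary referenced in the PROC FREQ / PROC MEANS bullet above is given below; the campaign dataset and variable names (mkt.campaign_resp, segment, responded, spend) are hypothetical stand-ins.

```sas
/* Minimal sketch of a campaign KPI summary.
   Dataset and variables (mkt.campaign_resp, segment, responded, spend) are hypothetical. */

/* Response-rate KPI by segment. */
proc freq data=mkt.campaign_resp;
    tables segment*responded / nopercent norow;
run;

/* Spend distribution per segment, written to a report-ready dataset. */
proc means data=mkt.campaign_resp n mean median min max maxdec=2;
    class segment;
    var spend;
    output out=work.kpi_spend mean=avg_spend median=med_spend;
run;

/* Render the KPI table for distribution. */
proc report data=work.kpi_spend nowd;
    where _type_ = 1;                 /* keep per-segment rows only */
    columns segment avg_spend med_spend;
    define segment   / group 'Segment';
    define avg_spend / analysis 'Average spend' format=dollar12.2;
    define med_spend / analysis 'Median spend'  format=dollar12.2;
run;
```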
Environment: SAS 9.3, UNIX (AIX 5.2), XP, Base SAS v9, VI Editor, SAS Enterprise Guide, SAS/MACRO, SAS/GRAPH, SAS/STAT, MS Excel, Oracle 10g.
Confidential, Los Angeles, CA
SAS Programmer
Responsibilities:
- Worked closely with data modeling team and management like Marketing managers, Campaign managers, financial managers etc., to analyze customer related data, generate custom reports, tables and listings.
- Developed SAS code using SAS/BASE and SAS/Macro to clean invalid data from the database while reading data from Oracle, Teradata, Excel, XML, and flat files into SAS.
- Extracted and transported data from DB2 and Oracle tables in UNIX; extracted huge volumes of data from legacy systems and uploaded them into Oracle using SQL*Loader.
- Automated SAS jobs running on a daily/weekly/monthly basis using SAS/BI, SAS macros, and UNIX shell scripting.
- Extensive usage of web services from Ab Initio.
- Worked closely with the business on Ab Initio GDE and ACE/BRE (Business Rules Engine) setup, design, coding, and training; used Metadata Hub for files, dml's, xfr's, etc.
- Coordinated with Ab Initio support to resolve peculiar issues encountered.
- Extensively used SAS procedures such as PRINT, REPORT, TABULATE, FREQ, MEANS, SUMMARY, and TRANSPOSE, along with DATA _NULL_, for producing ad-hoc and customized reports and external files.
- Validated Autosys BOX and CMD jobs deployed to schedule Ab Initio graphs and scripts to automate jobs, and coordinated with the Autosys Admin team to ensure job dependencies were set as expected.
- Extensive use of SQL, SAS/ACCESS, and SAS/CONNECT to connect to various development and production databases (Oracle, DB2 on UNIX); also worked on mainframes (MVS/JCL) to read, modify, and create new SAS datasets per business needs.
- Generated custom reports with line sizes, page breaks, header messages, and bottom messages using PROC SORT, PROC PRINT, and PROC REPORT (see the sketch after this list).
- Debugged SAS programs using DATA _NULL_, PUT statements, and the DATA step debugger.
- Involved in testing the XML files and checked whether data is parsed and loaded to staging tables.
- Responsible for call center efficiency and productivity through forecasting and scheduling; provided ad-hoc reporting upon request.
- Used PROC SQL, PROC DBLOAD, PROC TABULATE, PROC REPORT, PROC SORT, PROC FREQ, PROC TRANSPOSE, PROC SUMMARY, PROC COMPARE, PROC MEANS, PROC ANOVA, PROC UNIVARIATE, and DATA _NULL_.
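A minimal sketch of the custom-report formatting pattern mentioned above (line size, page breaks, header and footer text) follows; the dataset cust.accounts and its columns are hypothetical placeholders.

```sas
/* Minimal sketch of a custom listing with line size, page breaks,
   and header/footer text. Dataset cust.accounts and its columns are hypothetical. */

options linesize=132 pagesize=60 nodate nonumber;

title1 'Customer Account Listing';
title2 'Confidential - internal use only';
footnote1 'Source: monthly extract';

proc sort data=cust.accounts out=work.accounts_srt;
    by region acct_id;
run;

proc report data=work.accounts_srt nowd headline;
    columns region acct_id balance;
    define region  / order 'Region';
    define acct_id / display 'Account';
    define balance / analysis sum 'Balance' format=comma12.2;
    break after region / page summarize;   /* page break and subtotal per region */
run;

title; footnote;                            /* clear titles and footnotes */
```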
Environment: SAS Enterprise Guide, SAS/Base 9, SAS/Macros, SAS/SQL, SAS/ETL, SAS/Graph, SAS/ODS, SAS/Stat, SAS/Connect, Excel, Ab Initio, MS Access, Oracle, Teradata, DB2, SQL*Loader, Windows, MS Office, UNIX, IBM S/390, MVS, JCL, and XML.
Confidential, Rockville, MD
SAS Programmer
Responsibilities:
- Worked with business users to design and develop several ad-hoc reports and information maps using Proc Report and Proc Tabulate.
- The reports created include Monthly summary reports as well as detailed weekly reports.
- Connected to the DB2 database, extracted the relevant data (using LIBNAME or the pass-through facility), and then processed the data using business logic to produce reports. Made heavy use of PROC SQL, PROC REPORT, and ODS for producing these complex reports.
- Developed an online claims reporting system to retrieve client records and create reports using PROC TABULATE, PROC FREQ, PROC MEANS, and PROC REPORT.
- Created Excel pivot table reports using cycle time data. These reports consist of benefit-level and claims-level data and are very useful for gaining greater insight into further plan design.
- Extensively used SAS Data Steps, Statistical models, and Flowchart Design, Listing, Graphing, Summarizing and Reporting procedures.
- Performed variety of tasks including data extraction, manipulation, Analysis using claims data.
- Used SAS Enterprise Guide to access data, to manage data, to create reports and to validate data and data mining.
- Used SQL Pass through Facility to create and update Teradata and Oracle tables.
- Tested and executed marketing campaigns using SAS Marketing Automation.
- Used EDW processes for replacing the existing data, updating and appending the data and this data is used for generating reports as per the requirement.
- Created various transformation procedures by using SAS ETL and SAS Enterprise guide.
- SAS procedures used include PROC APPEND, PROC IMPORT, PROC EXPORT, PROC TRANSPOSE, and PROC DATASETS, along with macros.
- Collected and maintained the customer information through CIS.
- Developed and deployed SAS OLAP cubes for browsing of financial data via the SAS Information Delivery Portal.
- Proficient in sorting, merging and using different table lookup techniques to get the required information to generate the report.
- Handled the tasks of writing reports and manipulating data by using Unix platforms
- Produced reports and complex data analyses using SAS, including SAS/STAT; extracted data from MVS and Oracle files using SAS views, PROC SQL, and SQL.
- Debugged SAS programs using DATA _NULL_, PUT statements, and the DATA step debugger.
- Generated automated graphs for monthly run using Proc Gplot. Exported these graphs to Pdf File using SAS/ODS.
- Customizing the existing Macros according to need and requirement, testing and debugging the Macros and creating more complex and reusable Macros.
- Developed UNIX shell scripts to run the reports as part of the daily schedule.
- Exported SAS reports to Excel using ODS TAGSETS.EXCELXP to produce multi-sheet output, and used several style options (see the sketch after this list).
- Produced statistical reports for various business needs and expertise in data preparation, data cleaning, analysis, and reporting.
- Expert in preparing Analysis Datasets, Tables, Listings, and Graphs according to the Business Requirement.
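A minimal sketch of the multi-sheet ExcelXP export referenced above is shown below; the dataset claims.summary, the BY variable plan_type, and the output path are hypothetical placeholders.

```sas
/* Minimal sketch of multi-sheet Excel output with ODS TAGSETS.EXCELXP.
   Dataset claims.summary and variable plan_type are hypothetical. */

ods _all_ close;
ods tagsets.excelxp file='/reports/claims_summary.xml'
    options(sheet_interval='bygroup'          /* one worksheet per BY group */
            sheet_label='Plan'                /* worksheet name prefix      */
            embedded_titles='yes');

proc sort data=claims.summary out=work.summary_srt;
    by plan_type;
run;

title 'Monthly claims summary';
proc report data=work.summary_srt nowd;
    by plan_type;
    columns claim_month claim_count paid_amt;
    define claim_month / order 'Month';
    define claim_count / analysis sum 'Claims';
    define paid_amt    / analysis sum 'Paid amount' format=dollar14.2;
run;

ods tagsets.excelxp close;
ods listing;
```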
Environment: SAS 9.2, SAS/Base, SAS/Connect, SAS Macros, SAS/Access, SAS/ODS, SAS Marketing Automation, SAS/ETL, SAS/SQL, Oracle, Windows, MS Office, HTML, SAS Enterprise Guide, and UNIX.
Confidential
SAS Developer/Data Analyst
Responsibilities:
- Responsible for credit card account data extraction and manipulation, and for statistical data analysis involving multiple quantitative methods to evaluate fraud management business problems.
- Independently handled responsibilities for extracting internal/external data, data cleaning and validation, analysis, and report generation.
- Rewrote existing Python/Django/Java modules to deliver data in specific formats.
- Used Django database APIs to access database objects.
- Wrote Python scripts to parse XML documents and load the data into the database.
- Generated property lists for every application dynamically using Python.
- Researched existing consumer, small Business Card and Mortgage defect identification logic to facilitate automation of defect and dashboard reporting.
- Worked with different data management tasks such as cleaning and querying.
- Used Teradata/SQL and MS Office (especially advanced in Access, Excel VBA/pivot tables, and PowerPoint), with exposure to Business Objects.
- Used PROC GCHART, PROC GPLOT, and PROC G3D to generate graphs such as pie charts, histograms, bar graphs, and scatter plots from processed data.
- Developed SAS programs for statistical analysis and data displays, working with various SAS products such as SAS/BASE, SAS/SQL, SAS/STAT, SAS/ACCESS, and SAS/MACROS to develop solutions.
- Involved in updating and maintaining the databases like Sybase and Oracle using SQL.
- Used Dynamic Data Exchange (DDE) to acquire data from Excel spread sheets.
- Converted MS Word documents, MS Excel files, and SQL tables into SAS datasets.
- Used SQL and PROC SQL pass through to work with Oracle, DB2.
- Used SAS/V8 on the mainframe in conjunction with Teradata/Fastload to create and load Teradata tables.
- Customized existing SAS programs and created new programs using SAS/Macros to improve the consistency of the results.
- Improved the efficiency of marketing activity through targeting, planning, and campaign evaluation.
- Analyzed customer data to evaluate campaign activity and used this insight to maximize targeting and overall campaign effectiveness.
- Created process flows for the SAS jobs using MS Visio, prepared design docs, and sent them to the testing team.
- Designed unit and system test cases for testing the SAS jobs.
- Used SAS/GRAPH and SAS/STAT to generate plots of variables and regression lines.
- Bug fixing of SAS programs in production.
- Joined tables using SQL joins.
- Involved in code changes for SAS programs and UNIX shell scripts.
- Wrote BASE SAS and Macro code to validate the data sets before running the statistical models.
- Extracted data from several UNIX Oracle databases via SAS SQL pass-through queries and uploaded it to the IBM mainframe through SAS/CONNECT (see the sketch after this list).
- Developed and implemented SAS jobs in a UNIX environment to generate required reports for the business through SAS/BASE.
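A minimal, simplified sketch of the Oracle pass-through extract and SAS/CONNECT upload pattern referenced above follows; the credentials, connect path, remote session name, table, and library are hypothetical placeholders (real mainframe sign-on typically also needs a connection script).

```sas
/* Minimal sketch: SQL pass-through extract from Oracle, then upload of the
   resulting dataset to a remote SAS session with SAS/CONNECT.
   Credentials, paths, and table names are hypothetical placeholders. */

proc sql;
    connect to oracle (user=xxxx password=xxxx path='orcl');
    create table work.accounts as
    select * from connection to oracle
        (select account_id, open_date, balance
         from   cust_accounts
         where  status = 'ACTIVE');
    disconnect from oracle;
quit;

/* Sign on to the remote session and upload the extract. */
options comamid=tcp remote=mvs01;        /* mvs01 is a hypothetical session id */
signon;

rsubmit;
    libname perm 'REMOTE.SAS.LIBRARY';   /* hypothetical remote library */
    proc upload data=work.accounts out=perm.accounts;
    run;
endrsubmit;

signoff;
```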
Environment: SAS/BASE, SAS/GRAPHS, SAS/STAT, SAS PROC Reports, Teradata, SQL, Excel, Oracle, DB2, SQL Pass-Through, MS Visio.
Confidential
SQL Developer
Responsibilities:
- Involved in writing T-SQL programming for implementing stored procedures and functions for different tasks.
- Responsible for creating databases, tables, clustered/non-clustered indexes, unique/check constraints, views, stored procedures, triggers, and rules.
- Creating and maintaining UI prototypes and specifications.
- Collaborating with Product Managers and Software Engineers to provide guidelines on solid UI design.
- Keeping up to date with the latest industry trends in UI design and usability.
- Translating market and product requirements into UI designs in the form of Conceptual models, Wireframes and prototypes.
- Created new datasets from raw data files using import techniques and modified existing datasets using SET, MERGE, SORT, UPDATE, and conditional statements; involved in UNIX shell programming using Bash and set up crontab jobs for SAS application batch runs (see the sketch after this list).
- Used HTML, XML, AJAX, JavaScript, CSS and pure CSS layouts.
- Designed the user interfaces using Web Server Controls which are built on ASP.NET Server Controls.
- Developed Business Logic Component using Web Services, WSDL.
- Implemented SOAP (Simple Object Access Protocol) for communication of application.
- Used LINQ to Objects for WCF service call.
- Used SSRS for SQL reporting services.
- Used AJAX for rich user experience in designing the screens.
- GUI was developed using C#.NET and ASP.NET.
- Used .NET Framework 3.0 throughout the project and converted existing code from 1.1 to 3.0.
- Modifications were made to the existing GUI using WPF for better look and feel.
- Consumed Web services using WCF and WPF for online transactions using C# and exposed them through HTTP.
- Optimized the performance of queries by modifying the existing index system and rebuilding indexes.
- Created SSIS packages to transfer data from Oracle to SQL using different SSIS components and used configuration files and variables for production deployment.
- Developed SSIS packages to consolidate data from various data sources and to load data from various types of source files such as Excel, Access, flat files, and CSVs, and converted Excel to SQL reporting.
- Involved in daily loads (full and incremental) into staging and ODS areas, troubleshooting process issues and errors.
- Used various transformations in SSIS data flow and control flow, using For Loop containers, and created Fuzzy Lookups for several staging databases.
- Used SSIS and T-SQL stored procedures to transfer data from OLTP databases to the staging area and finally into data marts, and performed actions in XML.
- Wrote complex queries, triggers, stored procedures, views, and user-defined functions.
- Design and implement database logical and physical schemas.
- Responsible for database logical and physical schemas.
- Maintenance of development, system test, assembly test, and integration servers and databases.
- Query tuning, Index tuning and performance tuning.
- Creation and maintenance of database objects.
- Users and User group creation and maintenance.
- Created custom and standard reports using Crystal Reports.
- Developed parameterized Crystal Reports and enhanced existing reports for presentations using cross-tab reports, running totals, and alerting.
- Created and modified stored procedures to Create Reports.
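A minimal sketch of the raw-file import and dataset-maintenance pattern mentioned earlier in this list follows; the file path, dataset names, variables, and the crontab entry shown in the closing comment are all hypothetical placeholders.

```sas
/* Minimal sketch: building a dataset from a raw delimited file and folding it
   into an existing dataset. Paths, dataset names, and variables are hypothetical. */

data work.daily_txn;
    infile '/data/incoming/daily_txn.csv' dsd dlm=',' firstobs=2 truncover;
    input txn_id : $12. acct_id : $10. txn_date : yymmdd10. amount;
    format txn_date date9.;
    if amount > 0;                      /* conditional filter on input */
run;

proc sort data=work.daily_txn;  by acct_id txn_id; run;
proc sort data=hist.txn_master; by acct_id txn_id; run;

/* Merge the daily file into the master, keeping the newest values. */
data hist.txn_master;
    merge hist.txn_master work.daily_txn;
    by acct_id txn_id;
run;

/* The job itself was scheduled outside SAS; an illustrative crontab entry
   for a nightly batch run might look like:
   0 2 * * * /apps/sas/run_daily_txn.sh                                   */
```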
Environment: MS SQL Server 2005 and 2000, MS SQL Server Reporting Services, MS SQL Server Integration Services, ProClarity, MS SQL Server Analysis Services 2000, C#, ASP.NET, VB.NET, OLAP, UML, Erwin, DTS/SSIS, BIDS.