Sr. ETL/Talend Developer Resume
Austin, TX
PROFESSIONAL SUMMARY:
- 16+ years of IT industry experience, with a wide range of progressive experience providing product specifications, design, analysis, development, documentation, coding, and implementation of business technology solutions for data warehousing applications.
- Extensive experience in development and maintenance in a corporate wide ETL solution using SQL, PL/SQL, TALEND 4.x/5.x/6.x on UNIX and Windows platforms.
- Strong experience with Talend tools (Data Integration and Big Data), including Data Mapper, Joblets, metadata, and Talend components and jobs.
- Extensive experience integrating heterogeneous data sources such as SQL Server, Oracle, flat files, and Excel files, and loading the data into data warehouses and data marts using Talend Studio.
- Experience with databases such as MySQL and Oracle, including on AWS RDS.
- Experienced with Talend Data Fabric ETL components, including context variables and the MySQL, Oracle, and Hive database components.
- Experience in scheduling tools Autosys, Control M & Job Conductor (Talend Admin Console).
- Good experience with Big Data, Hadoop, HDFS, Map Reduce and Hadoop Ecosystem (Pig & Hive) technologies.
- Extensively created mappings in Talend using tMap, tJoin, tReplicate, tConvertType, tFlowMeter, tLogCatcher, tNormalize, tDenormalize, tJava, tAggregateRow, tWarn, tMysqlScd, tFilter, tGlobalmap, tDie, etc.
- Excellent experience with NoSQL databases such as HBase and Cassandra.
- Excellent understanding of Hadoop architecture, the Hadoop Distributed File System, and its APIs.
- Extensive knowledge of business processes in the Health Care, Manufacturing, Mortgage, Financial, Retail, and Insurance sectors.
- Strong skills in SQL and PL/SQL, backend programming, creating database objects like Stored Procedures, Functions, Cursors, Triggers, and Packages.
- Experience in AWS S3, EC2, SNS, SQS setup, Lambda, RDS (MySQL) and Redshift cluster configuration.
- Experienced in Waterfall, Agile/Scrum Development.
- Good knowledge in implementing various data processing techniques using Pig and MapReduce for handling the data and formatting it as required.
- Extensively used ETL methodology for data migration, data profiling, extraction, transformation, and loading using Talend, and designed data conversions from a wide variety of source systems such as SQL Server, Oracle, and DB2, and non-relational sources such as XML, flat files, and mainframe files.
- Well versed in developing database objects such as packages, stored procedures, functions, triggers, tables, indexes, constraints, and views in Oracle 11g/10g.
- Hands-on experience running Hadoop streaming jobs to process terabytes of XML-format data using Flume and Kafka.
- Hands-on proficiency in one or more scripting languages (e.g., Java, Python, Scala, R, shell scripting).
- Experienced in developing data ingestion jobs in tools such as Talend to acquire, stage, and aggregate data in technologies like Hive, Spark, HDFS, and Greenplum.
- Worked in designing and developing the Logical and physical model using Data modeling tool (ERWIN).
- Experienced in Code Migration, Version control, scheduling tools, Auditing, shared folders and Data Cleansing in various ETL tools.
- Good communication and interpersonal skills, ability to learn quickly, with good analytical reasoning and adaptive to new and challenging technological environment.
- Strong Team working spirit, relationship management and presentation skills.
- Expertise in client-server application development using MS SQL Server …, Oracle …, PL/SQL, SQL*Plus, TOAD, and SQL*Loader. Worked with various source systems such as relational sources, flat files, XML, mainframe COBOL and VSAM files, SAP sources/targets, etc.
- Work hands-on with integration processes for the Enterprise Data Warehouse (EDW).
- Knowledge in writing, testing and implementation of the Stored Procedures, Functions and triggers using Oracle PL/SQL & T-SQL, Teradata data warehouse using BTEQ, COMPRESSION techniques, FASTEXPORT, MULTI LOAD, TPump and FASTLOAD scripts.
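The data profiling and validation work described above can be illustrated with a minimal sketch. This is a hypothetical example (column names and sample data are invented, not from any client engagement) of the kind of null/distinct-value profile run on a pipe-delimited extract before loading it:

```python
import csv
import io

def profile_rows(rows, columns):
    """Compute per-column null counts and distinct-value counts --
    a basic pre-load profiling check on a flat-file extract."""
    stats = {c: {"nulls": 0, "distinct": set()} for c in columns}
    total = 0
    for row in rows:
        total += 1
        for c in columns:
            value = (row.get(c) or "").strip()
            if value == "":
                stats[c]["nulls"] += 1
            else:
                stats[c]["distinct"].add(value)
    return {c: {"nulls": s["nulls"], "distinct": len(s["distinct"]), "rows": total}
            for c, s in stats.items()}

# Hypothetical pipe-delimited extract, as a source system might produce
raw = "cust_id|state\n101|TX\n102|\n103|TX\n"
reader = csv.DictReader(io.StringIO(raw), delimiter="|")
report = profile_rows(reader, ["cust_id", "state"])
```

In practice a profile like this would also cover data types, lengths, and referential checks; the sketch shows only the core counting pattern.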
TECHNICAL SKILLS:
ETL/Middleware Tools: Talend 5.5/5.6/6.2, Informatica PowerCenter 9.5.1/9.1.1/8.6.1/7.1.1, BODS.
Data Modeling: Dimensional Data Modeling, Star Join Schema Modeling, Snow-Flake Modeling, Fact and Dimension tables, Physical and Logical Data Modeling.
Business Intelligence Tools: Business Objects 4.2, PBI, Tableau.
RDBMS: Oracle 11g/10g/9i, Netezza, Teradata, Redshift, MS SQL Server 2014/2008/2005/2000, DB2, MySQL, MS Access, HANA.
Programming Skills: SQL, Oracle PL/SQL, Unix Shell Scripting, HTML, DHTML, XML, Java, .Net, Netezza.
Modeling Tool: Erwin 4.1/5.0, MS Visio.
Tools: TOAD, SQL Plus, SQL*Loader, Quality Assurance, Soap UI, Fish eye, Subversion, Share Point, IP switch user, Teradata SQL Assistant.
Operating Systems: Windows 8/7/XP/NT/2x, Unix-AIX, Sun Solaris 8.0/9.0.
PROFESSIONAL EXPERIENCE:
Confidential, Austin, TX
Sr. ETL/Talend Developer
Responsibilities:
- Participated in Requirement gathering, Business Analysis, User meetings and translating user inputs into ETL mapping documents.
- Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
- Involved in building the data ingestion architecture and source-to-target mappings to load data into the data warehouse.
- Extensively leveraged Talend Big Data components (tHDFSOutput, tPigmap, tHive, tHDFSCon) for data ingestion and data curation from several heterogeneous data sources.
- Worked with Data mapping team to understand the source to target mapping rules.
- Prepared both High level and Low-level mapping documents.
- Analyzed the requirements and framed the business logic and implemented it using Talend.
- Involved in ETL design and documentation.
- Developed Talend jobs from the mapping documents and loaded the data into the warehouse.
- Involved in end-to-end Testing of Talend jobs.
- Analyzed and performed data integration using Talend open integration suite.
- Loaded data into the Netezza database using the NZLOAD utility.
- Experience working with large data warehouses, mapping and extracting data from legacy systems and Redshift/SQL Server … UDB databases.
- Worked on the design, development and testing of Talend mappings.
- Wrote complex SQL queries to take data from various sources and integrated it with Talend.
- Worked on Context variables and defined contexts for database connections, file paths for easily migrating to different environments in a project.
- Developed big data ingestion jobs in Talend for relational, big data, streaming, IoT, flat file, JSON, API, and many other data sources.
- Involved in loading data into Netezza from legacy systems and flat files using UNIX scripts. Worked on performance tuning of Netezza queries with a proper understanding of joins and distribution.
- Created ETL job infrastructure using Talend Open Studio.
- Worked on Talend components like tReplace, tmap, tsort and tFilterColumn, tFilterRow etc.
- Used Database components like tMSSQLInput, tOracleOutput etc.
- Worked with various File components like tFileCopy, tFileCompare, tFileExist.
- Developed standards for ETL framework for the ease of reusing similar logic across the board.
- Analyzed requirements, created designs, and delivered documented solutions that adhere to the prescribed Agile development methodology and tools.
- Developed mappings to extract data from sources such as DB2 and XML files and load it into the Data Mart.
- Created complex mappings by using different transformations like Filter, Router, lookups, Stored procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Mart.
- Involved in designing Logical/Physical Data Models, reverse engineering for the entire subject across the schema.
- Scheduled and automated ETL processes using Autosys and TAC.
- Scheduled the workflows using Shell script.
- Created Talend development standards documenting general guidelines for Talend developers, naming conventions to be used in transformations, and development and production environment structures.
- Troubleshot databases, Joblets, mappings, sources, and targets to identify bottlenecks and improve performance.
- Rigorously involved in data cleansing and data validation to identify and correct corrupted data.
- Migrated Talend mappings/Job /Joblets from Development to Test and to production environment.
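Talend context groups (used above for database connections and file paths) are configured in the Studio rather than in code, but the per-environment pattern they implement can be sketched. This is a hypothetical illustration; the host names, database names, and paths are invented:

```python
# Hypothetical per-environment "contexts", analogous to Talend context
# groups selected at job launch (e.g. --context=PROD).
CONTEXTS = {
    "DEV":  {"db_host": "dev-db.internal",  "db_name": "edw_dev",  "file_dir": "/data/dev/in"},
    "TEST": {"db_host": "test-db.internal", "db_name": "edw_test", "file_dir": "/data/test/in"},
    "PROD": {"db_host": "prod-db.internal", "db_name": "edw",      "file_dir": "/data/prod/in"},
}

def resolve_context(env, overrides=None):
    """Pick the context for an environment, allowing command-line style
    overrides, so the same job migrates across environments unchanged."""
    ctx = dict(CONTEXTS[env])       # copy so overrides don't mutate the group
    ctx.update(overrides or {})
    return ctx

prod_ctx = resolve_context("PROD")
jdbc_url = f"jdbc:netezza://{prod_ctx['db_host']}/{prod_ctx['db_name']}"
```

The point of the pattern is that jobs reference only context keys, never literal hosts or paths, so promotion from Development to Test to Production needs no code change.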
Environment: Talend 6.x, XML files, DB2, Oracle 11g, Netezza 4.2, SQL Server 2008, SQL, MS Excel, MS Access, UNIX Shell Scripts, Talend Administrator Console, Cassandra, Oracle, Jira, SVN, Quality Center, Agile Methodology, TOAD, Autosys.
Confidential, St. Paul, MN
Sr. Talend / ETL Developer
Responsibilities:
- Worked on SSAS in creating data sources, data source views, named queries, calculated columns, cubes, dimensions, roles and deploying of analysis services projects.
- SSAS Cube Analysis using MS-Excel and PowerPivot.
- Implemented SQL Server Analysis Services (SSAS) OLAP Cubes with Dimensional Data Modeling Star and Snow Flakes Schema.
- Developed standards for ETL framework for the ease of reusing similar logic across the board.
- Analyzed requirements, created designs, and delivered documented solutions that adhere to the prescribed Agile development methodology and tools.
- Responsible for creating fact, lookup, dimension, staging tables and other database objects like views, stored procedure, function, indexes and constraints.
- Developed complex Talend ETL jobs to migrate the data from flat files to database.
- Implemented custom error handling in Talend jobs and worked on different methods of logging.
- Followed the organization defined Naming conventions for naming the Flat file structure, Talend Jobs and daily batches for executing the Talend Jobs.
- Applied ETL methodology to support data extraction, transformation, and loading in a corporate-wide ETL solution using Talend Open Studio for Data Integration 5.6. Worked on real-time Big Data integration projects leveraging Talend Data Integration components.
- Analyzed and performed data integration using Talend open integration suite.
- Wrote complex SQL queries to ingest data from various sources and integrated it with Talend.
- Worked on Talend Administration Console (TAC) for scheduling jobs and adding users.
- Worked on Context variables and defined contexts for database connections, file paths for easily migrating to different environments in a project.
- Developed mappings to extract data from sources such as DB2 and XML files and load it into the Data Mart.
- Created complex mappings by using different transformations like Filter, Router, lookups, Stored procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Mart.
- Involved in designing Logical/Physical Data Models, reverse engineering for the entire subject across the schema.
- Scheduled and automated ETL processes using Autosys and TAC.
- Scheduled the workflows using Shell script.
- Used the most common Talend components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput & tHashOutput, and many more).
- Created many complex ETL jobs for data exchange from and to Database Server and various other systems including RDBMS, XML, CSV, and Flat file structures.
- Developed stored procedure to automate the testing process to ease QA efforts and reduced the test timelines for data comparison on tables.
- Automated SFTP process by exchanging SSH keys between UNIX servers.
- Worked extensively on the Talend Admin Console and scheduled jobs in Job Conductor.
- Involved in production deployment activities; created the deployment guide for migrating the code to production and prepared production run books.
- Created Talend development standards documenting general guidelines for Talend developers, naming conventions to be used in transformations, and development and production environment structures.
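The custom error handling and logging mentioned above is built in Talend with reject links and components like tLogCatcher/tDie; a rough Python analogue of a main flow with a reject flow (the record shape and transform are hypothetical) is:

```python
def run_with_rejects(records, transform):
    """Route rows that fail transformation to a reject flow with the
    error captured, instead of aborting the whole load -- analogous to
    tMap reject links with tLogCatcher collecting the errors."""
    loaded, rejects = [], []
    for rec in records:
        try:
            loaded.append(transform(rec))
        except (KeyError, ValueError) as exc:
            rejects.append({"record": rec, "error": str(exc)})
    return loaded, rejects

# Hypothetical transform: amounts must parse as numbers
def to_target(rec):
    return {"id": rec["id"], "amount": float(rec["amount"])}

good, bad = run_with_rejects(
    [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "n/a"}],
    to_target,
)
```

Keeping the reject records together with the captured error message is what makes the downstream logging and data-quality reporting possible.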
Environment: Talend 5.x/5.6, XML files, DB2, Oracle 11g, SQL Server 2008, SQL, MS Excel, MS Access, UNIX Shell Scripts, TOAD, Autosys.
Confidential, Milwaukee, WI
Sr. Talend / ETL Developer
Responsibilities:
- Implemented File Transfer Protocol operations using Talend Studio to transfer files in between network folders.
- Experienced in fixing errors by using debug mode of Talend.
- Created complex mappings using tHashOutput, tMap, tHashInput, tDenormalize, tUniqueRow, tPivotToColumnsDelimited, tNormalize, etc.
- Schedule the Talend jobs with Talend Admin Console, setting up best practices and migration strategy.
- Used components like tJoin, tMap, tFilterRow, tAggregateRow, tSortRow, Target Connections and Source Connections.
- Mapped source files and generated target files in multiple formats such as XML, Excel, and CSV.
- Transformed data and reports retrieved from various sources and generated derived fields.
- Reviewed the design and requirements documents with architects and business analysts to finalize the design.
- Created WSDL data services using Talend ESB.
- Created Rest Services using tRESTRequest and tRESTResponse components.
- Used tESBConsumer component to call a method from invoked Web Service.
- Implemented Java functionality using the tJava and tJavaFlex components.
- Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes to improve performance.
- Attended technical review meetings.
- Implemented Star Schema for De-normalizing data for faster data retrieval for Online Systems.
- Involved in unit testing and system testing and preparing Unit Test Plan (UTP) and System Test Plan (STP) documents.
- Responsible for monitoring all scheduled, running, completed, and failed jobs. Involved in debugging failed jobs using the debugger to validate the jobs and gather troubleshooting information about data and error conditions.
- Performed metadata validation, reconciliation and appropriate error handling in ETL processes.
- Developed various reusable jobs and used as sub-jobs in other jobs.
- Used context variables to increase the efficiency of the jobs.
- Extensive use of SQL commands with TOAD environment to create Target tables.
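The metadata validation and reconciliation mentioned above can be sketched as a source-versus-target row-count comparison after a load. The table names and counts here are invented for illustration; in practice each side's counts would come from a `SELECT COUNT(*)` per table:

```python
def reconcile(source_counts, target_counts):
    """Compare per-table row counts between source and target and
    report mismatches -- a basic post-load reconciliation check."""
    mismatches = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table, 0)
        if src != tgt:
            mismatches[table] = {"source": src, "target": tgt, "diff": src - tgt}
    return mismatches

# Hypothetical counts pulled from each database after the nightly load
issues = reconcile({"CUSTOMER": 1200, "ORDERS": 5000},
                   {"CUSTOMER": 1200, "ORDERS": 4998})
```

A non-empty result would typically fail the batch or raise an alert, so dropped or duplicated rows are caught before downstream reporting runs.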
Environment: Talend 5.1, Oracle 11g, DB2, Sybase, MS Excel, MS Access, TOAD, SQL, UNIX, M.
Embedded Analytics with BI Architect
Confidential, Mountain View, CA
Responsibilities:
- Integration for S/4HANA, Tableau, and BOBJ; built HANA calculation and analytical views.
- Worked with WEBI, Lumira 2.0, AFO, and Design Studio; set up security for analytical privileges.
- Experienced with CDS views and Explorer information spaces; OData services for SAP Fiori apps and Power BI.
- ACL, load balancer, proxy, and network configurations; end-to-end admin activities and BODS.
- REVPRO, Virtual Trader, and Cost Management modules; BOBJ admin activities.
- Responsible for enabling a Predictive Analytics solution for a supply chain scenario to improve on-time delivery.
- Worked as an architect to provide end-to-end solutions for BI tools (BOBJ, Tableau, and Power BI), HANA, and BODS.
- Managed services for BO 4.2 environment stability, sizing and sustainability, system slowness, Explorer information spaces, and Xcelsius dashboards.
- Resolved daily issues for both production and non-production environments.
- Worked on HANA data model KPIs to create views for reporting, such as calculation and analytical views. Good understanding of various SAP modules such as SAP MM, SD, FI, and AP, and the HANA Cloud Platform.
- Strong analytics/data warehouse experience using SAP BusinessObjects Data Services / Data Integrator, with BODS as an external system for ETL between ECC, BI, and SAP HANA 1.0 databases.
- Expertise in Business Intelligence tools, native HANA 1.0 databases, and the HANA XS engine. Enhanced multiple SAP ECC data sources such as 0CACONT_ACC_ATTR and 0UCCONTRACT_ATTR_2, and created and enhanced the necessary InfoObjects and InfoProviders for BEx queries. Modified transformations within BW using ABAP to produce accurate data for reports, and worked with REVPRO, Virtual Trader, Cost Management, and PeopleSoft modules.
- Gathered the functional and business requirements for various Predictive Analytics scenarios.
- Designed and modified IDT/UDT, WEBI, Design Studio, Business Explorer, and dashboards using lookups and various formulas to analyze the data, such as PDF output and line, column, and combination charts, plus Explorer.
- Strong understanding of live and extract connections and of scheduling Tableau and SAP Lumira documents/workbooks, administration, and security. Developed with Microsoft Power BI tools, including analytics, reports, dashboards, and data visualizations. Hands-on knowledge of EIR accessibility, required technical specifications, and techniques.
- Created folders in CMC and implemented security on various objects in CMC (folders, applications, universes, and connections). Performance tuning on both admin and application-related activities.
- Trained and mentored business users; started/stopped, enabled/disabled, cloned, added, and removed BO services (delete, move & edit). Worked with SAP Support on any issue not resolved internally, for all admin/application-related matters.
- Worked with 360Eye and InfoBurst. In-depth knowledge of Tableau and Power BI Desktop/Server.
- Expertise in SAP BI 4.0, 4.1, and 4.2 tools and SAP HANA 1.0 databases. Data loads using SLT and BODS.
- Created complex jobs, workflows, data flows, and scripts using various transforms (Integrator, Quality, and Platform) to successfully load data from multiple sources into the desired target.
- Created repositories and security domains with user groups; implemented multilevel security hierarchies, enabled drill-down functionality, created derived tables, and delivered good UI design.
- Responsible for creating user groups in CMC and implementing the security model on the groups created, delivering end-to-end solutions. Unit testing, integration testing, and automated testing.
- Troubleshot and resolved Remedy incidents, new-user creation, performance issues, and ARS incidents.
Environment: SAP BO 4.2 SP2, IDT/UDT, Business Explorer, WEBI, Talend, ISM, SAP HANA KPIs, BI, Tableau, Power BI, BODS, Crystal Reports, and BW.
Confidential, Springdale, AR
Lead on BOBJ, LUMIRA, TABLEAU, BODS, and HANA
Responsibilities:
- Worked as an architect to provide end-to-end solutions for various BOBJ environment tools, HANA, and BODS.
- Resolved daily issues for both admin and documents/reports in the daily production environment.
- Created folders in CMC and implemented security on various objects in CMC (folders, applications, universes, and connections). Performance tuning on both admin and application-related activities.
- Created repositories and security domains with user groups; implemented multilevel security hierarchies, enabled drill-down functionality, and created derived tables.
- Implemented Single Sign-On (SSO) and took care of end-to-end issues such as user groups, report scheduling, and various performance activities. Hands-on knowledge of EIR accessibility and required technical specifications.
- Data center migration for the DEV/QA/PROD environments and repository. Upgraded service packs and ODBC drivers and supported admin activities for the various servers/services.
- Troubleshot and resolved Remedy incidents, new-user creation, performance issues, and ARS incidents.
- Good experience with S/E version migrations. Data loads using SLT and BODS. TPM reporting experience.
- Good understanding of various SAP modules such as SAP MM, SD, FI, and AP, and the HANA Cloud Platform. Explorer information spaces, Xcelsius dashboards, and admin activities.
- Trained and mentored business users; started/stopped, enabled/disabled, cloned, added, and removed BO services (delete, move & edit). Worked with SAP Support on any issue not resolved internally, for all admin/application-related matters.
- Strong experience with native HANA information models using attribute, analytic, and calculation views. Developed MS Analysis reports and input templates. Data loads using SLT, BODS, native HANA SQL, and the API library.
- Designed and modified IDT/UDT, WEBI, and Design Studio dashboards and stories using lookups and various formulas to analyze the data, such as PDF output, line, column, combination, and pie charts, with local and global filters.
- Strong understanding of live and extract connections and of scheduling Tableau and SAP Lumira documents/workbooks/stories, administration, and security. Extensive experience implementing and developing with Microsoft Power BI tools, including analytics, reports, dashboards, and data visualizations. Extracted, transformed, and loaded data using various transformations, with good UI design, 360Eye, and InfoBurst.
- Responsible for creating user groups in CMC and implementing the new security model on the groups created, delivering end-to-end solutions and performance tuning on both admin and documents/reports-related activities.
- Expertise in Business Intelligence tools, SAP HANA 1.0 / S/4HANA databases, and the HANA XS engine. Created complex Crystal Reports using SAP BusinessObjects with SQL as the data source.
- Designed the solution architecture for the implementation of the KPIs and integration of the dataflow from the regional BI systems to the global BI system, including PeopleSoft modules.
- Developed XCELSIUS Dashboards for top level management by using BIWS / Query as a Web Service.
- Expertise in SAP Business Intelligence 4.0, 4.1, and 4.2 tools and SAP HANA 1.0 databases.
- Performed installation, migration, configuration, maintenance, technical support, and troubleshooting of BusinessObjects and Tableau implementations. Set up platform search applications and resolved service-level memory issues.
- Used restricted and calculated key figures, virtual key figures, formulas, variables, replacement-path variables, customer-exit variables, and user-entry variables in the Query Designer; automated testing.
Environment: SAP BO Administrator 4.1 SP2 to SP7, IDT, Crystal Reports, WEBI, Talend, Remedy, SM9, Teradata 15, TRANSMSN, EDI, HANA.
Confidential, Dallas, TX
Lead Business Intelligence
Responsibilities:
- Attended business meetings, assisted Business Analysts, and gathered requirements per the data model.
- Worked as an architect to provide end-to-end solutions for various BOBJ environment tools.
- Implemented various Tableau / Lumira story features, dashboards, trend lines, forecasting, sets, and date comparisons. Performance tuning on both admin (VizQL) and document-related activities.
- Designed and developed WEBI and Design Studio dashboards using lookups and various formulas to analyze the data, such as PDF output and line, column, and combination charts. Unit testing, integration testing, and automated testing.
- Strong understanding of live and extract connections and of scheduling SAP Lumira and Tableau documents/workbooks, administration, and security. Performance tuning on both admin and application-related activities.
- Coordinated with business teams to update the business requirements. Developed product requirements.
- Designed and developed web/desktop reports on MS Access/SQL, creating hierarchies on different data sets and drilling across call-history statuses such as call drops, failures, etc. Created user filters, conditions, and global filters for various reports to specify the reporting parameters, with good UI design.
- Created dashboards for various call-status details.
Environment: Tableau, MicroStrategy 9.4, MS SQL/Access, Architecture, WEBI, BODS.
Confidential, Camden, NJ
SAP BI /BO Technical Lead, Dashboards.
Responsibilities:
- Worked as an architect to provide end-to-end solutions for various BOBJ environment tools.
- Attended business meetings, assisted Business Analysts, and gathered requirements per the data model.
- Designed universes from multi-providers and applied row-level and object-level restrictions to different groups and users as required, with 360Eye, InfoBurst, and Spotfire Web Service API integration.
- Experience with the SAP BO Integration Kit; created reports and universes using SAP BEx queries.
- Trained in BO best practices and testing Universe.
- Set up the environments and deployed to the DEV, QA, and PROD servers for this implementation.
- Coordinated with business teams to update the business requirements and supported product development.
- Experienced working with Universe Builder to build OLAP universes from SAP BW data sources such as InfoCubes/MDX and BW queries. Performance tuning on both admin and application-related activities. Good UI design.
- Hands-on knowledge of EIR accessibility and required technical specifications. Unit testing, integration testing.
- Created YTD and MTD measures, user filters, conditions, and global filters for various reports to specify the reporting parameters. Moved reports across environments using Promotion Management.
- Created BIAR files using CHARM (UnicCSC ID format) IDs per the company standard format.
Environment: SAP BO XI R4.0 (SP6), SAP BI/BW 7.2, MDX, IDT, Web Intelligence Rich Client, Promotion Management, EDI.
Confidential, Bellevue, WA
SAP BI /BO Lead Developer
Responsibilities:
- Attended business meetings, assisted Business Analysts, and gathered requirements per the data model.
- Experience with the SAP BO Integration Kit; created reports and universes using BW / SAP BEx / MDX queries.
- Set up the DEV, QA, and PROD server environments for this implementation. Performance tuning on both admin and application-related activities. Hands-on knowledge of EIR accessibility and required technical specifications.
- Trained in BO best practices and universe testing.
- Good experience with EHSM Incident Management enhancements on BOPF using the Enhancement Workbench, consuming EHSM web services.
- Coordinated with business teams to update business requirements such as KPIs. Supported product development, led support responsibilities such as resolving customer issues, and delivered good UI design.
- Designed and developed universes (IDT) and Web Intelligence reports, and performed administration activities.
Environment: SAP BO XI R4.0 (SP6), SAP BI/BW, IDT, Web Intelligence Rich Client, BEx, BODS.
Confidential, Plano, TX
Scrum / KANBAN Master
Responsibilities:
- Attended business meetings, assisted Business Analysts, and gathered requirements per the data model.
- Migrated from SAP BO 3.1 to SAP BO 4.1 using the Upgrade Management Tool. Extracted, transformed, and loaded data using various transformations, with good UI design, 360Eye, and InfoBurst.
- Created folders in CMC and implemented security on various objects in CMC (folders, applications, universes, and connections). Performance tuning on both admin and application-related activities. Good experience with S/E version migrations.
- Trained and mentored business users; started/stopped, enabled/disabled, cloned, added, and removed BO services (delete, move & edit). Worked with SAP Support on any issue not resolved internally, for all admin/application-related matters.
- Strong Analytics/Data warehouse experience using SAP Business Objects Data Services/ Data Integrator using BODS as an external system for ETL between ECC and BI.
- Created Complex Crystal Reports using SAP Business Objects and SQL as data source
- Designed and developed WEBI reports using features such as master/detail, charts, cross tabs, slice and dice, drill down, @functions, and formulas to analyze the data; performance tuning on both admin and application-related activities. Hands-on knowledge of EIR accessibility and required technical specifications.
- Coordinated with the business teams to update the business requirements.
- Supported product development.
- Led support responsibilities to resolve customer issues; implemented Agile methodology.
- Created user filters, conditions, and global filters for various reports to specify the reporting parameters.
- Developed XCELSIUS Dashboards for top level management by using BIWS / Query as a Web Service to find out Network status on SGSN, GGSN and HLR.
Environment: SAP BO XI R3.1 (SP3/5) / SAP BO XI R4.1, Crystal Reports, BODS.
Confidential, Houston, TX
Technical Lead on SAP BI /BO, Dashboards, Admin, EDI
Responsibilities:
- Attended business meetings, assisted Business Analysts, and gathered requirements per the data model.
- Set up the DEV, QA, and PROD server environments for this implementation.
- Worked with MS Excel as a data source from the GSE & GSAP databases to understand the BRDs and convert them into tangible technical specs used in dashboard report specifications.
- Coordinated with business teams to update the business requirements. Supported product development, performance tuning at both the admin and dashboard application levels, and initial testing activities.
- Automated reports using VBA macros in MS Excel to convert source data into XML formats; those XML files (URLs) were incorporated into Xcelsius dashboards via XML data connections, with good UI design.
- Designed and developed complex Xcelsius 2008 dashboards using lookups and various formulas to analyze the data. Hands-on knowledge of EIR accessibility and required technical specifications.
- Good experience with EHSM Incident Management enhancements on BOPF using the Enhancement Workbench, consuming EHSM web services. TPM reporting experience.
- Worked in an architect role to provide end-to-end solutions for various environments. Unit testing, integration testing, and automated testing.
Environment: SAP BO Xcelsius 2008 (dashboards), MS Excel, VBA macros, XML data / web service connections.
Confidential, Lewisville, TX
Sr. Business Object Developer, Dashboards
Responsibilities:
- Worked in an architect role to provide end-to-end solutions for various BOBJ tools within the environment.
- Attended business meetings, assisted Business Analysts, and gathered requirements per the data model.
- Set up the DEV, QA, and PROD server environments for this implementation, with good UI design.
- Designed universes with the Information Design Tool (IDT) and applied row-level and object-level restrictions to different groups and users as required. Set up hierarchies and cascading prompts; initial testing activities.
- Developed universes using BEx and created complex WEBI and Xcelsius dashboards using Live Office and BIWS / Query as a Web Service. TPM reporting experience.
- Coordinated with clients to update the business requirements. Drove the entire product development.
- Researched the latest (BOBJ 4.0) versions used in the project, guided team members, and took responsibility for meeting customer needs. Extracted, transformed, and loaded data using various transformations.
- Designed and developed WEBI reports using features such as master/detail, charts, cross tabs, slice and dice, drill down, @functions, and formulas to analyze the data.
- Created user prompts, conditions, and global filters for various reports to specify the reporting parameters.
Environment: SAP BO XI R4.0 (BOBJ 4.0), Information Design Tool, BI Launch Pad, Web Intelligence Rich Client, SAP BusinessObjects Explorer, SP1/2/3, Dashboards, Live Office, BIWS / Query as a Web Service, Windows 2000 Server, IBM DataStage 8.5, Teradata 13.0, Informatica.