Data Analyst Resume
San Francisco, CA
SUMMARY
- Business and Data Analyst / Lead / Technical Writer with 30+ years’ experience as a migration/mapping/requirements analyst, QA, liaison, user advocate, and data advocate. No relocation; local travel to San Francisco/Oakland/Petaluma OK. Remote ideal (other time zones OK).
- Good listener (MS, Psychology / Counseling) and observer; flexible communication and documentation styles to accommodate user learning and thinking styles (written and spoken words, diagrams / sketches, whiteboard). Native English speaker/writer. Six years as a standup and 1:1 computer trainer. Customer / user advocate. Thrive in complex, ambiguous, and new / changing environments; able to assess a starting point (and next steps) in ambiguous and complicated situations.
- Practical idealist; prefer to “do the right thing, once.” Motto: “excellent process yields excellent product.” Detail-oriented with loose-tight thinking, seeing details and the big picture and how they relate. “MacGyver” as needed. Pitch in to share the load. Hands-on style; lead as needed while doing my share of “the doing,” and help others on seeing a need. OK with changing hats; it doesn’t have to be in my job description.
- Self-starter/initiator investigating ambiguous and complex business and data processes using critical thinking, reasoning skills, and problem solving, applied practically and profitably, within stated vision, mission, and scope. Prioritize tasks based on context, mission statement, SLAs. I expect data to reveal a story.
- Requirements gathering (BRDs, business requirements documents; FRDs, functional / technical requirements documents). Listen, interview, observe, collect existing/legacy samples, and organize all into structured documents; create report / output mockups. Explain / depict the complex and the complicated in user-appropriate terms, visuals, user stories, and mockups. Virtually match the writing style of existing documents. Support decision-making via profiling, data stories, and statistics. Probe for “need” vs. “want” (and supporting proof of “need”). Reverse-engineer requirements from SQL, SAS, and COBOL.
- Requirements testing. White box testing to verify data supports requirement(s); research and present alternatives. Support users’ transition to and work with new / patched data systems. Create custom test data. Create expected results. Conduct user reviews.
- Data profiling / root cause analysis using ad hoc SQL (written from scratch) on Teradata, Oracle, and SQL Server, via the tools TOAD and SQL Assistant. Support data quality, data modeling, data maps, and documentation, both unprompted when discrepancies are observed and as assigned per project plans. Document “as is,” “to be,” and “dreams,” source-to-target mapping, and expected results.
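The kind of ad hoc profiling query described above can be sketched generically. This is a minimal, hypothetical example (table and column names invented); it runs against sqlite3 so it is self-contained, whereas the original work ran on Teradata/Oracle/SQL Server via TOAD and SQL Assistant:

```python
import sqlite3

# Hypothetical sample data standing in for a warehouse table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE claims (claim_id INTEGER, member_id TEXT, amount REAL)")
con.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [(1, "M001", 120.0), (2, "M002", None), (3, None, 45.5), (4, "M002", 45.5)],
)

# Typical ad hoc profiling: row count, null counts, and distinct counts
# per column, to flag fields whose content does not match the metadata.
profile = con.execute(
    """
    SELECT COUNT(*)                                          AS total_rows,
           SUM(CASE WHEN member_id IS NULL THEN 1 ELSE 0 END) AS null_member_ids,
           COUNT(DISTINCT member_id)                          AS distinct_members,
           SUM(CASE WHEN amount IS NULL THEN 1 ELSE 0 END)    AS null_amounts
    FROM claims
    """
).fetchone()
print(profile)  # (4, 1, 2, 1)
```

Counts like these are the raw material for the “as is” documentation and discrepancy reports mentioned above.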
- Create / maintain metadata, data models (tables, fields), data dictionaries, user documentation, job aids, and ERDs.
- Initiate / support audit and compliance requirements, based on laws and client policy, including SOX, HIPAA, DEA, taxation, Basel II, Dodd-Frank, “common obligor,” “governance,” SEC, KYC (Know Your Customer), EU regulations, data quality initiatives, risk, and identity management; GAAP; and document management (via Jira, SharePoint). Document technical requirements via pseudo-SQL (with supporting REM notations).
- Liaison between users and development staff for production support, systems and patch deployments. Data warehouse “representative” for data governance, quality, user questions; data user advocate; data evangelist.
- SIT testing. Create test plans, cases, and scripts from requirements and data profiling. Execute testing; support testing by others (on-shore, off-shore; users). Create custom UAT data. Verify production deployments.
- Lead off-shore and domestic teams (developers, QA, writers). Team player when there is a team; individual contributor as needed. POS testing lead as appropriate. Acting PM as needed. Technical writer at times.
- Use Excel to package “pretty results” for users, create mock-ups, document requirements, build pivot tables, and write VLOOKUPs/HLOOKUPs (when DIMs are not available for SQL JOINs). Draft / contribute to PowerPoint presentations.
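The VLOOKUP-in-place-of-a-JOIN pattern above can be sketched as follows (all names hypothetical; assumes exact-match lookups, i.e., `VLOOKUP(key, range, 2, FALSE)`):

```python
# A dimension available only as a flat lookup range in Excel, not as a
# DIM table for a SQL JOIN. Exact-match VLOOKUP behaves like this
# dictionary lookup.
store_dim = {"S01": "Old Navy", "S02": "Banana Republic", "S03": "GAP"}

sales = [("S01", 100.0), ("S03", 250.0), ("S99", 40.0)]

# Equivalent of adding a VLOOKUP column: enrich each fact row with the
# dimension attribute, with "#N/A" for keys missing from the lookup range.
enriched = [(store, amt, store_dim.get(store, "#N/A")) for store, amt in sales]
print(enriched)
```

The `#N/A` rows are exactly the mismatches a LEFT JOIN against a proper DIM table would surface as NULLs.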
- As business analyst, reverse-engineered county assessor tax data business rules and processes from analysis of legacy mainframe data via interviews and data profiling/analysis; documented and mapped legacy data into a staging schema for migration to the AUMENTUM application. Planned all testing through UAT; profiled data issues in conversion and post-conversion. PM as needed; coordinated staff, conversion runs, and workloads.
- When end-of-month reporting was not supported by Oracle Financials, gathered financial user reporting requirements via meetings, review of legacy reports, and 1:1 interviews with GAP Inc. Finance staff to mock up three reports (160+ columns wide, 3,400 rows long) with results samples. Supported ETL into EDW and report development; verified deployment.
- Created expected results (via data mining under GAP Inc.’s Oracle Financials) for ETL (into EDW) and report developers’ work; participated in system, integration, and UAT testing, and verified deployment.
- When Old Navy Stores staff complained that reports showed 756 locations (really only 721 brick-and-mortar stores), I led BI business requirements gathering via meetings, performed issue analysis, and developed possible solutions. Outcome: added a definition of “store” (7 definitions in all) to weekly ETL and reporting. (This addressed merchandise being shipped to the wrong-branded store and sold.)
- Gathered business and “flash” requirements via phone, WebEx, listening, and user-mockups for new reports from new RX Fill / POS system; documented same. Created XLS report mockups showing all calculations, offering additional data possibilities not previously available; conducted reviews and supported user sign-off. Added table.field notation to support WEBI coding.
- Determined the business requirements for “the how” and “the testing” (and verified deployment) to move $41M in sales and inventory EDW data back one week, with restatement of all subsequent sales and inventory values (following nine months) in a system with 4B records per year covering four years rolling data. Deployed “the redo” on a holiday weekend 10 months later, with verification.
- Served as customer-facing representative for Pharmacy Enterprise Data Warehouse staff (18 people; $1B revenue) to nationwide SOX / Compliance and governance staff for nationwide access to prescription fill and sales data. Addressed PHI (personal health info) protection, SOX, compliance, and DEA (Drug Enforcement Agency) and national HEDIS reporting. Created / maintained all team documentation, style guides.
- Gathered and wrote business and regulatory requirements for replacement of a critical legacy Kaiser Permanente Medicare report (no documentation; worth $65M annually, plus a 4-star Medicare rating) by reviewing the legacy report and interviewing KP Medicare staff, then data mining the new system for equivalent data, field by field. Created draft WEBI reports (using real data) for user review. Got sign-off; ran the report manually until it could be deployed for automated daily execution. Documented the new report and metadata. Created a technical spec for direct extraction (via Informatica); verified extract output. Supported users for additional add-on requirements over time, including updating business and technical requirements.
- Analyzed (and reverse-engineered) a legacy daily faxed sales report. Drafted a proposed replacement: a distributable daily “dashboard” of RX sales by pharmacy. Worked out security and report / BI distribution requirements and options for presentation to business users, pharmacy staff, and national compliance staff.
PROFESSIONAL EXPERIENCE
Confidential, San Francisco, CA
Data Analyst
Responsibilities:
- Data/business analyst for mapping Care1st data into Blue Shield of CA (BSC) systems (agile project). Gap analysis and STTM (“source-to-target mapping”) of BRDs and draft mapping to date for “to be done’s,” and the impact of (and solutions to) possible omissions. Areas of responsibility: CLAIMS (“raw stage” to FACETS, et al., with ETL recommendations) and “CACTUS” (licensing) for a project with cutover (“slash/cut”) deployment while maintaining “separation” of Care1st data from BSC for the reporting data mart. Developed/edited metadata as possible.
- Support all PCI, HIPAA, PHI and other regulatory and best practices in profiling real data; create mockup data for test load/ run purposes.
- Supported BOARD reporting application for clinical data quality, documentation, user support, and new/updated requirements (cubes, reports, dashboards, extracts, “application”). Developed documentation via reverse-engineering of current code and edge-case testing; developed UAT plans, test scripts, and UAT training; developed a plan for automated regression testing where appropriate (via HP ALM).
- Participated in QA (as UAT and as unit/system/integration functional tester) of BOARD enhancement deployments. Supported use of HP ALM for requirements, test scripts, and defect tracking (future BOARD releases).
Confidential
Data Quality Analyst
Responsibilities:
- Supported current and future (expanded) use of CA PPM (Computer Associates “CLARITY,” aka CA Project & Portfolio Management), addressing process and data quality issues in reporting forecast and actual labor values (huge impact on financial systems). Debugged and documented existing custom and canned reports; planned for modifications and automation; documented defects in detail (for off-shore resolution) with system testing, UAT support, and verified deployments. Coordinated with off-shore support (user needs, defects, report enhancements). Supported expanded use of CLARITY features and functions (annual budgeting, investments). Supported report process improvement. Documented existing SOP processes and “technicalities” to identify possibilities for reduced cost and to prepare for migration to 15.x.
- Customer-facing representative to 650 internal users and their supervisors. Triaged issue tickets for criticality and appropriate timely handling (immediate, for example, versus overnight by off-shore). Followed up and verified completion of work on closed tickets. Resolved single-sign-on issues; supported improved processes (example: a new staff member was assigned an email address that had been retired 13 years earlier, which was not SOX compliant and caused login failure because that email already existed in CLARITY).
- Developed partial data model of selected fact and dimension tables used for budgeting and forecasting toward identifying critical data quality relationships (based on issues seen) to support increasing data quality.
- Worked night hours to coordinate with off-shore resources as to accurate understanding of requirements; created mockups as needed to “demonstrate” data performance and use.
- Conducted system, integration, and UAT testing across DEV, TEST, and PROD platforms. Identified data for real edge-case (boundary) testing. Identified and documented recurring “data quality” issues in reporting logic; created report samples for training purposes.
- Mined/monitored for data quality issues. Developed simple statistics to show the costs of slack processes.
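The “simple statistics” above might look like the following back-of-the-envelope calculation (all numbers hypothetical, for illustration only):

```python
# Hypothetical inputs: weekly defective timesheet entries, average rework
# time per entry, and a loaded labor rate.
defects_per_week = 120
rework_hours_per_defect = 0.25
loaded_rate_per_hour = 85.0

# Cost of the slack process = volume x rework effort x labor rate.
weekly_cost = defects_per_week * rework_hours_per_defect * loaded_rate_per_hour
annual_cost = weekly_cost * 52
print(f"${weekly_cost:,.0f}/week, about ${annual_cost:,.0f}/year")
```

Even rough figures like these give management a dollar amount to weigh against the cost of fixing the process.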
- Gathered requirements from Finance and Accounting depts for changes to a non-automated “data feed” to AS400. Supported those depts in UAT of changed feeds. Triaged all user-reported data issues; identified data issues in daily reports and notified the “bad user” for correction overnight before the next report run.
- Gathered requirements for new and broken reports and dashboards for project status reporting (high-level down to details of hours worked and projected, by staff member by day, across $120M in development projects).
- Analyzed historical data for trends in project budget forecasting accuracy, and evaluated data quality / validity for such analysis.
- Identified SOX/compliance issues in CLARITY data (such as staff approving their own timesheets, and Financial Analysts approving timesheets for contractors whose invoices they authorize for payment) and in CLARITY administration (off-shore use of one login by 6 staff).
- Supported integration of CLARITY with Outlook, SharePoint, and ServiceNow.
- Use Twitter to “influence, not control” as part of research in predicting responses to different, very targeted inputs (goal: create online communities that are inclusive, supportive, and “self-policing”). Coordinate content and timing with analysts, profilers, and others. As much as 14 hrs/day, 7 days per week; on-going. Report incorrect online error handling and application defects.
- Take online courses and webinars in the following: Tableau, statistics review, SQL, PowerPoint, Excel (functions, dashboards), data science, MS Project, data visualization.
Confidential, MN
Clinical Data Analyst/Business Analyst / Data Analyst
Responsibilities:
- Analyzed UHC/UHG/Optum research requirements versus data received (lab test results) as XML (read source XML files for missing or damaged elements). Mined data (ad hoc SQL on Teradata, DB2, Oracle) for data quality issues in clinical lab test result EDI feeds (daily feeds, multiple sources, multiple EDI formats) in support of accurate reporting back to clinical test result providers and for HEDIS, research, and other uses. Compared results across systems (legacy, new) to determine which to trust / use, and when (and when not). Reverse-engineered business rules from reading SAS code (to create documentation). Put documentation in SharePoint; updated others’ content as appropriate (updated existing SOPs; proposed SOP improvements).
- Identified, proposed, and tested cleansing strategies for quality issues. Documented quality issues for upstream systems (statement of impact to downstream systems, support to resolution or workarounds). Verified data in EDI (some in XML) feeds (or lack thereof, with documentation). Tested and proposed revised matching/rules strategies for ETLs (match lab test results to members).
- Supported documenting legacy reports / dashboards, providing detailed research to explain what they included; gathered detailed requirements for updates and fixes. Supported team in troubleshooting / fixing SQL pulls and Excel reports / dashboards.
- Documented systems & platforms, extracts, process flows, and detailed processes (created data models and metadata for platforms, including ETL transformation documentation); inherited very little that was current. Application tracking (Excel).
Confidential
Sr Business Analyst / Technical Project Lead
Responsibilities:
- Gathered legacy (COBOL) processing flows and taxing requirements via interviews, observations of users, user input and screenshots, and data mining to support migrating county legacy mainframe property tax / assessor data into Thomson-Reuters Aumentum (SQL Server COTS) system conversion tables.fields. Supported conversion script (aka ETL) development (including attempted use of SQL Server SSIS routines), backups, and test runs.
- Supported “go live” deployment planning / coordination between major annual critical tax events. Developed test plans, cases, details, and expected results; reported / explained defects and supported solution deployment. Mapped migration of legacy data (1893 to present) for parcel lineage, land-line changes, tax authority / fund and rate changes, tax roll statements, collections, corrections, ledger adjustments, payment plans, and tax sales of secured and unsecured property to interim staging tables. Learned taxation on my own and through pattern analysis of legacy data, plus interviews with county staff. Planned for and participated in performance tuning and automation of conversion runs.
- Supported change data capture (CDC) for partial tax bill payments and updated assessments (no change to parcel boundaries) to address taxable event changes, selected ownership changes, and book re-mappings. Supported enhancement (user, cash drawer, and accounting process changes) to accept debit/credit cards at go-live. Planned and supported all change management. Loaded reference files (diverse file formats, including XML) as needed for conversion and test loading. Suggested performance tuning and ETL alterations to achieve reasonable conversion/ETL times for daily test and as-needed runs.
- Followed Federal and State audit and SOX/compliance rules in ETL transformations; reported load failures into the COTS application for correct legal compliance.
- Identified and documented traceability (including the county data specialist of record) on a field-by-field basis through all ETLs to ensure continuity and having the correct staff as parties to decisions. Developed a traceability matrix for the client to maintain toward go-live. Anticipated and managed risks; supported the client in coping with, adjusting to, and planning for same. Mined and migrated unstructured data (to structured schema). Participated in UX (user interface) design and testing. Application tracking via Excel.
- Created a reusable template for mapping and scripts (9 other counties with versions of the same MANATRON software want to convert to Aumentum). Developed data models (in Visio) of the legacy flat-file COBOL system (including 33 years of changes), staging transformation tables, and a partial data model of the final tables (where converted data landed directly or with transformations). Set up a Tableau “dashboard” for legacy extract data quality issues (for fast feedback to the client for additional data review and cleanup as needed).
- Put documentation in SharePoint and Office 365; updated others’ content as appropriate.
- Supported county staff plan for addressing legacy data quality issues (and new ones during prep for go-live). Documented legacy rules and migration transformations. Developed data quality cleansing strategies; prepared character-specific requirements for coding to address 55% of tax payer history issues. Plan for data encryption as needed.
- Coordinated internal staff activities to meet aggressive recursive runs and deliverables, using the best of talents on a fixed-price project. Prepared status reports; conducted status meetings; led technical meetings and email communication. Planned technical migration and deployment, including change management (parallel operation until go-live), contingencies for go-live, and post-go-live revised SOPs.
- Coordinated with Aumentum (Thomson-Reuters) staff for customizations needed by county conversion and “conversion run” timings to support GO LIVE conversion that fits in county tax activities (with least impact).
Confidential, San Francisco, CA
Metadata Analyst / Tech Writer
Responsibilities:
- As metadata analyst on the risk reporting team, coordinated with EDW business analysts to create metadata for Fraud/Payments (mobile banking, mortgages, investments, checking / savings and deposits, IRAs, special products, promotions; wealth management) and Customer Surveys; wrote definitions for data users and report developers, adding caveats and business requirements for associated fields on Teradata. Reviewed/edited metadata across systems for continuity, accuracy, and consistency of style and dept writing standards. Included notations for compliance (e.g., customer confidentiality). Profiled for new (undocumented) products, often associated with payment types and/or new features for customers; ensured all products and payment types had correct metadata. NOTE: all transactions were in USD. Reviewed metadata for support to regulatory reporting needs (SOX, Basel II, CCAR, KYC, SEC, State, CDD, et al.). User tracking (downstream “data consumers”) and their SLAs (via Excel).
- Profiled data (ad hoc SQL on Teradata, SQL Server; including performance tuning my SQL) to verify field content matched metadata; supported reconciliation and resolution. Contributed to metadata maintenance guidelines and processes. Notified others of change management needs. Worked with data from checking, savings, CDs, special accounts, IRAs, car loans, HELOCs, LOCs, businesses, mortgages, and “wealth services” (all data for US customers of all kinds; no foreign customers’ data).
- Supported maintenance of logical and physical data models (Teradata and SQL Server schemas) and ERDs. Supported MDM rules adherence (reported new variations to appropriate data owner). Put documentation in SharePoint; updated others’ content.
- Gathered business reporting / intelligence requirements (specifically, risk reporting) from SMEs, meetings, emails, complaints and concerns voiced in team meetings, and detailed reads of EU, federal, and state banking regulations (SOX, Basel II, Dodd-Frank, CCAR, CDD (“Customer Due Diligence”), FDIC, KYC (“Know Your Customer”), and EU regulations in particular); added data quality standards to requirements. Drafted SSRS runs for dashboard content. Contributed to daily, weekly, monthly, and quarterly status reports. NOTE: all transactions were in USD. Application tracking (especially SORs) via Excel. Coordinated change management. Put documentation in SharePoint; updated others’ content. Supported data quality “fixes” in ETLs for improved customer matching across diverse platforms and systems.
- Developed and updated internal data quality processes (reflecting frequently changing requirements) by creating / updating SQL (SQL Server; T-SQL) “checks” and supporting documentation and metadata (with source system fields). Identified pre-reporting clean-up requirements. Documented technical requirements with SQL, pseudo-SQL, and input / output samples. Performance tuned my ad hoc SQL appropriate to the platform and data volume to support timely daily reporting updates.
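A minimal sketch of the kind of SQL data-quality “check” described above (table, fields, and rules invented for illustration; shown against sqlite3 for self-containment rather than the SQL Server/T-SQL originals):

```python
import sqlite3

# Hypothetical customer table standing in for a source-system extract.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE customers (cust_id TEXT, tax_id TEXT)")
con.executemany(
    "INSERT INTO customers VALUES (?, ?)",
    [("C1", "111-22-3333"), ("C2", None), ("C3", "111-22-3333")],
)

# Check 1: required field populated (a pre-reporting clean-up candidate).
missing_tax_id = con.execute(
    "SELECT COUNT(*) FROM customers WHERE tax_id IS NULL"
).fetchone()[0]

# Check 2: duplicate tax IDs, relevant to matching customers across systems.
dupes = con.execute(
    """
    SELECT tax_id, COUNT(*) FROM customers
    WHERE tax_id IS NOT NULL
    GROUP BY tax_id HAVING COUNT(*) > 1
    """
).fetchall()

print(missing_tax_id, dupes)
```

Checks like these, documented alongside the source-system fields they test, are what fed the daily reporting updates.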
- Profiled data via ad hoc SQL (Teradata, Oracle, SQL Server) for magnitude of impact (risk; exposure) for management review, and for “common obligor.” Worked with data from checking, savings, CDs, special accounts, IRAs, car loans, HELOCs, LOCs, businesses, mortgages, and “wealth services” (all data for US customers of all kinds; no foreign customers’ data), data mining for “commonalities” across diverse systems and customers to identify correctly all assets and risks of each “customer” for accurate reporting. Drafted and tested data models, metadata, and ETL for draft reporting on “common obligor” (representing mapping from many diverse sources and identified issues).
- Proactively listened for activities by other teams, including new proposed (or “chatted”) reports, and new or changed state / fed regulations requiring audit / compliance support. Informed mgmt of “talk,” concerns, “dupe work.”
Confidential, San Rafael, CA
Business Process Analyst / Tech Writer (employee)
Responsibilities:
- SQL coding (retooling of Oracle ETL to SQL Server ETL) in support of hierarchies and commissions (supporting customer retention).
- Developed data models and metadata for undocumented processes (customer “hierarchy” schema) for recoding to other SQL platforms.
- Customer-facing representative (and reporting data mart “owner”) for all KP data users to the new nationwide Pharmacy DW that received RX workflow records, transforming selected workflow and transactions into operational transactional data stores and data marts (with change data capture (CDC) content). This OPPR (OutPatient Pharmacy Replacement) DW was built from scratch, replacing legacy McKesson; used a mix of SDLC, agile, and waterfall methods. Assisted McKesson users in identifying equivalent data between legacy and replacement systems. Note: all transactions were in USD. Supported capture and reporting of changing payment methods, reimbursements, credits, and co-pays consistent with federal and state regulations (KP did not use “chip and pin” debit/credit cards then).
- Gathered business, regulatory, and inventory requirements from meetings, 1:1 interviews (in-person and WebEx), mock-ups, legacy reports, SAS code, and coffee klatches; verified whether new RX data would support those requirements; opened defects and use cases as needed for corrections or additional fields. Tested the impact of proposed technical solutions on reporting data sets and solutions; lobbied on behalf of reporting users to system/ETL architecture staff to ensure delivery to requirements. Gathered and implemented compliance, regulatory, and policy requirements for masking of patient and RX data (including a POC of Oracle VPD for table.field masking); met with data user representatives from 15 teams, and reviewed state / federal regulations and KP compliance policy documents; drafted data model POCs consistent with masking needs and efficiencies to support BusinessObjects/WEBI reporting. Coordinated with all KP national teams for change management. Put documentation in SharePoint; updated others’ content. Developed “dept standards” and SOPs / SLAs for handling requests, issues, custom reports, and defect reporting/testing. Developed proposed (Informatica on Oracle 11g) ETL from DB to reporting layer for improved reporting performance.
- Identified and invited participation in nationwide cross-functional teams for a heads-up on OPPR planned projects, data uses, and audits/reviews that would require staffing and tasks in project plans; estimated tasks and notified management. Coordinated (on behalf of the entire DW staff of 18) with national and regional SOX / Compliance, HIPAA, and governance staff to implement best practices and regulations on 27,000+ fields of data. Conducted introductions to the data and its protections for all levels of KP staff; provided field-level documentation and procedures on behalf of the entire EDW to support Compliance reviews and audits; extracted data with ad hoc SQL to meet specific qualified data requests (for marrying to data from legacy systems). Supported documentation retention policy and planning as applicable to OPPR. Application, user, and team member tracking (via Excel; requested CLARITY access for this, but it was not installed).
- Acquired and loaded OTC (Over the Counter) product files as XML for loading to POS. Data quality checking of XML content; fixed content as needed.
- Used Crystal Xcelsius for early dashboards. Used UNIX to open pharmacy files to verify layout, content.
- Developed data quality metrics at the field level for mission-critical fields per business requirements for specific needs. Opened RALLY use cases for system changes when a defect came from a source system; verified requested deployment. Tested the Data Architect’s data models for support to reporting and dashboards; created mockup data and mockup sample reports.
- Hand-coded BusObj (work flow data) universes to support reporting needs; coded some WEBI reports; troubleshot Crystal Reports; adjusted universe to support reporting needs. Performance tuning and virtual tables as needed to support reports running to completion, with accuracy. Implemented appropriate Change Mgmt processes.
- Supported Oracle 360 Commerce (aka, Infogain), doing data quality verification. Participated in UX (user interface) design, testing.
- Coordinated with KP national staff for security roles to align with LDAP requirements for access control; drafted a detailed plan and templates (not deployed in my time). Coordinated all stress testing for the Pharmacy DW with other KP teams to ensure fail-overs worked. Managed contract staffing access to all OPPR data. Granted or cancelled user and developer access to data based on vendor contractual compliance with SOX and HIPAA terms.
- Coordinated SLAs and data mart access (refreshed data) with nationwide users (Eastern Time Zone to Hawaii Time Zone) and ETL and architecture staff.
- Planned toward MDM (tested IBM InfoSphere and Oracle MDM). Prototyped proposed MDM rules toward implementation of an Oracle MDM solution. Effort back-burnered when the DW had to be redesigned (twice). Incorporated draft “rules” (patients, drugs) into the BusObj Universe and documentation toward future deployment.
- Supported Excel “inventory and supply chain” reporting (manual) in prep for new inventory system.
Confidential, San Francisco, CA
Sr Ad Hoc SQL Business/Data Analyst / Tech Writer
Responsibilities:
- Gathered business requirements from emails, data user complaints, phone calls with data users, user interviews, SQL / SAS code, and legacy documentation / rules for coding of Wachovia mortgages merging into the Confidential mortgage EDW. Mined data for possible matches and discrepancies; documented both, presented same to SMEs for verification, then to management for possible ETL changes to align with WF systems and product codes. Recorded / tracked defects in HP QC. Application and server tracking (via Excel).
- Used ad hoc SQL (on Oracle, Teradata) to identify and propose solutions for mismatches between systems. Updated metadata documentation for discrepancies, resolutions. Ad hoc SQL profiling (Teradata) for management review. NOTE: all transactions and stored data were in USD. Detailed change management needs for project PM.
- Mined mobile banking transactions (charges, payments) in EDW for compliance with design rules, regulations, and data quality. Updated documentation; defects. Developed physical data models, with metadata and recommended “starter SQL”. Reported wireframe issues.
- Gathered business requirements for a COTS product from emails, user notes, phone calls to data users, and user interviews, for custom building construction estimating and in-progress tracking, including labor reporting and invoicing. Designed data schemas/models and ETLs; supported DBAs in development of the DB (MySQL). Participated in UX (user interface) design and testing. Projects included a built-in MDM approach for all clients.
- Drafted preliminary reports (as samples); conducted user reviews of same; revised specifications. Tested all of team’s work.
- Clients were mostly based in Washington, DC and Arkansas.
- Extracted selected sepsis data for qualified users; compiled monthly reports. Implemented data masking on PHI per users’ “need to know.” Participated in UX (user interface) design, testing. Coordinated change management with hospital QA staff.
- Trained (on-site in hospitals, by phone, and via Captivate) in-hospital Quality Staff to extract manually from medical records pertinent patient and treatment details into the custom MIDAS system (analysis of effectiveness of sepsis protocol). Tested the extracted results for quality and usability; made suggestions for data structure redesign and changes for data extractors. Provided Quality Staff re-training for system changes and patterns of errors to be remediated. Ran ad hoc SQL queries on HealthConnect. Recorded / tracked defects in HP QC. Promoted GCP. Updated ETL into MIDAS for new fields and transformations. Ensured compliance with HIPAA and other compliance standards. Developed partial data models of HealthConnect where desired sepsis data was most frequently stored (as a job aid for hospital staff).
- Profiled sepsis patient data for statistical significance of difference and appropriate management review.
- Developed and maintained metadata (data dictionary) for MIDAS fields and associated HealthConnect source fields. Coordinated with UX specialists on MIDAS layout.
- Coordinated with customer service staff to support field staff in sending data per load requirements for receiving new customer data and payments into FACETS (2 versions) and DW locations; verified that sending teams had current load requirements and that corporate was using current load requirements. Compared flat file data received to “FACETS requirements” to identify root causes of load failures. Began documentation on PHI for HIPAA audits. Supported process improvements in data load SOPs. Suggested ETL changes and performance tuning for timely loads of new members. Ensured compliance with HIPAA and other compliance standards. “Mooched” content from vendor data models (proprietary) to reflect UBH instances to help troubleshoot and explain data load errors and requirements (used Visio to record the “data models”).
- Gathered, documented, and updated requirements as needed, based on system changes and new needs.
- Developed data dictionary (none from FACETS vendor) and maintenance process for same. Created physical data model and metadata for selected portions of FACETS schema.
- In Gap Online DW (agile / iterative), took verbal “business requirements” from GAP internal staff (Tax, Legal, Inventory, Online Sales staff, Product Managers, others) about known or suspected online data and transaction issues, and about planned deployments (to be verified); wrote ad hoc SQL to pull data to confirm (or prove “did not happen…”); documented and tested the provided “requirements” (developed clarifications based on discrepancies between the data and the “requirements”); provided reports as needed, updating and revising requirements (and SQL) as needed to address customer, supply and internal issues. Pulled data on customer use of special promotions, specialty products purchases, customer surveys, and discounts for Marketing staff. Recorded / tracked defects in HP QC. Filed documentation per retention policy. NOTE: transactions were in USD, Puerto Rican Pesos, or CAD (at that time); still supported traveler’s checks (if in USD) at that time; supported use of chip-and-pin cards.
- Developed and maintained data dictionary (metadata) for online sales fields and reporting formulas; led reviews across teams to gather consensus on naming and business rule descriptions. Application and downstream user tracking (via Excel). Updated Erwin data models to reflect undocumented changes and metadata (mostly nonexistent).
- Gathered and verified requirements for Product Managers’ tracking and reporting of specialty product sales and promotions to ensure correct starts/ends of deployments, usage, customer demographics, and overlap with online referral systems (to adjust billings from same for referrals); data mining for product performance comparisons (including returns / exchanges). Gathered requirements and executed “data cleaning” and re-categorizing of 9 yrs of Omniture data (4 brands) for reloading.
- Participated in Athleta wireframe redesign and testing (after purchase by GAP), including messages for errors, and error-handling.
- Queried proactively and responsively via ad hoc SQL (Oracle, Teradata, including performance tuning of my SQL) to address product support, tax, legal, promotions use and success, customer service, inventory, supply chain, refunds, and exchanges. Researched supply chain issues and delayed deliveries to customers. Identified refunds due from online referral vendors (returned and exchanged merchandise). Implemented data masking based on “need to know” requirements. Profiled data for suspected fraud (with stats to support decision-making). Reported ETL defects (with examples of failures and proposed corrections) in customer product exchanges.
Confidential, Mill Valley, CA
Business Analyst/ Technical Writer
Responsibilities:
- Gathered business requirements (via 1:1 interviews, reverse-engineering of existing reports, JAD sessions) and business rules for company-wide MicroStrategy reporting off SQL Server EDW. Identified variances in formulas, overlaps in reporting, reporting delivery issues, and report run inefficiencies for scheduling of fixes; held meetings to develop consensus on formula naming and math conventions for company-wide consistency. Planned and coordinated change management (company-wide). Supported conversion from SSIS ETLs to hand-coded ETLs (main servers to reporting data mart) to reduce failures from source data quality conditions.
- Developed “business shopping requirement” (FRD) to replace custom securities trading system, adding a supporting MDM (Master Data Management) system. Tested three MDM products (in reality, purchased 2-3 month deployments each) as “lead shopper”; made technical recommendations. Documented MDM reporting requirements. Coordinated off-shore staff for off-hours support, QA, data loads, and MDM vendor mgmt (during “sales testing” and install/deployment). Ensured SEC and FDIC compliance. Supported data conversion / recoding of MicroStrategy reports (with performance tuning for timely results delivery).
- Developed data dictionary (metadata) for custom JAVA trading system (396 tables) in support of shopping for and transition to replacement system. Filed documentation per retention policy.
- Owner of reporting data mart (MicroStrategy), including all formulas, scheduled reports, company “business rules” (and supporting stored procedures that reflected those formulas and rules). Documented all user variations. With selection of an MDM, supported migration of all reports to the MDM, adjusting each for the MDM rules. Supported the MicroStrategy sys admin.
- Supported test-deployment and modifications to new EDI design; created test data. Verified deployment.
- Wrote detailed technical spec (FRD) for migration of trading data from custom legacy system and EDW into new DW housing the selected MDM solution. Supported two different rules engines (including reconciling their different requirements).
- Created and maintained ERDs (Entity Relationship Diagrams) in Embarcadero Studio; managed Embarcadero licenses and defects. Produced logical and physical “as is” data models. Application and license tracking via JIRA.
- Wrote spec for and tested the nightly loading of 17 LIBOR (London Interbank Offered Rate) rates to 14 decimals with rounding to 13 decimals (for use in computations in the billions). NOTE: all values were converted to USD.
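A minimal sketch of the rounding step described above, using Python's `decimal` module to avoid binary-float error (the resume does not state the rounding mode or field names; banker's rounding is assumed here for illustration):

```python
from decimal import Decimal, ROUND_HALF_EVEN

def round_rate(raw: str) -> Decimal:
    """Parse a rate quoted to 14 decimal places and round it to 13.
    Decimal keeps the parse exact, which matters when the rate later
    multiplies notionals in the billions. ROUND_HALF_EVEN (banker's
    rounding) is an assumption, not the actual spec."""
    rate = Decimal(raw)                        # exact decimal parse
    return rate.quantize(Decimal("1E-13"), rounding=ROUND_HALF_EVEN)

print(round_rate("0.12345678901235"))   # hypothetical 14-decimal rate
```

The spec/test work described in the bullet would pin down exactly which rounding mode the nightly load must use; half-even is only one defensible choice.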
- Flagged / fixed mortgage data from different sources that was likely to cause load failures. Implemented masking of mortgage-holder information per users’ “need to know.” Recorded / tracked defects in Jira, Bugzilla. SDLC environment.
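One way to sketch the “need to know” masking mentioned above. Field names, the masked-field set, and the privileged role are all hypothetical; the real rules lived in the requirements:

```python
# Hypothetical role-based masking of mortgage-holder PII.
MASKED_FIELDS = {"holder_name", "ssn", "address"}   # assumed PII fields

def mask_record(record: dict, role: str) -> dict:
    """Return a copy of the record with PII fields masked unless the
    user's role carries 'need to know' ('underwriter' is assumed)."""
    if role == "underwriter":                  # assumed privileged role
        return dict(record)
    return {k: ("***" if k in MASKED_FIELDS else v)
            for k, v in record.items()}

row = {"loan_id": 1001, "holder_name": "J. Smith",
       "ssn": "123-45-6789", "balance": 250000}
print(mask_record(row, "analyst"))
```

In practice the masking sat in the data layer rather than application code, but the rule shape (mask-unless-privileged) is the same.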
- Gathered data and load requirements to support the Risk Modeling team (and risk models, e.g., earthquake, flood, hurricane) via pre-profiling of mortgage payment data and securities forecasting modeling. Documented and verified requirements matched the needs, updating as appropriate. Put documentation in SharePoint; updated others’ content.
- Supported servers running securities trading system (mortgage-backed securities) written in Java, with MicroStrategy reporting layer on SQL Server. Provided 24/7 troubleshooting of mortgage payment and purchase data loads.
- Gathered audit requirements (internal, external) and developed, tested, and turned over to RWT Internal Audit staff the appropriate data handling processes to support consistency and quality. Delivered requested audit content. Participated in preparation of SEC statements. Attempted SSRS reports for repeated draft runs (rejected as unworkable due to the complicated processing).
- Backup SQL Server “DBA” as needed for emergencies (lived five miles from servers). Created and troubleshot nightly SSIS ETL processes; created SSRS runs to identify data quality issues causing load failures.
- Gathered business requirements via meetings, analysis of legacy Excel worksheets, and legacy reports for new ETL and enhancements to corporate EDW, including monthly ETL of Oracle Financials for reporting. Created source-to-target mapping (per business requirements); confirmed all requirements mapped to source tables and fields; created expected results for all ETL stages, interim tables, reporting layers, and completed MicroStrategy reports. Verified solution deployments. Supported migration of PeopleSoft data to EDW for custom reporting. Supported supply chain and inventory requirements gathering, cost analyses, shipping charges, customs delays, unreportable future orders, inventory “stuck in virtual warehouses,” and more. Participated in UX (user interface) design, testing. Gathered non-functional requirements for adjusting weekly processing to support SLAs.
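Confirming that every requirement maps to a source table and field, as described above, can be sketched as a simple coverage check. The table and field names below are hypothetical stand-ins, not the actual Oracle Financials schema:

```python
# Hypothetical source-to-target mapping check: every target field
# must trace back to a known source table.field.
source_fields = {"GL.ACCT_ID", "GL.AMT", "AP.VENDOR_ID"}  # assumed sources
mapping = {                                               # target -> source
    "fact_ledger.account_id": "GL.ACCT_ID",
    "fact_ledger.amount":     "GL.AMT",
    "dim_vendor.vendor_id":   "AP.VENDOR_ID",
}

def unmapped(mapping: dict, source_fields: set) -> list:
    """Return target fields whose claimed source does not exist."""
    return [t for t, s in mapping.items() if s not in source_fields]

print(unmapped(mapping, source_fields))   # empty list = full coverage
```

In practice this check lived in the mapping spreadsheet reviews rather than code; the point is that every target column needs a verifiable source before expected results can be built.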
- Joined new corporate EDW 6 months after first data was loaded into data silos. Mission: construct 300 records to push through the new DW ETL design so that, with results in 3 minutes, we could know if the new ETL was mostly correct. Tested first BusObj reports against DW data. Was asked to convert to employee. Recorded / tracked defects in HP QC.
- Supported Sarbanes / Oxley (SOX) implementation from 2002 on, ensuring data compliance and that reports could withstand audit (developed audit processes, documentation). Supported currency and currency conversions based on store location (e.g., France, UK, Japan, US, Canada) and MSTR report destination (or user need, such as all Corp reports being in USD). Supported PO system currency conversion depending on “the need” (vendors’ local currency, customs, inventory valuation depending on the user, and final merchandise destination). Tested VAT (value-added tax) based on location requirements. Tested POS hardware for support to early chip-and-pin cards (using FR and UK-configured “stores”). Coordinated EDW change management (patches, upgrades, deployments).
- Updated, expanded and created new data models as needed (using Erwin); took over ownership of orphaned data models, including FFRED (First Financial Retail Enterprise Data).
- Proactively “gathered” metadata from observations, listening, tasks and learning; documented same. Socialized same to team. Gathered / organized legacy and mainframe documentation.
- Managed EDW software vendors and licensing (MicroStrategy, Business Objects; beta census data mapping overlay); tracking via VISIO and Excel. Upstream application tracking, especially SORs (“System of Record”) tables and/or fields (via Excel).
- Mined data (via ad hoc SQL on Teradata, Oracle) for problems within and across data silos, including financial systems, sales, inventory, Purchase Orders, leasing, and selected HR; including proactive and responsive querying for questions and new possible data uses.
- Supported initial deployment of DataStage ETLs (created, reviewed requirements; tested results) as EDW converted from hand-coded SQL ETL to DataStage. Participated in ETL design reviews and testing on several projects, including Oracle Financials to the EDW, and implementing “store count” solution into weekend processing.
- Developed custom extracts for Corp Finance teams’ modeling needs (gathered their requirements, reviewed their planned models).
- Tested requirements and first deployment of Oracle Retail Integration Bus (RIB); tested the retooled effort.
- Led off-shore QA team of 34 in verifying the transition from a home-grown to the Oracle RETEK store inventory system.
- Conducted statistical analysis of the magnitude of impact of “data issues” between and within data silos.
- Maintained selected logical and physical data models (Erwin); researched and recommended changes to support new functionality.
- Participated in or led small projects (to $600K). SharePoint “admin” for own projects and other PMs (load, move, edit, organize files).
- Created data for special UAT and performance / load testing; participated in partition planning. Validated view creation; supported retooling of MicroStrategy reports to use views.
- Point-of-Sale (POS) testing lead (1 of 3 leads), supporting cash register deployments to brands in five languages and six currencies (end-to-end testing through 57 systems to EDW). Gap analysis of test coverage. Created detailed QC regression test scripts for automated testing.
- Gathered business requirements (based on 1-1/4 page “spec”) for 80% remake of custom web-capture program (worked in 14 languages). Interviewed user staff and key management; led requirements reviews. Wrote up all requirements and changes. Created data dictionary for updated system. Mentored junior staff to take over, post-deployment. Participated in UX (user interface) design, testing. Coordinated change management.
- Participated in design and testing of updated ETL for new fields (from internet page sources).
- Developed and executed detailed testing (including deployment). Engaged user staff for testing in all 14 languages.
Confidential, Emeryville, CA
Business Analyst/Technical Writer
Responsibilities:
- Analyzed player activity and customer surveys via ad hoc SQL (Oracle), including performance tuning. Gathered “requirements” from gamers’ emails, synthesizing their issues and suggestions. Profiled game play (via ad hoc SQL) for insights for developers. Identified contest winners (via ad hoc SQL); administered contests. Reported interstitial ad viewing for billing to advertisers. Rough-drafted data models of game-play schemas in support of requesting changes to better support game analysis and reporting.
- Coordinated with developers to address player complaints (emails, phone calls). Provided game play insight to developers.
- Gathered and documented training and documentation requirements in prep for “slash/cut” deployment of new Windows-based customer service interface (2,500 users; 314 systems; $25M budget within $250M project). Participated in UX (user interface) design, testing. Coordinated change management re training and documentation.
- Led team of nine writers in gathering requirements and technical content for development of all user reference content for Foster Children modules (312 tables), including single-source online documentation. Developed content from business rules from client staff (written in English and Visual Basic); worked with client staff to resolve discrepancies, imprecision of rules, screen/UI layouts, and icons for modules and functions. Drafted displayable interface prototype in Visual Basic to help client staff understand what they would be getting (there was no working prototype). Participated in UX (user interface) design, testing. Ensured compliance with legal and child protection standards.
- Joined Instructional Design team to produce training materials; traveled to counties. Developed Train-the-Trainer materials (including application and hardware troubleshooting).
- Joined Training Team to deploy software (install and test training version of software on client sites); supported client’s IT staff in supporting the application and backups. Trained traveling training staff of 10; supported them by phone when they were on training sites.
- Managed training enrollment of 18,000+ staff, including publication/distribution of printed and online documentation to training sites in time for specific training sessions.
- Customer face for internal Help Desk to 800+ users worldwide. Supported PCs (hardware, software) and telephone systems.
- Survey design, distribution, collection, and data processing (including synopses of comments) for PW customers. Application and hardware tracking via Lotus 1-2-3.
- Coordinated field training staff and franchisees’ use of corporate training centers. Ordered/tracked supplies for training centers, inventory management, training center sales. Tracked training staff (got them paid on time, restocked training centers ahead of classes).
- Tested, revised requirements for, deployed, then maintained IVR (interactive voice response) system for enrolling stylists in haircutting academy, including “stylist documentation” appropriate to the user community, generating $1M/yr. Generated franchisee invoices; took store staff and franchisee calls / resolved issues. Participated in UX (user interface) design, testing. Created, drafted, solicited / edited content for, and distributed monthly trainer stylist newsletter (in COSMOPOLITAN vein).
Confidential,San Rafael, CA
Computer Consultant / Owner
Responsibilities:
- Gathered hardware and software requirements for 100+ businesses in support of conversion from paper/manual systems to (and upgrading of) computer systems; supported conversion from manual credit card payments to swiped debit/credit card payments for retail customers. Learned clients’ needs and shopped on behalf of clients for appropriate applications, printers, and components; deployed/installed hardware, software, patches, and upgrades as needed. Supported clients in developing new business processes to make efficient use of computers. Supported transactions and bookkeeping in USD and Mexican Pesos.
- Clients included private investors (including the former owner of PSA, now Southwest Airlines), mom-and-pop retail, construction, chambers of commerce, wine factors, hotels, auto dealerships, banks, non-profits, and more. See CHRON resume for full list.
TECHNICAL SKILLS
SQL (ad hoc: Netezza 4.8; Oracle 10g and 11g; Teradata; SQL Server T-SQL; Informix; DB2); TOAD 10.5; SQL Assistant 13.11 (“Queryman”); T-SQL (2000, 2005, 2008, 2010, 2012, 2013) on SQL Server (as user); SQL Server Visual Studio (2010).
Word 7/16; Excel 7/16; Visio 7/16; SSIS 2005/2008 (troubleshooting); Access (data cleansing; no programming); PowerPoint.
Reporting: Tableau; SAP Business Objects (BusObj) Enterprise XI 12.1.0, Universe Designer 12.3 (hand-coded universes because of unusable foreign keys), security access, Web Intelligence (WebI) report design / QA; MicroStrategy (report QA; repurposing MicroStrategy SQL into ad hoc); Crystal Xcelsius (for dashboards).
Accounting: Great Plains, Mas-90, DataMar, Quicken, Computer Associates (GL, AP, AR).
Other: BOARD (10.1). CA PPM (“CLARITY”) v14.3 by Computer Associates; ERWin; ChartRunner; QC (Mercury QualityCenter); Embarcadero Studio; Remedy 6.03, Confluence / JIRA; Bugzilla; Subversion (SVN); Trackwise; Lync; WebEx. InfoGain (RX sales version). Ad hoc data querying of HealthConnect; Snag-IT; Snipping Tool. Manaton (property tax mgmt); Aumentum; C-Track.