
ETL Developer Resume Profile


Career Overview

Developed software systems through the complete SDLC, in multiple application areas (virtual electrical power trading, database marketing, cellular phone internal/external customer support, biotech quality control). These systems included both client-server DB (Oracle, MySQL, Informix, Sybase) applications and real-time ETL data acquisition applications. Developed both data entry and system configuration GUI front-ends with Delphi and Visual Basic, and their associated back-end Unix ETL scripts (Python, Bash, Perl, Korn). Able to logically analyze, design, implement, test, debug, maintain, and document robust, structured, event-driven, object-based DB and real-time ETL data acquisition application systems.

  • Programming Languages: Python, Perl, Unix shell scripting, Oracle PL/SQL, Visual Basic (VB 4-6), Delphi (i.e. Object Pascal), Java, C/C++, VAX/DCL
  • Relational Database Management Systems: Oracle, MySQL, Informix, ODBC, ADO, DBI/DBD, Access, VAX/Rdb, Sybase
  • Operating Systems: Microsoft Windows (NT, 2000, 98, 95, 3.1, MS-DOS), Unix (Linux, Cygwin, SunOS, Solaris, Digital Unix, IBM AIX), VAX/VMS
  • Hardware: Intel x86 CPUs, IBM RS/6000, DEC VAX/mVAX

Experience

Confidential

Unemployed.

Confidential

  • Developed and maintained mostly Python and some Perl ETL scripts to scrape data from external web sites and load cleansed data into a MySQL DB. Did some Java and MATLAB (ML) code development and maintenance. The data was used for daily electrical power virtual trading activities in several markets.
  • The original Perl scripts were monolithic, with no abstracted functionality. Developed shared Perl and Python modules to encapsulate common functionality. These modules' functions were used as modular building blocks (like LEGOs) and allowed for a more structured script architecture. The structured script architecture, in turn, allowed for faster development of new scripts and easier maintenance of deployed scripts (see the module sketch after this list).
  • Played the role of a MySQL DB developer/administrator. Created and maintained user profiles. Monitored DB SQL statements and their timings for performance concerns (optimized long-running queries, minimized table locks, etc.). Analyzed and did informal physical modeling of raw data, then created sufficiently normalized corresponding MySQL DB objects (MyISAM tables, primary keys, etc.). The original table design used the FLOAT type to store price data; subsequently switched to the DECIMAL type (see the sketch after this list).
  • Occasionally provided Java and ML code development and maintenance. Developed Java code to upload bid submission files to external web sites behind SSL certificates; the Java file upload program was invoked by the traders via the ML IDE. Developed ML GUIs based on the MVC (Model-View-Controller) architecture to separate the presentation, computation, and data. Enhanced the ML cache-build code (DB price data was cached to an ML file for speed) by optimizing the DB queries, and added DB-versus-ML-cache checksums to detect inconsistencies. Additionally, moved some ML code functionality to back-end Python scripts; some of the ML code depended on the sequence in which traders pushed buttons, and was better performed by the back end.
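
Below is a minimal Python sketch of the kind of shared ETL helper module described in the bullets above; the module name, function names, and the assumption of a DB-API MySQL connection (e.g. PyMySQL) are illustrative, not the actual production code.

    # etl_common.py - hypothetical shared helpers used as LEGO-like building blocks by ETL scripts.
    import csv
    import io
    import urllib.request

    def fetch_url(url, timeout=60):
        """Download a source file from an external web site and return it as text."""
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.read().decode("utf-8", errors="replace")

    def parse_csv(text):
        """Parse CSV text into a list of row lists."""
        return list(csv.reader(io.StringIO(text)))

    def load_rows(conn, table, columns, rows):
        """Insert cleansed rows into a MySQL table via an existing DB-API connection."""
        placeholders = ", ".join(["%s"] * len(columns))
        sql = "INSERT INTO %s (%s) VALUES (%s)" % (table, ", ".join(columns), placeholders)
        with conn.cursor() as cur:
            cur.executemany(sql, rows)
        conn.commit()

A scraper script then becomes a short composition of such calls, e.g. load_rows(conn, "hourly_price", ["market", "read_hour", "price"], parse_csv(fetch_url(url))), which is what made new scripts faster to write and deployed scripts easier to maintain.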
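
The float-versus-DECIMAL change mentioned above can be illustrated with a short Python/SQL sketch; the table and column names are hypothetical.

    from decimal import Decimal

    # Binary floats cannot represent most decimal prices exactly, so totals drift;
    # Decimal values (like MySQL's DECIMAL column type) stay exact.
    float_total = sum([0.10] * 3)                            # 0.30000000000000004
    exact_total = sum([Decimal("0.10")] * 3, Decimal("0"))   # Decimal('0.30')

    # Hypothetical DDL for the reworked price table:
    DDL = """
    CREATE TABLE hourly_price (
        market    VARCHAR(16)   NOT NULL,
        read_hour DATETIME      NOT NULL,
        price     DECIMAL(10,4) NOT NULL,  -- originally FLOAT
        PRIMARY KEY (market, read_hour)
    ) ENGINE=MyISAM;
    """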

Confidential

  • At Jump Power for TEKsystems, developed Perl ETL scripts to scrape data from external electrical power marketing websites and populate a MySQL DB. The data was used in daily electrical power virtual trading activities.
  • The Perl ETL scripts used the LWP module to scrape file location info and then get the files (CSV, XLS, or zipped versions of the preceding) from external web sites. All the files were transformed to canonical CSV arrays: they were unzipped if necessary, parsed from XLS using the Spreadsheet::ParseExcel module if necessary, and finally transformed to a canonical CSV array. The CSV array was either written to a file and loaded directly via the mysqlimport utility, or, via the DBI module, its records were inserted, updated, or replaced as necessary. A checksum of the CSV array's numerical values was computed and checked against SQL query checksums to ensure the CSV values had been correctly populated into the DB (see the checksum sketch below).
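
The CSV-versus-DB checksum step can be sketched as follows (in Python for brevity; the production scripts were Perl). The table, column names, and tolerance are assumptions.

    import math

    def csv_checksum(rows, numeric_columns):
        """Sum the numeric columns of the in-memory CSV array (a list of dicts)."""
        return sum(float(row[col]) for row in rows for col in numeric_columns)

    def db_checksum(conn, table, numeric_columns):
        """Compute the matching total with a SQL aggregate over the loaded table."""
        expr = " + ".join("SUM(%s)" % c for c in numeric_columns)
        cur = conn.cursor()
        cur.execute("SELECT %s FROM %s" % (expr, table))
        (total,) = cur.fetchone()
        return float(total)

    def verify_load(conn, table, rows, numeric_columns):
        """Raise if the DB totals do not match the source CSV totals."""
        src, dst = csv_checksum(rows, numeric_columns), db_checksum(conn, table, numeric_columns)
        if not math.isclose(src, dst, rel_tol=1e-9):
            raise RuntimeError("checksum mismatch: csv=%r db=%r" % (src, dst))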

Confidential

Unemployed.

Confidential

Worked with the account team to meet customer needs by developing and/or customizing technical solutions that provide data files to 3rd-party vendors to drive the customer's multi-channel campaigns. Utilized business insight, as well as technical and data processing knowledge, to interpret campaign requirements and translate them into specific technical solutions for the campaign process. Developed solutions with Perl, Oracle SQL, Oracle PL/SQL, PL/SQL Developer, Oracle SQL Loader, and Unix shell scripting on both Windows (via Cygwin) and Unix operating systems. Also provided daily production support for the campaign processes: monitored the processes, analyzed the root cause of process problems and implemented their solutions, and implemented process change requests.
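
As one hedged illustration of the campaign extract work (the production solutions were Perl, SQL, and SQL Loader; this Python sketch, its query, table, and file naming are assumptions), a vendor data file step might look like:

    import csv
    import datetime

    def export_campaign_file(conn, campaign_id, out_dir="."):
        """Pull one campaign's audience from Oracle and write a pipe-delimited vendor file."""
        cur = conn.cursor()
        cur.execute(
            "SELECT customer_id, first_name, last_name, email, segment "
            "FROM campaign_audience WHERE campaign_id = :cid",   # hypothetical table
            {"cid": campaign_id})
        header = [col[0].lower() for col in cur.description]
        stamp = datetime.date.today().strftime("%Y%m%d")
        path = "%s/campaign_%s_%s.txt" % (out_dir, campaign_id, stamp)
        with open(path, "w", newline="") as fh:
            writer = csv.writer(fh, delimiter="|")
            writer.writerow(header)
            writer.writerows(cur)   # stream rows straight from the cursor
        return path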

Confidential

  • At the Acxiom Corp. for TEKsystems, worked on an Oracle data warehousing project. Developed, maintained, and unit tested Unix shell scripts (with embedded Perl regex one-liners) for moving inbound source data feed files to an archive directory, pre-processing the inbound files, invoking a data file auditing utility, submitting the audited data file to an identification and cleansing process, invoking the Oracle SQL Loader utility to load cleansed data into staging tables, and calling PL/SQL stored procedures to do process logging and data transforms from staging tables into final target OLAP DB tables. The hierarchy of Unix shell scripts was a set of well-structured application programs with robust error exit handling and sanity checks for Input-Process-Output (IPO) record count matching, record length verification, and correctly embedded file ID verification (see the record-count check sketch after this list). Also performed inbound and in-process data analysis with Unix utilities (tr, grep, Perl, regular expressions, etc.), and wrote system operator procedure documentation. Additionally, performed DB data checking with ad-hoc SQL queries.
  • Developed a script execution scheduler (an "execute past due, then re-schedule ahead" scheduler system) with an Oracle package and supporting objects (tables, referential constraints, sequences, triggers, etc.), a Perl schedule driver with DBI interfaces to the Oracle package, and a Unix shell wrapper script to invoke execution of the scheduled scripts and enter the scripts' state transitions into the database via Unix shell function interfaces. The system checked for the existence of input file dependencies and launched scripts based on an English-like schedule syntax (e.g. "monthly on 7th busday at 12:00"); see the scheduler sketch after this list.
  • Developed a Perl script and its inverse to transform a one-dimensional report (1D, linear, hourly metric readings one per line; e.g. 136,353 lines) into a two-dimensional layout (2D, matrix, up to 256 metric readings per line; e.g. transformed to 727 lines with 191 metric readings per line). The original metric reports (CSV files) were used as data inputs for Excel graphs, but 32-bit Excel was limited to opening CSV files with at most 64K rows and 256 columns (a typical input file would have 100K rows and one column: one reading per hour). The Perl script would transpose and segment the 1D format into a 2D format with up to 256 metrics per segment. For unit testing, also developed the inverse Perl script that transforms the 2D layout back to the original 1D format (see the transpose sketch after this list).
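
A minimal Python rendering of the Input-Process-Output record-count sanity check from the first bullet (the production checks were shell functions with Perl one-liners; the file roles here are assumptions):

    def count_records(path):
        """Count the records (lines) in a flat data file."""
        with open(path, "r") as fh:
            return sum(1 for _ in fh)

    def check_ipo_counts(inbound_path, rejected_path, loaded_count):
        """Every inbound record must be either loaded or rejected, or the run is aborted."""
        inbound = count_records(inbound_path)
        rejected = count_records(rejected_path)
        if inbound != loaded_count + rejected:
            raise RuntimeError(
                "IPO record count mismatch: inbound=%d loaded=%d rejected=%d"
                % (inbound, loaded_count, rejected))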
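
A rough Python sketch of the "execute past due, then re-schedule ahead" loop from the scheduler bullet; the real driver was Perl/DBI calling an Oracle package, and this schedule table, its columns, and the day-based rescheduling are simplifying assumptions (parsing the English-like "monthly on 7th busday at 12:00" syntax is omitted).

    import datetime
    import subprocess

    def run_due_scripts(conn, now=None):
        """Execute every past-due schedule entry, then push its next-run time ahead."""
        now = now or datetime.datetime.now()
        cur = conn.cursor()
        cur.execute(
            "SELECT sched_id, script_path, interval_days "
            "FROM script_schedule WHERE next_run <= :now",       # hypothetical table
            {"now": now})
        for sched_id, script_path, interval_days in cur.fetchall():
            rc = subprocess.call(["/bin/sh", script_path])        # wrapper records state changes
            cur.execute(
                "UPDATE script_schedule "
                "SET last_status = :rc, next_run = next_run + :days "
                "WHERE sched_id = :sid",
                {"rc": rc, "days": interval_days, "sid": sched_id})
        conn.commit()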
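
A compact Python version of the 1D-to-2D transform and its inverse from the last bullet (the originals were Perl); this is a simplified fold into rows of up to 256 readings rather than the exact transpose-and-segment layout.

    import csv

    def reshape_1d_to_2d(in_path, out_path, per_row=256):
        """Fold a one-reading-per-line report into CSV rows of up to per_row readings."""
        with open(in_path, "r") as fh:
            readings = [line.strip() for line in fh if line.strip()]
        with open(out_path, "w", newline="") as fh:
            writer = csv.writer(fh)
            for i in range(0, len(readings), per_row):
                writer.writerow(readings[i:i + per_row])

    def reshape_2d_to_1d(in_path, out_path):
        """Inverse transform, used to unit test the round trip."""
        with open(in_path, "r") as fh, open(out_path, "w") as out:
            for row in csv.reader(fh):
                for value in row:
                    out.write(value + "\n")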

Confidential

At The Clearing Corporation for Encore Consulting Services, worked on a series of mini-projects to migrate several sub-systems from Sybase to Oracle. Developed, maintained, and tested Unix shell and Perl DBI/DBD ETL scripts. Converted Sybase-formatted SQL to Oracle formats: mostly Sybase implicit date-time conversions to Oracle explicit conversions, and some Sybase floating-point currency exchange rate implicit conversions to Oracle numeric conversions. The DB structure and stored procedure objects were handled by the Oracle Migration Workbench (OMW). The mini-projects were the migration/conversion of about 125 Unix shell/Perl ETL scripts that invoke SQL statements, which had to be manually converted, while the stored procedures had to be corrected and tested.
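
An illustrative Python sketch of the mechanical part of that SQL rewrite (the actual conversion used Perl/shell plus manual review); the MM/DD/YYYY format and the regex are assumptions and would not cover every Sybase idiom.

    import re

    # Sybase compares string literals against datetime columns implicitly, e.g.
    #   WHERE trade_date = '01/07/2004'
    # while Oracle needs an explicit conversion:
    #   WHERE trade_date = TO_DATE('01/07/2004', 'MM/DD/YYYY')
    DATE_LITERAL = re.compile(r"'(\d{2}/\d{2}/\d{4})'")

    def sybase_dates_to_oracle(sql):
        """Wrap bare MM/DD/YYYY string literals in explicit TO_DATE() calls."""
        return DATE_LITERAL.sub(r"TO_DATE('\1', 'MM/DD/YYYY')", sql)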

Confidential

  • Developed and maintained customizations to ILNB's (the IL North Branch bankruptcy court's) Case Management (CM) and Electronic Case Files (ECF) systems. Developed and maintained Perl and Unix shell scripts for monitoring and reporting on Informix DB status (lock monitoring, log buffer space usage, etc.), Solaris system status (disk space usage, large PDFs, etc.), and CM/ECF status (EDI file well-formed tag checks, etc.).
  • Supported and maintained the legacy Records Management System (RMS) for tracking bankruptcy cases' associated barcoded (i.e. case file number) physical file folders. RMS was a Visual FoxPro (VFP) GUI-IDE-DB for tracking the location of case file folders within the court, and their archived location at the Federal Records Center (FRC). Developed and maintained Python ETL programs for analyzing and correcting record shipment barcode scanner raw data (e.g. up to 150 box numbers and up to 8000 case file numbers). These Python ETL programs allowed for up-front corrections of scanner data before importation of the corrected data into VFP DB tables, versus loading an erroneous scanner raw data dump into RMS and then using the RMS VFP GUI to manually make tedious corrections (delete erroneous records, re-enter records, correct records, etc.). Also developed a Python program that cross-checks the corrected scanner data (flat files, and VFP DB tables via ADO after importation) against the Informix DB (via ODBC) for up-front identification of case file folders that should be added or removed. Additionally, provided analysis and solutions for IT help desk calls (e.g. malfunctions of the barcode label printer and barcode scanner; corrupt PDFs and floppies; PC dual monitor problems; CM/ECF slowdowns and file upload problems; etc.).
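
A condensed Python sketch of the cross-check described above. The production program compared corrected scanner flat files and VFP tables (via ADO) against Informix (via ODBC); here both sides are reduced to set comparisons, and the file layout, table, and query are assumptions.

    def load_scanner_case_numbers(path):
        """Corrected scanner flat file: one barcoded case file number per line."""
        with open(path, "r") as fh:
            return {line.strip() for line in fh if line.strip()}

    def load_informix_case_numbers(odbc_conn, box_number):
        """Case file numbers Informix says belong in the given shipment box."""
        cur = odbc_conn.cursor()
        cur.execute(
            "SELECT case_number FROM case_folders WHERE box_number = ?",   # hypothetical table
            (box_number,))
        return {str(row[0]).strip() for row in cur.fetchall()}

    def cross_check(scanner_set, informix_set):
        """Return folders to add to RMS and folders that should be removed."""
        return sorted(scanner_set - informix_set), sorted(informix_set - scanner_set)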

Confidential

Developed several client-server (Oracle DB) data-entry and presentation GUI business application systems for call-center internal customer support of external customers, with VB and Delphi, through the complete SDLC. Additionally, developed a multi-threaded real-time Delphi data acquisition application: the first thread packaged a serial communications port data stream into raw report files, while the second thread used FTP (via a Delphi class encapsulation of the WinInet API's FTP functionality) to upload the raw report files to a remote Unix host. Also designed and developed the associated Unix Perl back-end (background) ETL processing scripts that transformed raw report files into CSV files for database loading. The Perl process was a sophisticated, flexible regular expression pattern matching algorithm that was able to recognize several different report types and apply the appropriate transformation. Developed, maintained, and supported several associated Unix shell, Perl, SQL, SQL*Plus, SQL*Loader, and PL/SQL application systems for ETL-ing raw data streams into CSV loader and report presentation formats.
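
A Python sketch of the report-recognition idea behind those Perl back-end scripts; the report types, signature patterns, and field layouts are invented for illustration only.

    import csv
    import re

    # Each report type is identified by a signature regex near the top of the raw file,
    # plus a line-level regex that captures the fields to emit as CSV.
    REPORT_TYPES = [
        ("call_summary", re.compile(r"^CALL SUMMARY REPORT"),
         re.compile(r"^(\d{10}),(\d{4}-\d{2}-\d{2}),(\d+)$")),
        ("fault_log", re.compile(r"^FAULT LOG"),
         re.compile(r"^(\w+)\s+(\d{2}:\d{2}:\d{2})\s+(.*)$")),
    ]

    def transform_report(raw_path, csv_path):
        """Recognize the raw report type, then rewrite its matching lines as CSV."""
        with open(raw_path, "r") as fh:
            lines = fh.read().splitlines()
        for name, signature, line_re in REPORT_TYPES:
            if any(signature.search(line) for line in lines[:5]):
                break
        else:
            raise ValueError("unrecognized report type: %s" % raw_path)
        with open(csv_path, "w", newline="") as out:
            writer = csv.writer(out)
            for line in lines:
                match = line_re.match(line)
                if match:
                    writer.writerow(match.groups())
        return name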

Confidential

Developed and maintained AIX-platform Oracle Pro*C programs (with embedded Oracle queries), stored PL/SQL packages, and SQL*Forms that extracted data from the DB and transformed it into ASCII description files for input into Word Basic programs, which then transformed the description files into PostScript image files used as plates for mass printing of multilingual documents. Developed and maintained Oracle Pro*C programs to extract data from an AIX Oracle DB and transform it into ASCII input for M.S. Word VBA programs for production of internal system specification documents. Also designed and developed an M.S. Access database and VBA programs to prototype the next-generation production of internal specification documents. Maintained SAS data ETL applications. Developed and maintained Unix shell scripts to automate data transfer from remote AIX hosts, via a local AIX proxy host, to a remote VAX host. Developed and maintained DB operator data processing VAX DCL scripts. Developed a VB5 application that extracted new system configurations from a DB (Oracle or Access), transformed them into flat-file ASCII configurations, FTP-downloaded files (previously defined system configurations) from a VAX, appended the new system configurations onto the FTPed files, and FTP-uploaded the updated system configuration files to the VAX system. These application systems were used to support the microbiologist who managed the disposable product manufacturing and quality control of several automated microbiological bacterial identification and susceptibility testing medical devices.
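
A hedged Python outline of the VB5 configuration-transfer flow described above (extract, flatten to ASCII, FTP download, append, FTP upload); the host, file name, table, and record layout are assumptions.

    import ftplib
    import io

    def push_new_configs(db_conn, host, user, password, remote_file="SYSCONF.DAT"):
        """Append newly defined system configurations to the VAX-side configuration file."""
        # 1. Extract new configurations from the DB and flatten them to ASCII lines.
        cur = db_conn.cursor()
        cur.execute("SELECT config_id, config_text FROM new_system_configs")   # hypothetical
        new_lines = ["%s|%s" % (cid, text) for cid, text in cur.fetchall()]

        ftp = ftplib.FTP(host)
        ftp.login(user, password)
        # 2. Download the previously defined configurations.
        existing = []
        ftp.retrlines("RETR " + remote_file, existing.append)
        # 3. Append the new configurations and upload the updated file.
        merged = "\r\n".join(existing + new_lines) + "\r\n"
        ftp.storbinary("STOR " + remote_file, io.BytesIO(merged.encode("ascii")))
        ftp.quit()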
