Sr. Software Engineer Resume
Reston, VA
CAREER SUMMARY:
- Over 10 years of IT experience across all phases of the SDLC, including analysis, design, coding, testing, and deployment.
- Extensive experience with ETL tools such as Informatica (9 years) and Talend (1 year), designing and developing complex Mappings, Mapplets, Transformations, Workflows, and Worklets, configuring the Informatica Server, and scheduling workflows and sessions.
- Extensive experience in designing and developing complex mappings using transformations such as Unconnected and Connected Lookup, Normalizer, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, and Update Strategy.
- Extensive experience with Workflow Manager and Workflow Monitor, scheduling ETL jobs in Informatica using PMCMD, and handling repository creation/backup/restore and repository user/role/group administration using PMREP (a sample command-line sketch appears after this summary).
- Extensive experience in performance tuning of targets, sources, mappings, and sessions, coordinating with DBAs and UNIX administrators for ETL tuning.
- Extensively worked on AWS services such as S3, DynamoDB, Lambda, Step Functions, CloudWatch Logs, EMR, and EC2 instances.
- Extensively worked in Client-Server application development using Oracle 9i/8i, PL/SQL and SQL*Plus.
- Extensively worked with Teradata utilities such as BTEQ, MLOAD, FLOAD, and TPUMP.
- Experience with CRM tools such as Talisma.
- Experience in writing SQL queries, Stored Procedures, Cursors, and Cursor variables.
- Extensively worked with Change Data Capture (CDC).
- Demonstrated ability to quickly grasp new technical and business concepts and apply them as needed.
- Worked with multiple scheduling and change-management tools such as Tidal, TAC Scheduler, Appian, Harvest Change Manager, and Tortoise SVN.
- A team player with excellent analytical, verbal, and written communication skills.
- Excellent interpersonal skills.
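The PMCMD/PMREP usage summarized above can be illustrated with a minimal shell sketch; the domain, service, repository, folder, and workflow names below are placeholders rather than values from any specific engagement.

    #!/bin/sh
    # Illustrative pmcmd/pmrep usage; all names are placeholders and credentials
    # are assumed to be exported as INFA_USER / INFA_PASS.
    DOMAIN=Domain_Dev
    INT_SVC=IS_Dev
    REPO=REP_Dev
    FOLDER=SALES_DW
    WORKFLOW=wf_load_daily_sales

    # Start a workflow and wait for it to complete (non-zero exit on failure).
    pmcmd startworkflow -sv "$INT_SVC" -d "$DOMAIN" \
        -u "$INFA_USER" -p "$INFA_PASS" \
        -f "$FOLDER" -wait "$WORKFLOW"

    # Connect to the repository and take a dated backup with pmrep.
    pmrep connect -r "$REPO" -d "$DOMAIN" -n "$INFA_USER" -x "$INFA_PASS"
    pmrep backup -o "/backups/${REPO}_$(date +%Y%m%d).rep"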
TECHNICAL SKILLS:
Data Warehousing: Informatica Power Center 10.x/9.x/8.x/7.x (Source Analyzer, Warehouse Designer, Metadata Manager, Mapping Designer, Mapplets, Transformations, Workflow Manager, Worklets, Workflow Monitor), Talend (Components, Mappings, Workflows)
Data Modeling: Logical and Physical data modeling, Star-Schema Modeling, Snowflake Modeling, FACT and dimension tables
Databases, Files & GUI: Oracle, Teradata, SQL Server, IBM DB2, Flat files, XML/XSD files, JSON, CSV, and TXT files
Operating Systems: MS Windows XP, 98, Windows NT/2000, NT 4.0
CRM: Talisma DMU, Talisma Client
Languages: PL/SQL, SQL*Plus 8, UNIX Shell Script
Cloud tools: AWS Console (Step Functions, Lambda, EMR, Simple Queue Service (SQS), CloudWatch, S3, DynamoDB, Athena, CodeBuild)
Tools: SQL*Plus 8, FTP tools, BTEQ, MLOAD, FLOAD, and TPUMP
PROFESSIONAL EXPERIENCE:
Confidential, Reston, VA
Sr. Software Engineer
Responsibilities:
- Design, develop, test, and implement software (database triggers and stored procedures) that works within complex databases to insert, update, delete, and report on millions of records efficiently (query optimization, parameterized queries) and accurately across various use cases (the different ways in which data needs to be accessed).
- Design, develop, implement, and support queries (database access logic) against relational databases such as Oracle, Teradata, DB2, and SQL Server, writing complex queries in SQL and PL/SQL.
- Provide and implement technical recommendations and options based on software and solution designs that can be cost-effectively realized in the production environment.
- Involved in the Software Development Life Cycle (SDLC), including documentation of design, testing, and migration from development to production.
- Followed Agile methodology with bi-weekly sprints and all Agile ceremonies.
- Extensively worked on AWS products such as S3, Lambda, Step Functions, DynamoDB, EMR, CloudWatch, Athena, EC2, CodeBuild, and SQS (a sample AWS CLI sketch appears after this role's environment line).
- Experience working with Continuous Integration/Continuous Deployment (CI/CD) and Gitbucket.
- Daily interaction with clients and business analysts for requirement gathering, clarifications, and suggestions on data integration within the software.
- Understanding and implementation of organizational strategies, policies, and procedures related to functional requirements of the software by developing documentation, flowcharts, layouts, diagrams, charts, and code.
- Worked on creating JSON files for AWS and using them to generate files per client requirements.
- Responsible for design, development, and implementation of ETL mappings using Informatica, enhancements of production loads and recovery, and Informatica administration. This involves designing software procedures by which large amounts of data can be accessed from various complex databases and data sources.
- Responsible for resolving software implementation issues and documentation of solutions for complex workflows.
- Monitoring software operations, including loads, for high performance and proactively managing the day-to-day health checks of various jobs and processes.
- Develop and support complex UNIX shell scripts to automate processes, build and deploy databases and associated software into production, and schedule jobs using scheduler tools such as Appian, Flux, and DataFlux.
- Worked on creating DynamoDB tables.
- Worked on AWS S3 to load/transfer files.
- Provided ETL unit testing, automation testing, and file comparisons, and designed and developed reconciliation scripts.
- Install and ensure that the installed software runs in the given technical environment. Maintain software tools such as WebSphere MQ (middleware that enables communication between applications), Flux, Appian, Vertex, and Secure Transport (Tumbleweed), along with backup and recovery of data on the developed or customized software.
Environment: Informatica Power Center 10.2/10.1.1, AWS (S3, DynamoDB, Lambda, Step Functions, CloudWatch, EMR, EC2 instances), SQL Server 2016/2019, TOAD, Oracle 11g, PL/SQL Developer, DBeaver, Appian, Jira, Tortoise SVN, MS Office, Scrumban, Kanban
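A minimal shell sketch of the S3 and DynamoDB work described in this role, using the AWS CLI; the bucket, prefix, and table names are hypothetical.

    #!/bin/sh
    # Hypothetical bucket and table names, shown only to illustrate the pattern.
    BUCKET=my-etl-extracts
    TABLE=extract_status

    # Upload a generated extract to S3 and verify what landed under the prefix.
    aws s3 cp ./output/extract_20230101.json "s3://$BUCKET/extracts/"
    aws s3 ls "s3://$BUCKET/extracts/"

    # Create a simple DynamoDB table keyed on the extract id (on-demand billing).
    aws dynamodb create-table \
        --table-name "$TABLE" \
        --attribute-definitions AttributeName=extract_id,AttributeType=S \
        --key-schema AttributeName=extract_id,KeyType=HASH \
        --billing-mode PAY_PER_REQUEST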
Confidential, Rochester, NY
Sr. Informatica Developer
Responsibilities:
- Extensively worked on generating 1095-B XML files for IRS audit purposes.
- Involved in analyzing and resolving data issues related to Subscriber and Member data.
- Involved in review and finalizing the technical specs of source-target mapping documents.
- Involved in designing Error Handling strategy using Error detection/Notification/Re-Processing processes.
- Experience working with ETL against large-scale, multi-terabyte data volumes and databases.
- Involved in designing an audit process for the sessions loaded as part of the mapping design, as well as querying the Informatica repository tables (a sample repository audit query appears after this role's environment line).
- Involved in the data migration procedure and mapped data from legacy systems to the new systems, providing a design for data extraction and data loading.
- Involved in all data migration phases, including design, extraction, cleansing, loading, and verification.
- Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Sequence Generator, Update Strategy, Stored Procedure, and Normalizer.
- Involved in performance tuning Informatica mappings with XML, Normalizer, Joiner, Aggregator, Sequence Generator, Router, and Filter transformations.
- Used Workflow Manager for creating workflows, worklets, email tasks, and command tasks.
- Involved in scheduling the Informatica workflows using the Tidal Scheduler.
- Involved in tuning the mappings with audit, error-handling, and reprocessing strategies.
- Performed Unit Testing and Integration Testing on the mappings
Environment: Informatica Power Center 9.6.1, SQL Server, TOAD, Oracle 11g, PL/SQL Developer, Facets Data, XML, Tidal, Jira, ServiceNow (SNOW), MS Office
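The repository audit querying mentioned in this role could look like the following shell sketch, assuming the Informatica MX views (e.g., REP_SESS_LOG) are enabled in the repository database and Oracle credentials are exported as environment variables.

    #!/bin/sh
    # Assumption: the MX view REP_SESS_LOG is exposed in the repository schema and
    # ORA_USER / ORA_PASS / ORA_TNS point at the repository database.
    # Row counts per session for the last day's loads, for audit reconciliation.
    echo "SELECT subject_area, session_name, successful_rows, failed_rows
          FROM   rep_sess_log
          WHERE  actual_start >= TRUNC(SYSDATE) - 1
          ORDER  BY actual_start;" | sqlplus -s "$ORA_USER/$ORA_PASS@$ORA_TNS"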
Confidential, State College, PA
Sr. Application Developer
Responsibilities:
- Involved in analysis of three different source systems and prepared technical specifications for the ETL.
- Worked extensively as a liaison between World Campus, Undergrad Admissions, and the Bursar Department.
- Extensively worked on General Ledger files to be transformed according to Undergrad Admissions rules.
- Developed an Error/Exception handling mechanism to enhance the quality of data loaded to Talisma.
- Involved in enhancing the data model for newly added reference tables.
- Responsible for data migration to achieve an automated migration.
- Involved in all stages of data migration.
- Created complex mappings and workflows using components such as tMSSqlInput, tMSSqlOutput, tMap, tFileInputPositional, tFileInputDelimited, tJava, etc.
- Collaborated with other teams to create technical specifications for ETL/ELT processes based on functional specifications, and developed ETL/ELT jobs based on those specifications.
- Extensively worked with Talend Data Management and Power Connect to import sources from external systems.
- Created templates for error handling and error logging in Talend.
- Used Talend Data Management for creating mappings, workflows, email tasks, and command tasks.
- Involved in tuning Talend mappings and workflows by implementing parallelism, partitioning, and caching.
- Developed Java code to automate different tasks involved in the loading process and scheduling.
- The TAC (Talend Administration Center) scheduler was used for job scheduling (a sample run of an exported Talend job appears after this role's environment line).
- Performed Unit Testing and Integration Testing on the mappings
Environment: Talend Data Management 5.4, TAC Scheduler, SQL Server, Teradata 14.0, Teradata SQL Assistant and Administrator, Talisma DMU (Data Model Utility), Talisma Client, XML, Flat Files
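Outside of TAC, an exported Talend standard job can also be triggered from the shell through the launcher script Talend generates at build time; the job name, paths, and context parameters below are hypothetical.

    #!/bin/sh
    # Hypothetical exported-job name and context parameters; Talend generates a
    # <JobName>_run.sh launcher when a job is built as a standalone job.
    cd /opt/talend/jobs/GL_Load_0.1/GL_Load
    ./GL_Load_run.sh --context=Prod \
        --context_param inputFile=/data/feeds/gl_daily.csv \
        --context_param errorDir=/data/errors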
Confidential, Virginia Beach, VA
Sr. Informatica Developer
Responsibilities:
- Involved in analysis and resolving the data issues related to Customer, Distributor and Transaction increments of the data warehouse.
- Collaborated with other team members to create technical specifications for ETL processes based on functional specifications, and developed ETLs based on the technical specifications.
- Worked along with Business Analysts to gather requirements.
- Used existing Perl scripts that convert Excel files into the standardized format.
- Developed complex Informatica mappings and was also involved in performance tuning.
- Worked with the MLOAD, FASTLOAD, TPUMP, and BTEQ utilities of Teradata for faster loading and improved performance (a sample BTEQ/MLOAD invocation appears after this role's environment line).
- Extracted data from different sources such as CSV files, Oracle, SQL Server, and Teradata.
- Involved in creating mappings to get data from different sources to different targets.
- Created Stored Procedures, Triggers, Synonyms and Indexes at Database level.
- Involved in the Informatica migration from 9.1 to 9.5 and in unit testing all processes involved in the migration.
- Tidal was used as part of scheduling.
- Performed Unit Testing and Integration Testing on the mappings.
- Used Jira for change management and defect management for audit purposes.
- Involved in Data conversion of data from Facets database to staging database.
- Configured workflows with parameter files and used connection variables in the sessions.
- Configured parameter files to load the default values in the columns.
- Involved in performance tuning of different views with the Oracle and SQL Server DBAs.
- Created migration scripts to deploy code to different environments.
- Created complex SQL scripts to compare source data against target data.
- Involved in all stages of data migration.
- Used existing Perl scripts to automate different tasks involved as part of loading process and scheduling.
Environment: Informatica Power Center 9.5/9.1, SQL Server, TOAD, Teradata 14.0, Oracle 11g, PL/SQL, Teradata SQL Assistant and Administrator, Facets Data, XML, Tidal, Jira, Tortoise SVN, MS Office
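The Teradata utility work in this role can be sketched with a short shell script; the TDPID, schema, and table names are placeholders, and credentials are assumed to be exported as TD_USER / TD_PASS.

    #!/bin/sh
    # Quick reconciliation: compare the staging row count against the target table
    # via BTEQ (placeholder databases stg_db / edw_db and tables).
    printf '%s\n' \
        ".LOGON tdprod/$TD_USER,$TD_PASS;" \
        "SELECT 'STG' AS src, COUNT(*) FROM stg_db.customer_stg" \
        "UNION ALL" \
        "SELECT 'TGT', COUNT(*) FROM edw_db.customer_dim;" \
        ".LOGOFF;" \
        ".QUIT;" | bteq

    # Bulk-load a delimited file with MultiLoad from a prepared control script.
    mload < /scripts/mload_customer_stg.ctl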
Confidential, Camp Hill, PA
Sr. Informatica Developer
Responsibilities:
- Developed complex Informatica mappings to load the data from various sources using different transformations like source qualifier, connected and unconnected look up, expression, aggregator, joiner, filter, normalizer, rank and router transformations.
- Worked on Informatica power center tools like source analyzer, mapping designer and transformations.
- Developed Informatica mappings and also tuned for better performance.
- Worked with MLOAD, FASTLOAD, TPUMP and BTEQ utilities of Teradata for faster loading and to improve the performance.
- Developed Teradata views against the departmental database and the claims engine database to retrieve the required data.
- Developed application views in Teradata for loading data to Relational databases.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Generated XML targets to capture clinical data from flat files.
- Extensively worked on XML source and target feeds and loaded into Relational databases.
- Worked on MQ series data to provide information for doctor office reports.
- Created and Monitored Workflows using Workflow Manager and Workflow Monitor.
- Application views were built through the Harvest Change Manager tool in the desired schema in the Teradata warehouse and used as one of the sources for Informatica.
- Load balancing of ETL processes, database performance tuning and capacity monitoring.
- Involved in unit testing and system testing of the individual components.
- Used UNIX to create parameter files and for real-time applications (a sample parameter-file script appears after this role's environment line).
- Involved in developing various shell scripts.
- Worked on HIPAA and HL7 standards.
- Extensively involved in testing the system end to end to ensure the quality of the adjustments made to accommodate the source system upgrade.
- Involved in Preparing Technical Detail design documentation and Transition Reference guide thoroughly for production support team.
- Involved in Preparing Unit Test plans, Unit test cases and Unit test documentation.
- The Rational Unified Process (RUP) methodology was followed to develop the code.
Environment: Informatica Power Center 9.1.0/8.6.1/8.1, Teradata V2R12/V2R6, Oracle 10g, Teradata SQL Assistant and Administrator, IMS Data, XML, Harvest Change Manager, Rational ClearQuest, HIGH RUP (Confidential's Rational Unified Process), Mainframes and Windows
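The UNIX parameter-file creation mentioned in this role can be sketched as follows; the folder, workflow, session, and connection names are hypothetical, and the file follows the standard Informatica [Folder.WF:workflow.ST:session] parameter-file format.

    #!/bin/sh
    # Writes a session parameter file before the nightly run; all names are placeholders.
    PARAM_FILE=/infa/params/wf_claims_daily.par
    LOAD_DATE=$(date +%Y-%m-%d)

    {
        echo "[CLAIMS.WF:wf_claims_daily.ST:s_m_load_claims]"
        echo "\$\$LOAD_DATE=$LOAD_DATE"
        echo "\$DBConnection_SRC=Teradata_Claims"
        echo "\$DBConnection_TGT=Oracle_EDW"
    } > "$PARAM_FILE"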