Informatica/PostgreSQL Developer Resume
PROFESSIONAL SUMMARY:
- Over 6 years of experience in data architecture, data modeling, data development, and the implementation and maintenance of databases and software applications.
- Experience in designing Star and Snowflake schemas for Data Warehouse and ODS architectures.
- Experienced in writing and optimizing SQL queries in Oracle, SQL Server, DB2, Teradata, and MS Access.
- Experienced in Requirement gathering, System analysis, handling business and technical issues & communicating with both business and technical users.
- Experience in UNIX and UNIX Shell Scripting.
- Experienced in normalization (1NF, 2NF, 3NF, and BCNF) and denormalization techniques for effective performance in OLTP and OLAP environments.
- Experienced with Requirements gathering, interacting with users, analyzing client business processes, documenting business requirements and conducting Joint Application Design (JAD) sessions to discuss overall design patterns.
- Experienced in data architecture using SQL queries and data profiling tools.
- Experienced in designing data marts and creating cubes.
- Experienced in developing conceptual, logical, and physical database designs for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) systems (ODS, ADS, and Data Marts).
- In-depth knowledge of PostgreSQL architecture, database design, and relational database systems.
- Solid knowledge of Data Marts, Operational Data Store (ODS), OLAP, and Dimensional Data Modeling with the Ralph Kimball methodology (Star Schema and Snowflake Modeling for fact and dimension tables) using Analysis Services.
- Expertise in data architecture, data modeling, data migration, data profiling, data cleansing, transformation, integration, data import, and data export using ETL tools such as Informatica PowerCenter.
- Experienced with Client-Server application development using Oracle PL/SQL, SQL PLUS, SQL Developer, TOAD, and SQL LOADER.
- Strong experience architecting highly performant databases using PostgreSQL, MySQL, and MS Access.
- Extensive experience using ER modeling tools such as Erwin and ER/Studio, and with Teradata.
- Strong problem-solving and analytical skills, with an exceptional ability to learn and master new technologies efficiently.
- Extensive experience in Unix shell scripting and in scripting languages such as Perl and Ruby.
- Good knowledge of SQL queries and of creating database objects such as stored procedures, triggers, packages, and functions using SQL and PL/SQL to implement business logic.
- Experienced in technical consulting and end-to-end delivery, covering architecture, data modeling, data governance, and the design, development, and implementation of solutions.
- Worked on background processes in the Oracle architecture, drilling down to the lowest levels of systems design and construction.
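As a minimal illustration of the normalization work described above (all table and column names here are hypothetical), a denormalized design that repeats customer and product details on every order row can be decomposed toward 3NF so that each non-key attribute depends only on its own table's key:

```sql
-- Hypothetical denormalized starting point (illustration only):
-- orders(order_id, customer_name, customer_city, product_name, product_price)

-- 3NF decomposition into three tables:
CREATE TABLE customers (
    customer_id   SERIAL PRIMARY KEY,
    customer_name TEXT NOT NULL,
    customer_city TEXT
);

CREATE TABLE products (
    product_id    SERIAL PRIMARY KEY,
    product_name  TEXT NOT NULL,
    product_price NUMERIC(10, 2)
);

CREATE TABLE orders (
    order_id    SERIAL PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers (customer_id),
    product_id  INTEGER NOT NULL REFERENCES products (product_id),
    order_date  DATE NOT NULL
);
```

For OLAP workloads, the same data might deliberately be denormalized back into a wide dimension table to reduce join cost, which is the trade-off the bullet above refers to.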
TECHNICAL SKILLS:
Data Modeling Tools: ER/Studio 9.7/9.0, Erwin 9.6/9.5, Sybase PowerDesigner.
OLAP Tools: Tableau, SAP BO, SSAS, Business Objects, and Crystal Reports 9
Cloud Platform: AWS, Azure, Google Cloud, Cloud Stack/OpenStack
Programming Languages: SQL, PL/pgSQL (PostgreSQL 9.4), UNIX shell scripting, Perl.
Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server … DB2.
Testing and Defect Tracking Tools: HP/Mercury Quality Center, WinRunner, MS Visio & Visual SourceSafe
Operating System: Windows, Unix.
ETL/Data Warehouse Tools: Informatica 9.6/9.1, Pentaho.
Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model.
PROFESSIONAL EXPERIENCE
Informatica/PostgreSQL Developer
Confidential
Roles & Responsibilities
- Analyzed the OPDM data architecture and dataflow diagrams.
- Developed and documented test plans, test cases, SQL scripts, and automation scripts; prepared data as required; primarily responsible for large-volume data validations.
- Working knowledge of Linux environments for application installation, license management, software tracking/distribution, and backup/recovery of system configurations and user data files.
- Extracted data from multiple sources, such as flat files and Oracle, into a staging database using Informatica.
- Understanding the process flow and business functionalities for the upstream and downstream applications.
- Used various transformations, including the SQL, Expression, Lookup, Joiner, Router, Filter, Normalizer, Union, Update Strategy, and Aggregator transformations.
- Worked with mapping parameters and mapping variables in various mappings to pass values between sessions.
- Strong experience in writing complex SQL queries and PostgreSQL stored procedures.
- Wrote Unix shell scripts and Perl scripts.
- Wrote SQL overrides in the Source Qualifier to filter the data selected from the source table and to perform incremental loading of data into the target table.
- Created reusable transformations and mapplets to perform the desired operations to load the data.
- Created workflows with tasks such as Email tasks for notifications, Timer tasks that trigger when an event occurs, and Session tasks to run mappings.
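The incremental-load pattern described above can be sketched as a Source Qualifier SQL override. The table and column names are hypothetical; the `$$` notation follows Informatica's mapping-variable convention, with the variable holding the timestamp of the last successful run:

```sql
-- Hypothetical SQL override for incremental extraction:
-- pull only rows changed since the last successful session run.
SELECT src.order_id,
       src.customer_id,
       src.amount,
       src.updated_at
FROM   staging_orders src
WHERE  src.updated_at > TO_TIMESTAMP('$$LAST_RUN_TS', 'YYYY-MM-DD HH24:MI:SS')
ORDER  BY src.updated_at;
```

After a successful session, the mapping variable would be advanced to the maximum `updated_at` seen, so the next run picks up only new or changed rows.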
Informatica Developer
Confidential
Roles & Responsibilities
- Extensively worked with Source Analyzer, Target Designer, Mapping Designer and Mapplet Designer.
- Developed Informatica mappings using Lookup, Router, Update Strategy, Filter, Sequence Generator, Joiner, and Expression transformations.
- Worked with different sources such as Oracle and flat files.
- Loaded facts and dimensions from source to the target data warehouse.
- Created and configured file systems and file system attributes, such as permissions, encryption, access control lists, and network file systems.
- Analyzed system performance and performed routine maintenance of systems and applications.
- Deployed, configured, and maintained systems, including software installation, updates, and core services.
- Managed users and groups, including use of a centralized directory for authentication.
- Managed security, including basic firewall and Linux configuration.
- Configured static routes, packet filtering, and network address translation.
- Set kernel runtime parameters and worked on Unix shell scripting.
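The fact-and-dimension loading described above can be sketched in SQL. The names are hypothetical: a dimension upsert (using PostgreSQL's `ON CONFLICT` syntax) followed by a fact insert that resolves the dimension's surrogate key:

```sql
-- Hypothetical dimension upsert: insert new customers,
-- update attributes for ones already present.
INSERT INTO dim_customer (customer_nk, customer_name, city)
SELECT customer_id, customer_name, customer_city
FROM   stg_customers
ON CONFLICT (customer_nk)
DO UPDATE SET customer_name = EXCLUDED.customer_name,
              city          = EXCLUDED.city;

-- Fact load: look up the surrogate key from the dimension.
INSERT INTO fact_sales (customer_sk, sale_date, amount)
SELECT d.customer_sk, s.sale_date, s.amount
FROM   stg_sales s
JOIN   dim_customer d ON d.customer_nk = s.customer_id;
```

In an Informatica mapping, the same surrogate-key resolution would typically be done with a Lookup transformation against the dimension table.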
PostgreSQL Developer
Confidential
Roles & Responsibilities
- Developed Informatica mappings for loading data into staging and the data warehouse.
- Implement and maintain database code in the form of stored procedures, scripts, queries, views, triggers, etc.
- Work closely with the CTO to implement effective and maintainable database coding practices that form an architectural foundation.
- Work with front end developers to define simple yet powerful APIs.
- Work with GIS analysts to implement geospatial processes, queries and reports.
- Work with DBAs to ensure efficiency of database code, integrity of data structures and quality of data content.
- Work with product managers to ensure database code meets requirements.
- Work with DBAs and data analysts to ensure database code is accurately documented.
- Take frequent backups of data, create new stored procedures, and maintain scheduled backups as part of this role.
- Work with Linux-friendly applications and troubleshoot issues as they arise on the server.
- Monitor servers as another important duty of this role.
- Design, develop, test, and troubleshoot software packages that implement complex ETL processes in accordance with business requirements and service-level agreements.
- Collaborate with application developers, business partners, management, and the Data Warehouse team.
- Extend the data model to house additional structures as needed to meet applicable regulatory requirements.
- Design data warehouse and data mart fact tables, aggregate tables, and dimensions; deliver quality data integration projects, standards, best practices, and a data dictionary.
- Develop ETL audits and controls to ensure data quality meets or exceeds defined standards and thresholds; develop and document the systems, processes, and logic required to expose existing warehouse data sets to end users for reporting and analytical purposes.
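An ETL audit of the kind described above can be sketched as a row-count reconciliation between staging and target. The table and column names are hypothetical, assuming each load is tagged with a batch identifier:

```sql
-- Hypothetical audit query: compare staged vs. loaded row counts per batch
-- and flag any batch whose counts diverge.
SELECT s.batch_id,
       s.src_rows,
       t.tgt_rows,
       CASE WHEN s.src_rows = t.tgt_rows THEN 'PASS' ELSE 'FAIL' END AS audit_status
FROM  (SELECT batch_id, COUNT(*) AS src_rows
       FROM   stg_orders
       GROUP  BY batch_id) s
LEFT JOIN
      (SELECT batch_id, COUNT(*) AS tgt_rows
       FROM   fact_orders
       GROUP  BY batch_id) t
       ON t.batch_id = s.batch_id;
```

Real audit frameworks usually extend this with checksum or sum-of-amounts comparisons and persist the results to an audit table for trending against defined thresholds.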