Solution Architect Resume
SUMMARY
- IT professional with 12+ years of experience specializing in cloud implementation with AWS (Amazon Web Services) and in the design, development, and deployment of BI technologies.
- 3 years of experience as an AWS Cloud Solutions architect.
- Proficient in AWS solutions architecture, systems design, disaster recovery, and storage administration.
- Experienced with installing the AWS CLI to control various AWS services through Bash scripting.
- Experienced with AWS Lambda functions to monitor AWS CloudTrail logs and automate AWS EBS snapshots (see the sketch after this summary).
- Seamlessly transitioned AWS infrastructure from EC2-Classic to VPC, overhauled AWS accounts to follow the latest security best practices, reduced AWS costs by over 30%, and wrote documentation.
- Implemented EC2 backup strategies by creating EBS snapshots and attaching the volumes to EC2 instances.
- Created S3 buckets in the AWS environment to store files and applied lifecycle policies.
- Implemented POC/pilot projects to evaluate different cloud-based storage solutions using AWS storage types.
- Created shell scripts to export/import database backups from RDS and store them in S3 (AWS storage).
- Configured Elastic Load Balancers and Auto Scaling groups to distribute traffic and maintain a cost-efficient, fault-tolerant, and highly available environment.
- Experienced with AWS CloudFormation templates.
- Experienced with Informatica Cloud components.
- Experienced with data warehouse migration to the AWS cloud using Redshift and S3.
- Experience with preparation of migration checklists.
- Experience with various AWS EMR components (Hive, HDFS, Spark, EMRFS, Sqoop) handling very large data sets in a large data lake setup.
- Experienced with ETL/ELT workloads using Python scripts.
- Experience loading and manipulating large data sets using Spark/PySpark and Spark SQL (see the PySpark sketch after this summary).
- Strong experience in the development and implementation of data warehousing projects.
- Extracted data from various sources such as Oracle, Sybase, VSAM files, flat files, and XML files.
- Experience with DT Studio.
- Experience creating column-based profiles using the IDQ Analyst tool.
- Extensive experience leading teams in an offshore/onsite model.
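The Lambda-based EBS snapshot automation referenced above could look roughly like the boto3 sketch below; the Backup tag key/value and the handler name are illustrative assumptions, not details from the actual environment.

```python
import boto3

ec2 = boto3.client("ec2")

def lambda_handler(event, context):
    # Find EBS volumes carrying a hypothetical Backup=true tag.
    volumes = ec2.describe_volumes(
        Filters=[{"Name": "tag:Backup", "Values": ["true"]}]
    )["Volumes"]

    # Start a snapshot for each matching volume.
    snapshot_ids = []
    for vol in volumes:
        snap = ec2.create_snapshot(
            VolumeId=vol["VolumeId"],
            Description="Automated backup of " + vol["VolumeId"],
        )
        snapshot_ids.append(snap["SnapshotId"])

    return {"snapshots_started": snapshot_ids}
```

A function like this would typically be invoked on a schedule rather than manually.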
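As a sketch of the Spark/PySpark and Spark SQL work described above: the S3 path, column names, and aggregation below are placeholders chosen only for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("large-dataset-etl").getOrCreate()

# Read a partitioned Parquet data set from a hypothetical data lake location.
df = spark.read.parquet("s3://example-datalake/raw/transactions/")

# Typical manipulation: filter, derive a date column, aggregate, then expose to Spark SQL.
daily_totals = (
    df.filter(F.col("amount") > 0)
      .withColumn("txn_date", F.to_date("txn_timestamp"))
      .groupBy("txn_date")
      .agg(F.sum("amount").alias("total_amount"))
)

daily_totals.createOrReplaceTempView("daily_totals")
spark.sql("SELECT * FROM daily_totals ORDER BY txn_date").show(10)
```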
TECHNICAL SKILLS
Operating Systems: Windows NT/2000/XP, UNIX, and Linux
Databases: Oracle, MySQL, DynamoDB, and Redshift
Tools: AWS, Informatica PowerCenter, IDQ tool, Informatica B2B tool, MicroStrategy, and Erwin
Languages: Python, Java, SQL, PL/SQL, HTML, and XML
PROFESSIONAL EXPERIENCE
Solution Architect
Confidential
Responsibilities:
- Migrating databases from the data center to the AWS cloud using Virtual Private Cloud (VPC) and security gateways.
- Managing AWS solutions architecture, systems design, disaster recovery, and storage administration.
- Creating snapshots and images (AMIs) to store launch configurations of the EC2 instances.
- Creating and managing user accounts and shared folders, providing day-to-day user support, log management, and reporting, and applying Group Policy restrictions.
- Maintaining high-availability infrastructure in AWS and processing requests such as the creation of servers.
- Creating S3 buckets in the AWS environment to store files, including buckets that serve static content for a web application.
- Configuring S3 buckets with lifecycle policies to archive infrequently accessed data to other storage classes as required (see the lifecycle sketch after this list).
- Using IAM to create roles, users, and groups to provide additional security to the AWS account and its resources.
- Creating RDS instances to serve data to the servers responding to requests.
- Good knowledge of Continuous Integration (CI) and Continuous Deployment (CD) methodologies.
- Provided support for Java applications by collaborating with the Java development team using the Agile methodology.
- Managing and automating all aspects of our AWS infrastructure (compute, storage, network, permissions, cost) using CloudFormation and shell scripts (see the CloudFormation sketch after this list).
- Ensuring that the entire system stays secure and available by monitoring day-to-day operations, applying security patches, and pushing out deployments of all systems and applications.
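A minimal sketch of the kind of S3 lifecycle configuration described above, using boto3; the bucket name, prefix, transition days, and storage classes are assumptions for illustration only.

```python
import boto3

s3 = boto3.client("s3")

# Transition objects under a hypothetical logs/ prefix to cheaper storage
# classes over time, and expire them after a year.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-archive-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-infrequent-data",
                "Filter": {"Prefix": "logs/"},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
                "Expiration": {"Days": 365},
            }
        ]
    },
)
```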
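The CloudFormation-and-scripts automation mentioned above might be driven from Python as in the sketch below; the stack name and the single-bucket template are hypothetical stand-ins for real infrastructure templates.

```python
import json
import boto3

cfn = boto3.client("cloudformation")

# Hypothetical template that provisions one S3 bucket.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "ArtifactBucket": {
            "Type": "AWS::S3::Bucket",
            "Properties": {"BucketName": "example-artifact-bucket"},
        }
    },
}

# Create the stack and wait until provisioning completes.
cfn.create_stack(StackName="example-infra-stack", TemplateBody=json.dumps(template))
cfn.get_waiter("stack_create_complete").wait(StackName="example-infra-stack")
```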
Architect
Confidential
Responsibilities:
- Analyzing business requirements with business SMEs and finalizing requirements.
- Creating data models using the Erwin tool.
- Creating design documents for the ETL process.
- Implementing Informatica best practices in the project.
- Guiding the offshore team in implementing the ETL process and reviewing the process with the client onsite.
- Solid understanding of the software development lifecycle (SDLC), change management lifecycles, E2E testing, and release procedures.
- Implementing ETL development standards and procedures
- Extensively using SQL concepts such as indexing, partitioning, cursors, and performance tuning.
- Creating many utility functions and procedures for use in Informatica.
- Used Standardizer, Labeler, and Case Converter strategy transformations, apart from regular transformations, in the IDQ Developer tool.
- Built validation rules in IDQ and implemented the exception handling process.
- Created reference tables using the Analyst tool.
- Created column-based profiles using the IDQ Analyst tool.
- Developing Autosys jobs for scheduling ETL jobs.
- Responsible for overseeing the quality procedures related to the project.
- Providing weekly status reports to the management team.
Environment: Informatica 9.6.1, Windows 7, UNIX, Oracle 11g, Erwin, XML, and Informatica Data Quality (IDQ) tool.
Solution Architect
Confidential
Responsibilities:
- Analyzed business requirements with business SMEs and finalized requirements in the feasibility phase.
- Involved in data model design during the risk-and-value phase.
- Guided the offshore team in implementing the ETL process and reviewed the process with the client onsite.
- Understood the web service middle layer and back-end integration setup.
- Integrated rules with ETL scrubbing and tracking through an audit trail.
- Set up the MicroStrategy (MSTR) reporting platform and created reports.
- Provided weekly status reports to the management team.
- Extensively used SQL concepts such as indexing, partitioning, cursors, and performance tuning.
- Created many utility functions and procedures for use in Informatica.
- Created materialized views for Oracle optimizer-driven aggregate awareness functionality.
Environment: Informatica 9.1.5, Windows 7, UNIX, Oracle 11g, Java, Ext JS, Drools, and MicroStrategy.
ETL Architect
Confidential
Responsibilities:
- Analyzed business requirements with business SMEs and finalized requirements.
- Created design documents for the ETL process.
- Guided the offshore team in implementing the ETL process and reviewed the process with the client onsite.
- Worked on XSD, XML Target, XML Parser, XML Generator, and SQL transformations using Informatica PowerCenter 9.5.
- Developed SCD Type 2 using an effective-date-range mapping that filters source rows based on user-defined comparisons and inserts both new and changed dimension rows into the target.
- Coordinated with the QA team during the testing cycle.
- Provided weekly status reports to the management team.
- Extensively used SQL concepts such as indexing, partitioning, cursors, and performance tuning.
- Created many utility functions and procedures for use in Informatica.
- Created materialized views for Oracle optimizer-driven aggregate awareness functionality.
Environment: Informatica 9.1.5, Windows 7, UNIX, Oracle 11g, Informatica Data Quality (IDQ) tool, and XML.
Confidential
Responsibilities:
- Understood the requirements and designed the IDQ mappings.
- Used Standardizer, Labeler, and Case Converter strategy transformations, apart from regular transformations, in the IDQ Developer tool.
- Built validation rules in IDQ and implemented the exception handling process.
- Integrated IDQ components with PowerCenter.
- Created reference tables using the Analyst tool.
- Created column-based profiles using the Analyst tool.
- Tested sample parsers in Windows and compared the output of both versions for Windows.
- Tested sample parsers in Windows and compared the output of both versions for UNIX.
- Tested using the CM console and compared the output of both versions in UNIX.
- Tested using the Informatica UDO transformation and compared the output of both versions in UNIX.
- Changed the CRISIL application to use the new UNIX server and set up the production environment.
- Migrated the old parsers' history to the new environment.
- Created shell scripts for file formatting and for exception and error handling.
- Used the Autosys scheduling tool and wrote many JIL files to run the Informatica jobs.
Environment: Informatica 9.0.1 and 8.6.1, Windows XP, Oracle 10g, Informatica B2B Data Exchange, and Informatica Data Quality (IDQ) tool.