Application Architect Resume Profile
EdinA
Research-oriented, motivated, proactive self-starter with strong technical, analytical, and interpersonal skills.
Experience
Responsibilities: Application Architect
- Meet with portfolio architects, data architects, and other stakeholders to discuss proposed solutions.
- Help the team make strategic and technical decisions by analyzing existing processes.
- Involved at every stage of the software development life cycle, working as a Tech Lead.
- Participate in design reviews, JAD sessions, and detailed design discussions.
- Create application architecture documents and design artifacts.
- Conduct design reviews with portfolio architects and senior architects.
- Mentor the team to deliver high-quality code by defining coding standards and checklists.
- Coordinate design discussions with the internal team, cross-commit teams, and key stakeholders.
- Analyze and resolve issues to ensure high-quality deliverables at each stage of the Software Development Life Cycle (SDLC), within the guidelines, policies, and the Ameriprise Quality Management System (AQMS).
- Perform code reviews with architects and the application team.
Subject Matter Expert:
- As the Subject Matter Expert (SME) for the Historical Operational Data Store (HODS) application, provided ideas to improve batch process performance and meet the Service Level Agreement (SLA).
- Guided the Informatica team both technically and functionally during migration, and trained new resources on the HODS batch application and related processes.
Key solutions provided to improve batch performance:
- Pre-scheduled the copy partition job, which significantly improved batch processing of high-volume files (see the sketch after this list). The idea was implemented during the May 2012 conversion, and the batch process handled the high-volume delta files without breaching the SLA.
- Provided a solution to handle high-volume update files. Once implemented in production, the system was able to process 60 million records against 150 million records in under 4 hours.
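A minimal sketch of the pre-scheduled copy partition step, assuming it refers to copying partition statistics with Oracle's DBMS_STATS so the optimizer has representative statistics for the partition about to receive the delta file (the schema, table, and partition names are illustrative, not from the project):

  BEGIN
    -- Copy statistics from the last analyzed partition to the partition
    -- that will receive the new delta, before the nightly batch runs.
    DBMS_STATS.COPY_TABLE_STATS(
      ownname     => 'HODS',
      tabname     => 'TRADE_TXN',
      srcpartname => 'P_20120501',
      dstpartname => 'P_20120502');
  END;
  /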
confidential
Application Architect
- BDIL (Brokerage Data Integration Layer) receives data from different sources and maintains it. HODS is a data store that maintains the brokerage data and serves it to clearing and other business areas for their daily reporting and reconciliation activities. Neither the BDIL nor the HODS data loads had a restart feature, which led to data synchronization and performance issues whenever a failure occurred during a batch run. Proposed several options to address this and worked closely with the application team to introduce restart logic with minimal performance impact (see the sketch after this list).
- Proposed the Informatica Metadata Manager tool to create lineage between sources and targets. It was incorporated in the test environment, and the business was impressed with how much it eased the analysis process. These lineages are used extensively by the application team for code analysis and downstream impact analysis.
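A minimal sketch of one way such restart logic can work, using a batch control table so a failed run resumes from the last completed step; the table, column, and step names are hypothetical, and the actual implementation details are not described here:

  -- Control table tracking which steps completed for a given batch date.
  CREATE TABLE batch_ctl (
    batch_date DATE,
    step_name  VARCHAR2(60),
    status     VARCHAR2(10),
    updated_ts TIMESTAMP DEFAULT SYSTIMESTAMP,
    CONSTRAINT batch_ctl_pk PRIMARY KEY (batch_date, step_name));

  DECLARE
    v_done NUMBER;
  BEGIN
    -- On restart, skip a step that already completed for today's batch date.
    SELECT COUNT(*) INTO v_done
    FROM   batch_ctl
    WHERE  batch_date = TRUNC(SYSDATE)
    AND    step_name  = 'LOAD_TRADE_DELTA'
    AND    status     = 'COMPLETED';

    IF v_done = 0 THEN
      -- Run the load step here, then record its completion.
      MERGE INTO batch_ctl c
      USING (SELECT TRUNC(SYSDATE) AS d FROM dual) s
      ON (c.batch_date = s.d AND c.step_name = 'LOAD_TRADE_DELTA')
      WHEN MATCHED THEN
        UPDATE SET c.status = 'COMPLETED', c.updated_ts = SYSTIMESTAMP
      WHEN NOT MATCHED THEN
        INSERT (batch_date, step_name, status)
        VALUES (s.d, 'LOAD_TRADE_DELTA', 'COMPLETED');
      COMMIT;
    END IF;
  END;
  /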
Environment:
Informatica PowerCenter 9.1, Source Analyzer, Mapping Designer, Transformation Developer, Workflow Manager, Workflow Monitor, Repository Manager, PowerExchange, Oracle 11g, MS Visio, Windows, UNIX Shell Script, PL/SQL and SQL*Loader.
confidential
Technical Lead / Project Lead
confidential
- AITT is an enterprise-level program at Ameriprise Financial to upgrade and relocate the servers in the St. Louis data center. As part of this program, the Informatica ETL servers, the HODS application server, and the database servers are migrated from the legacy data center to the new data center, with application remediation ensuring existing functionality is carried over without issues.
- HODS is an 18 TB data store holding historical and operational data.
- Migrating 18 TB of data without impacting the business was a challenging task.
- Planned carefully with DBAs and business partners to avoid impacting their day-to-day activities.
- Planned migration activities such as hard and soft cutovers.
- Optimized batch execution.
- Performed impact analysis of turning on Data Guard at the database level.
- Redesigned the batch to reduce archive log generation.
- Validated the ETL, UNIX shell scripts, and database migration (see the reconciliation sketch after this list).
- Performed secure file transfer between servers using SFTP and FTP.
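A minimal reconciliation sketch for the migration validation, comparing row counts between the legacy database (over a database link) and the migrated copy; the table and database link names are hypothetical:

  DECLARE
    v_src NUMBER;
    v_tgt NUMBER;
  BEGIN
    -- Count rows in the legacy copy via a database link, then in the migrated table.
    EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM hods.trade_txn@legacy_dc' INTO v_src;
    SELECT COUNT(*) INTO v_tgt FROM hods.trade_txn;

    IF v_src <> v_tgt THEN
      RAISE_APPLICATION_ERROR(-20001,
        'Row count mismatch: legacy=' || v_src || ' migrated=' || v_tgt);
    END IF;
  END;
  /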
Environment:
Informatica PowerCenter 9.1, Source Analyzer, Mapping Designer, Transformation Developer, Workflow Manager, Workflow Monitor, Repository Manager, PowerExchange, Oracle 11g, MS Visio, Windows, UNIX Shell Script, PL/SQL and SQL*Loader.
confidential
Project Lead / Architect
confidential
- The project involves development and maintenance of the BETA Data Integration Layer (BDIL) and the Historical and Operational Data Store (HODS). The system keeps track of account and trade transaction details from the BETA system provided by Thomson Reuters. BDIL publishes this data from BETA to the Ameriprise internal systems, and HODS stores it for compliance and regulatory purposes. Data from HODS is used by the Ameriprise clearing team for client data analysis, data distribution via reports, and similar activities. This is a development and maintenance project involving requirements elaboration, design, build, testing, implementation, and support, following the System Development Life Cycle method.
- HODS is a nightly batch application with an SLA of 6 AM CST.
- There was a potential risk of breaching the SLA due to increasing volumes.
- Introduced stored procedures to gather statistics and copy partition statistics, which improved batch performance.
- Pre-scheduled the copy partition job, which significantly improved batch processing of high-volume files. The idea was implemented during the May 2012 conversion, and the batch process handled the high-volume delta files without breaching the Service Level Agreement (SLA).
- Provided a solution to handle high-volume update files, avoiding UPDATE and DELETE statements, which are normally time-consuming DML operations (see the sketch after this list). Once implemented in production, the system processed 60 million records against 150 million records in under 4 hours.
- Extensively involved in performance tuning of the ETL process by identifying bottlenecks at targets, sources, mappings, sessions, and systems, which led to better session performance.
- Changed mapping properties to run with dynamic partitioning, which distributes the load across multiple nodes.
- Enabled bulk mode on the database target so sessions use direct-path loads instead of conventional loads.
- Held discussions with data architects to decide on the data model, data quality, data integration, and exposure patterns.
- Developed PL/SQL and UNIX shell scripts for scheduling Informatica sessions.
- Debugged and troubleshot sessions using the Debugger and Workflow Monitor.
- Worked with sessions and batches using Server Manager to load data into the target database.
- Responsible for migrating mappings across development, test, and production using Repository Manager.
- Tuned the Informatica repository and mappings for optimal performance, and effectively handled multiple roles across similar projects.
- Used Informatica Data Quality for data cleansing and data profiling.
- Performed secure file transfer between servers using SFTP and FTP.
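A minimal sketch of the insert-only pattern referenced above, assuming the high-volume updates were staged with a direct-path insert and swapped in via partition exchange rather than applied as row-by-row UPDATE/DELETE statements (the table and partition names are illustrative):

  -- Load the incoming update file as inserts into a staging table using a
  -- direct-path (APPEND) insert, which avoids conventional row-by-row DML.
  INSERT /*+ APPEND */ INTO stg_trade_txn
  SELECT * FROM trade_txn_delta;
  COMMIT;

  -- Swap the freshly loaded segment in place of the stale partition.
  ALTER TABLE trade_txn
    EXCHANGE PARTITION p_current WITH TABLE stg_trade_txn
    INCLUDING INDEXES WITHOUT VALIDATION;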
Environment:
Informatica PowerCenter 9.1, Source Analyzer, Mapping Designer, Transformation Developer, Workflow Manager, Workflow Monitor, Repository Manager, PowerExchange, Oracle 11g, MS Visio, Windows, UNIX Shell Script, PL/SQL and SQL*Loader.
confidential
Technical Lead
confidential
- FINRA (Financial Industry Regulatory Authority) ensures that financial organizations maintain the most current stock exchange details, and it audits the reports from different financial organizations every year. This is a regulatory project to generate FINOPS reports from multiple NYSE (New York Stock Exchange) data sources.
- Extensively involved in performance tuning of the ETL process by identifying bottlenecks at targets, sources, mappings, sessions, and systems, which led to better session performance.
- Changed mapping properties to run with dynamic partitioning, which distributes the load across multiple nodes.
- Enabled bulk mode on the database target so sessions use direct-path loads instead of conventional loads.
- Developed PL/SQL and UNIX shell scripts for scheduling Informatica sessions.
- Debugged and troubleshot sessions using the Debugger and Workflow Monitor.
- Worked with sessions and batches using Server Manager to load data into the target database.
- Responsible for migrating mappings across development, test, and production using Repository Manager.
- Tuned the Informatica repository and mappings for optimal performance, and effectively handled multiple roles across similar projects.
- Used text editors (EditPad, UltraEdit) to analyze the data.
- Used Informatica Data Masking transformations to mask confidential data (see the masking sketch after this list).
- Used Informatica Data Quality for data cleansing and data profiling.
- Performed secure file transfer between servers using SFTP and FTP.
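The masking itself was done with the Informatica transformation; purely as a rough SQL analogue of the idea (the table and column names are hypothetical), an account number can be reduced to its last four characters:

  -- Keep only the last four characters of the account number, padding the rest with '*'.
  SELECT LPAD(SUBSTR(account_no, -4), LENGTH(account_no), '*') AS masked_account_no
  FROM   trade_txn;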
Environment: Informatica PowerCenter 9.1, Source Analyzer, Mapping Designer, Transformation Developer, Workflow Manager, Workflow Monitor, Repository Manager, PowerExchange, Oracle 11g, MS Visio, Windows, UNIX Shell Script, PL/SQL and SQL*Loader.
confidential
Technical Lead
confidential
Clearing Data Store is an extension of the HODS application, migrating toward Informatica technology. The inbound process loads files using Informatica, and outbound APIs were developed to send updates to the Thomson Reuters BL server using Informatica web services. Secured the web services with SSL certificates using Informatica 9.1, and introduced and worked on the new Informatica Metadata Manager tool as part of this project. UNIX scripts are used for all the validations and TWS scheduling.
Responsibilities:
- Participated in the Design Team and user requirement gathering meetings.
- Performed business analysis and requirements gathering with end users and managers.
- Extensively used Informatica Designer to create and maintain source and target definitions, mappings, transformations, and reusable transformations.
- Created source definitions to extract data from flat files and relational tables for Informatica PowerCenter.
- Used a star schema approach for designing the data warehouse database.
- Developed a standard ETL framework to enable the reusability of similar logic across the board.
- Created target definitions using the Warehouse Designer in Informatica PowerCenter.
- Used Mapping Designer to build mappings for different loads.
- Created transformations such as Joiner, Lookup, Rank, Expression, Aggregator, and Sequence Generator.
- Created stored procedures, packages, triggers, tables, views, synonyms, and test data in Oracle.
- Extracted source data from Oracle, flat files, and XML files using Informatica and loaded it into the target database.
- Created medium-to-complex PL/SQL stored procedures for integration with Informatica on Oracle 10g.
- Developed complex mappings, including SCD Type I, Type II, and Type III mappings, in Informatica to load data from various sources (see the Type II sketch after this list).
- Used text editors (EditPad, UltraEdit) to analyze the data.
- Involved in extensive Performance Tuning by determining bottlenecks in sources, mappings and sessions.
- Created Models based on the dimensions, levels and measures required for the analysis.
- Validated the data in the warehouse and data marts after each load, balancing it against the source data.
- Worked closely with the business analyst team to resolve problem tickets and service requests, and helped the 24/7 production support team.
- Experienced as an onsite Tech Lead coordinating offshore teams.
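The SCD logic was built in Informatica mappings; purely as a compact illustration of the Type II pattern (the dimension, staging, and column names are hypothetical), the current row is expired when a tracked attribute changes and a new current version is inserted:

  -- Expire the current dimension row when the incoming attribute differs.
  UPDATE dim_account d
  SET    d.curr_flag = 'N',
         d.end_dt    = TRUNC(SYSDATE) - 1
  WHERE  d.curr_flag = 'Y'
  AND    EXISTS (SELECT 1 FROM stg_account s
                 WHERE  s.account_id = d.account_id
                 AND    s.status    <> d.status);

  -- Insert the new current version for changed or brand-new accounts.
  INSERT INTO dim_account (account_key, account_id, status, start_dt, end_dt, curr_flag)
  SELECT dim_account_seq.NEXTVAL, s.account_id, s.status,
         TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM   stg_account s
  WHERE  NOT EXISTS (SELECT 1 FROM dim_account d
                     WHERE  d.account_id = s.account_id
                     AND    d.curr_flag  = 'Y'
                     AND    d.status     = s.status);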
Environment:
Informatica PowerCenter 9.1, Source Analyzer, Mapping Designer, Transformation Developer, Workflow Manager, Workflow Monitor, Repository Manager, PowerExchange, Oracle 10g, MS Visio, Windows, UNIX Shell Script, PL/SQL and TWS Scheduler.
confidential
Technical Lead
confidential
The project involves development of the BETA Data Integration Layer (BDIL) and the Historical and Operational Data Store (HODS). The system keeps track of account and trade transaction details from the BETA system provided by Thomson Reuters. BDIL publishes this data from BETA to the Ameriprise internal systems, and HODS stores it for compliance and regulatory purposes. Data from HODS is used by the Ameriprise clearing team for client data analysis, data distribution via reports, and similar activities.
Environment:
Oracle 10g, Informatica 8.6, MS Visio, Windows, UNIX Shell Script, PL/SQL, SQL*Loader and Control-M Scheduler.
confidential
Programmer Analyst
confidential
The Goldman Sachs Group, Inc. is a bank holding company and a leading global investment banking, securities, and investment management firm. GASS is an existing system containing many modules for allocating short positions, calculating entitled positions, client communication, and more. The asset servicing technology group decided to rewrite the existing code with the latest technologies to improve performance and scalability and to handle high volumes during peak periods. As part of the re-engineering, Short Response Allocation is the first module; it contains UIs for Entitled Positions, Instruction Management/SRA, Manual Position Adjustment, Manual Allocation, Audit History, Exceptions, and so on. It is built on top of the Ocean desktop application, an in-house application developed by Goldman Sachs.
confidential
Module Lead
confidential
LaTrobe Financial is one of Australia's leading financial institutions and a leader in mortgages. LaTrobe planned to move its application online for ease of maintenance. The system is an intranet application that automates the life cycle of a mortgage to simplify the business. It consists of around 14 modules; I worked on Financial Control, Customer Services, Securities, and the EOD (End of Day) batch process.
Environment:
Windows XP, Eclipse, Oracle 10g Application Server, Oracle 9i, Stateless Session Beans.
confidential
Software Engineer / Module Lead
confidential
Under the new UK banking policy, every bank in the United Kingdom has to provide the Faster Payments Service (FPS) to its clients. FPS is a faster payment scheme under which a payment has to be cleared within 15 seconds. HCL was engaged to provide a couple of services that supply important information about the account and the beneficiary bank: the Sort Code Lookup Service and the Account Location Service.
The Sort Code Lookup Service confirms whether a beneficiary bank can accept a Faster Payments Service payment. Its other responsibility is to refresh the sort code data store; the refresh is scheduled once a week and is a bulk refresh of the data store. The Account Location Service (ALS) gives information on a particular account for incoming payments, and provides both an online update service and a batch update service.
Environment:
IBM WebSphere RAD, Oracle 10g, WebSphere Application Server 6.0, MQ Series, MDB.
confidential
Software Engineer / Onsite Coordinator
confidential
Payments are an important daily activity in a bank, and investigating payments is another important activity, carried out when a payment goes wrong. Halifax and Bank of Scotland use various systems for payments; when a payment is not realized, the customer makes a call, and from that point the investigation process starts. Previously all of this investigation happened manually and was time consuming. HBOS decided to automate the process by integrating all of the systems with Smart Investigate, an application product of Pega. Each system needs an interface that communicates with Pega SI.
confidential
Software Engineer
confidential
dbDirect Internet is a web-based application of Deutsche Bank for its customers, with many modules each catering to a specific need. The DBDI application is based on the NextGen architecture. NextGen Maintenance covers all the modules of the DBDI application, including Asian Payments, FI Payments, CPIEC, Direct Debits, Free Format Instructions, EAI, Loan Deposits, F3I, Netting, File Transfer, System Features, FX, and others; it is an enhancement and maintenance project spanning the full application.
Environment:
Java, JSP, Servlets, JUnit, JWEBUnit, Oracle, Data Junction.
confidential
Software Engineer
confidential
File Upload is one of the modules of dbDirect, Deutsche Bank's web-based application for its customers. The customer uploads a payment file through the application, and it is converted into an intermediate UDF. This conversion is performed by a DJS file developed in the Data Junction tool: the Java code calls the corresponding DJS file for the upload and converts the file into the UDF if the uploaded payment file is in the correct format.
Environment:
Java, JSP, Servlets, JUnit, JWEBUnit, Oracle, Data Junction.
confidential
confidential
dbDirect Internet is a web-based application of Deutsche Bank that provides multiple features to its clients, each catering to a specific need. 'Payments' is an important module of the dbDirect Internet application; it handles domestic and international payments and is further divided into single and bulk payments. Much functionality has been added, and the code is now sleeker and more maintainable than the previous version, with extensive use of patterns like Value Objects and DAO. CPIEC won Best Project for 2004. Extensively worked on the 'Extended Payment Details' module.
Environment :
Java, JSP, Servlets, JUnit, JWEBUnit, Oracle.
Recognitions:
- Appreciation from the Technical Director and Vice President of the Ameriprise Brokerage team for the solution to process high-volume files.
- Received the Sleuth Award for detailed analysis of an issue and providing the solution.