Background
Summary
Certified Enterprise Architect with over 17 years of experience in the IT, retail, BFSI, advertising, and IoT business domains. Highly skilled in analyzing and developing technical solutions and architectural approaches for complex business problems. Responsible for enhancing the EA practice by establishing architectural standards, repositories, content frameworks, and EA governance frameworks. Dedicated master data management architect with an outstanding track record of creating reference data management and data quality management capabilities. Plagued with the overwhelming urge to tinker, figure things out, and build programs for microcontrollers and home automation devices.
My Work History
Customer Domain Management services: In an endeavor to modernize its technology stack, Prudential wanted to build a customer domain, a one-stop shop for all customer needs, and sunset its legacy applications. The customer domain receives data from various LOBs and creates, for each customer, a profile and a portfolio that includes all the contracts and policies the customer is associated with. It must support microservices for editing the profile and portfolio individually, navigate various organizational rules and regulations, and maintain clear traceability for every transaction. 150+ million records, 13 master entities. Technologies used: AWS Lambda, SQS, S3, Dynatrace, Power BI, CloudWatch, Ataccama MDM, Aurora Postgres, DocumentDB, Kong, Python, and Java.
Architect – Master Data: September 2024 – Present
- Analyzed data from various LOBs and existing systems.
- Developed the architecture for the Customer Domain (CD) and MDM applications.
- Designed integration endpoints for both ingress and egress.
- Designed and developed the data model for MDM.
- Proposed and implemented cleansing, matching, aggregation, and merging rules based on analysis.
- Designed and built solutions for various business requirements and regulations.
- Trained fellow teammates and data stewards on the technology and its usage.
- Developed AWS Lambda functions in both Python and Java to modify and transfer data (see the sketch after this list).
- Implemented Dynatrace dashboards for traceability and for monitoring servers and services.
- Implemented a Power BI dashboard that showcased the quality of matched data and assisted in data certification.
- Worked with other teams to seamlessly sync data in a consolidated MDM model.
- Worked with the business to propose and design enhancements and improvements.
- Managed a team of 7 in an onsite–offshore model.
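A minimal sketch of the kind of Lambda function described above: an SQS-triggered handler in Python that reshapes each customer record and lands it in S3 for downstream MDM ingestion. The bucket name and field mapping are hypothetical placeholders, not the production implementation.

```python
import json

import boto3

s3 = boto3.client("s3")
TARGET_BUCKET = "customer-domain-staging"  # hypothetical bucket name


def normalize(record: dict) -> dict:
    """Illustrative transform: rename and tidy a few fields."""
    return {
        "customer_id": record["customerId"],
        "email": record.get("email", "").strip().lower(),
        "source_lob": record.get("lob", "UNKNOWN").upper(),
    }


def handler(event, context):
    """SQS-triggered entry point; each message body carries one customer record."""
    for message in event["Records"]:
        cleaned = normalize(json.loads(message["body"]))
        # Land the reshaped record in S3 for the MDM ingress pipeline to pick up.
        s3.put_object(
            Bucket=TARGET_BUCKET,
            Key=f"ingress/{cleaned['customer_id']}.json",
            Body=json.dumps(cleaned),
        )
    return {"processed": len(event["Records"])}
```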
Customer Master Data: Nissan intended to build a customer master data system that ingested transactional data, from which master data would be extracted and eventually mastered into a single view of truth. Data authored via Salesforce needed to contribute to the master, which should contain customers, vehicles, and customer-to-vehicle relationships, with traceability back to the authoring systems and transactions. The system also had to include preferences and abide by regulations such as CCPA. Data needed to be published to and consumed from third parties for enrichment. Millions of transactions per day. Technologies used: Ataccama MDM, Aurora Postgres, AWS EC2, SQS, S3, Snowflake, and APIs.
Master Data Architect: April 2024 – September 2024
- Designed and developed the MDM architecture and integration endpoints.
- Worked on Snowflake to load transactional data and extract master data (see the sketch after this list).
- Set up various infrastructure environments.
- Designed and implemented the data model.
- Analyzed data and worked with stakeholders to arrive at rules for cleansing and mastering.
- Implemented workflows to comply with CCPA, such as the right to be forgotten and the right to delete.
- Worked with the business to propose and design enhancements and improvements.
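As an illustration of the Snowflake step above, here is a minimal sketch that deduplicates customer candidates out of a raw transaction table. The connection parameters, table, and columns are hypothetical; the real cleansing and mastering rules lived in Ataccama.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical connection details; real values came from environment config.
conn = snowflake.connector.connect(
    account="example_account",
    user="mdm_etl",
    password="...",
    warehouse="ETL_WH",
    database="TRANSACTIONS",
    schema="RAW",
)

# Collapse raw transactions to one master-data candidate per customer.
QUERY = """
    SELECT customer_id,
           MAX(first_name) AS first_name,
           MAX(last_name)  AS last_name,
           MAX(email)      AS email
    FROM raw_transactions
    GROUP BY customer_id
"""

cur = conn.cursor()
try:
    cur.execute(QUERY)
    for customer_id, first_name, last_name, email in cur:
        # In the real pipeline these rows were staged for MDM ingestion.
        print(customer_id, first_name, last_name, email)
finally:
    cur.close()
    conn.close()
```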
Master Data Capability: Trader.ca, DBA AutoTrader.ca, intends to manage its master data assets to assist its consumers more efficiently and make it easier for them to purchase cars. To this end, Trader.ca initiated the Master Data Capability to improve the quality of these assets and to leverage them more effectively in its downstream processes.
Digital Retail: To keep up with the growing demand for purchasing cars online, Trader.ca intends to enable digital retailing so consumers can buy a car online. The complex business process, application, data, and technology landscape of Trader.ca makes it a challenge to enable DR directly. As an enterprise architect, it is my job to design, architect, and support the implementation, leveraging the existing infrastructure and modifying it as necessary to enable DR.
Tundra M&A: Trader acquired the Canadian business of Cox Automotive and intends to integrate its applications, data, and customer base into the organization. As part of this, a transition architecture and an end-state architecture were built, and Customer MDM was introduced using Ataccama MDM to seamlessly integrate the acquisition.
Enterprise Architect – Master Data: February 2021 – Present
- Established the master data practice within the organization.
- Established and chaired the enterprise architecture review board.
- Created a fellowship for data governance activities.
- Created and published architecture principles.
- Established and implemented architecture development methods.
- Analyzed the existing architecture and recommended changes for DR.
- Identified opportunities and scope for improvement and suggested solutions.
- Designed and recommended architectural changes to improve business processes, IS, and technology.
- Established an MDM project to handle master and reference data.
- Supported the engineering team on implementations for DR, MDM, and RDM.
- Leveraged data management best practices to analyze, design, and deliver the solution architecture.
- Proposed and designed approaches for application development and data improvement.
- Governed the implementation and integration approaches.
- Managed architectural changes.
- Assisted other teams and practices where architectural expertise was required.
- Planned, implemented, and governed the end-to-end migration of applications, data, and business processes.
- Built the transition and end-state architectures.
Master Data Management: In an endeavor to master its customer data as part of the Flex project, Fleet Complete used Ataccama MDM in a centralized MDM model to master customers and integrate data between CRM and operational systems. Azure Logic Apps were used for bulk and real-time data integration.
Reference Data Management: Fleet Complete intended to have a one-stop shop for all enterprise-owned reference data needs. As with MDM, the Ataccama RDM suite was used, with Logic Apps for data integration.
Data Catalog and Data Quality Dashboard: Fleet Complete intended to profile and catalog its existing data, along with providing a platform for business users to create data quality rules and view the progression of data quality over time in a dashboard. To do this, Ataccama ONE was set up for business users to profile, catalog, and create DQ rules, and a supporting web app, the data quality dashboard, was set up for viewing the DQ results.
MDM Architect: August 2019 – February 2021
- Understood the requirements and developed the solution and project plan.
- Set up the Ataccama environment catering to various business needs.
- Leveraged data management best practices and analysis to design the solution architecture.
- Architected and assisted the development effort for DQ and MDM plans.
- Performed system testing and integration testing.
- Cataloged and profiled data; generated data quality rules and a dashboard to show DQ results.
- Analyzed profiling results to arrive at match/merge and threshold rules.
- Designed the data model for the RDM and MDM solutions.
- Developed and maintained architecture and design documents.
- Developed web services and event handlers for applications to access data from MDM and RDM (see the sketch after this list).
- End-to-end development of MDC processes: data modeling and plans for load, aggregate, merge, and unmerge.
- Developed Logic App orchestrations for data integration.
- Tuned the performance of the MDM and RDM systems to optimize and improve scalability.
- Performed system and integration testing on all the applications developed.
- Developed workflows in MDM for data change management.
- Handled release planning and application deployment.
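To give a feel for the web services mentioned in the list above, here is a minimal sketch of a consuming application fetching a golden record over REST. The base URL, path, and response shape are hypothetical, since the actual services were exposed by the Ataccama server.

```python
import requests

# Hypothetical base URL; the real services were hosted by the Ataccama MDM server.
MDM_BASE_URL = "https://mdm.example.internal/api"


def get_master_customer(customer_id: str) -> dict:
    """Fetch the golden record for a customer from the MDM web service."""
    resp = requests.get(f"{MDM_BASE_URL}/customers/{customer_id}", timeout=10)
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    print(get_master_customer("CUST-12345"))
```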
City of Markham MDM: The City of Markham intends to build an MDM solution to understand its citizens and their needs as part of the Digital Markham initiative. The city wanted a feasibility analysis of various MDM tools versus building an in-house MDM system, based on the existing enterprise architecture and data.
Just Energy MDM: Just Energy replaced its existing CRM system and home-grown MDM system with the Ataccama MDC tool. The engagement entailed establishing the Ataccama environment and analyzing the existing systems and data to build the MDM solution and arrive at an SOW.
Fleet Complete MDM: Fleet Complete wanted to build a new customer master system using the Ataccama suite to cater to its 2025 vision of connecting all systems. This involved the environment setup, identifying data quality rules, and the architecture, design, and development of the MDM application.
Web Hosting App: Developed a web hosting app in Python to dynamically host a WordPress server on Google Cloud Platform and Azure, leveraging the cloud platforms as much as possible.
Data Management Consultant: July 2018 – August 2019
- Understood the requirements and developed the solution and project plan.
- Set up the Ataccama environment catering to various business needs.
- Prepared presentations on the analysis, built the SOW, and presented it.
- Leveraged data management best practices and analysis to design the solution architecture.
- Developed and assisted the development effort for DQ and MDM plans.
- Performed system testing and integration testing.
- Provided training to clients on MDM and Ataccama technologies.
- Studied the advantages and disadvantages of various data management tools.
- Developed Python programs to automate Google Cloud Platform services as needed (see the sketch after this list).
- Learned the Google Cloud Platform stack to understand the best use of the various services provided.
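A minimal sketch of the sort of GCP automation described above, using the google-cloud-storage client to push a file into a bucket; the bucket and file names are hypothetical placeholders.

```python
from google.cloud import storage  # pip install google-cloud-storage


def upload_backup(bucket_name: str, source_path: str, dest_name: str) -> None:
    """Upload a local file to a GCS bucket: one small automation task."""
    client = storage.Client()  # uses application-default credentials
    blob = client.bucket(bucket_name).blob(dest_name)
    blob.upload_from_filename(source_path)
    print(f"Uploaded {source_path} to gs://{bucket_name}/{dest_name}")


if __name__ == "__main__":
    # Hypothetical names for illustration only.
    upload_backup("example-bucket", "site-backup.tar.gz", "backups/site-backup.tar.gz")
```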
Data Migration Project: The Hadoop data lake is the centralized data store in HDFS, with 93 source feeds from a variety of source storage types, namely DB2, Oracle, SQL Server, VSAM, and flat files. The data lake ingestion mechanism is capable of accepting data from both batch and near-real-time sources. The ingestion strategy has two phases: a historical load and a daily incremental load. The historical load is accomplished with Sqoop and Hive HQL, while the incremental delta data is captured using a CDC tool called SQData. Ingestion includes data cleansing using Pig before loading the data into the target store. The data lake storage is built using Hive external tables sourced from Apache Avro files, with additional audit and technical columns.
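A minimal sketch of how one historical-load feed might be scripted: a Python wrapper that invokes a Sqoop import landing Avro files for a Hive external table. The JDBC URL, credentials file, table list, and target paths are hypothetical; the real 93 feeds were configured per source system.

```python
import subprocess

# Hypothetical source and targets for illustration only.
JDBC_URL = "jdbc:db2://db2host:50000/PRODDB"
TABLES = ["CUSTOMERS", "ACCOUNTS"]

for table in TABLES:
    # Sqoop lands the table as Avro files, which back a Hive external table.
    subprocess.run(
        [
            "sqoop", "import",
            "--connect", JDBC_URL,
            "--username", "etl_user",
            "--password-file", "/user/etl/.db2pass",
            "--table", table,
            "--target-dir", f"/data/lake/raw/{table.lower()}",
            "--as-avrodatafile",
        ],
        check=True,
    )
```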
Senior DQ Developer/Data Analyst: February 2018 – July 2018
Cognizant Technology Solutions, Teaneck, NJ
- Understood the requirements and developed the project plan.
- Tracked all project deliveries; planned, executed, monitored, and closed projects as per project lifecycle guidelines.
- Analyzed and understood the source data and identified the feasibility of transformations.
- Developed and enhanced ELF scripts to handle migration needs.
- Developed SQData scripts for CDC and Avro creation.
- Performed system testing on all processes, including automation.
Integrated Regulatory Reporting (IRR): As a mandatory requirement, SunTrust Bank had to provide monthly, quarterly, and annual reports to federal regulators, such as 2052A, Y9C/Call, FR-Y14A, FR-Y14Q, and FR-Y14M. Thus far this task had been done manually by the business enablement team using Excel; due to the tedious manual work, SunTrust wanted to automate it. IRR would be the centralized repository for all reporting data, while another application called KFIRE would generate the reports and the corresponding edit checks. The technologies used in IRR were Oracle as the repository and Informatica Data Quality for data quality checks and ETL transformations.
Senior DQ Developer/Data Analyst: August 2017 – January 2018
Cognizant Technology Solutions, Teaneck, NJ
- Understood the requirements and developed the project plan.
- Tracked all project deliveries; planned, executed, monitored, and closed projects as per project lifecycle guidelines.
- Analyzed and understood the source data and identified the feasibility of transformations.
- Profiled source data to identify data quality issues and arrive at mitigation rules.
- Generated scorecards for business users to view data quality anomalies.
- Worked with the BAs to verify rules and identify requirements.
- Developed routines to extract data from the source, cleanse, standardize, transform, and load it into the IRR tables.
- Developed mappings to generate the data expected by KFIRE and move it to the DTS layer for transmission to KFIRE.
- Developed the extraction logic for the KFIRE loopback to IRR.
- Worked with scheduling teams to set up CA7 jobs to enable automation.
- Developed the functional specification document, design document, and test scripts.
- Established connections between the various sources and IRR.
- Performed system testing on all processes, including automation.
Client Master (MDM): At First Data, Ataccama tools such as MDC and MDA were used to develop an MDM solution from scratch. MDC is the primary MDM tool, which runs on the Ataccama server, while MDA is a web-based application used for data stewardship. Ataccama uses its own proprietary coding standards, built on technologies such as Java/J2EE, XML, CSS, and REST web services. The MDM implementation at First Data also used IDQ, JMS, Elasticsearch, Hadoop, Hive, Impala, and Oracle as the back-end database.
Product Reference Management (PRM): The purpose of this initiative was to create workflows for the maintenance and governance of product reference data for First Data's multiple vendors/clients. It contains well-defined processes and workflows for maintenance and data governance, along with automated notifications supporting those workflows. It uses the Ataccama Reference Data Management (RDM) and Ataccama Data Quality Center (DQC) tools; coding standards are proprietary to Ataccama.
Client Locator: This module was devised to meet a business need: plotting on Google Maps an owner associate's FD office location, as well as locations (both merchant and FD) based on the zip codes passed in. The solution needed to be designed, constructed, and rolled out to production in phases. It implements tables and relationships in the big data hub, security to grant data stewards access to the relevant data, an initial load of First Data office locations into the data hub after cleansing using IDQ's address validator transformation, and finally web services that consume a zip code and publish the nearest FD locations.
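To illustrate the "nearest FD location" lookup at the heart of this module, here is a toy sketch: once a zip code has been resolved to coordinates, the closest office falls out of a haversine distance comparison. The office list and coordinates are made up; the real data lived in the big data hub behind the web service.

```python
from math import asin, cos, radians, sin, sqrt


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))


# Hypothetical office coordinates for illustration.
FD_OFFICES = [
    {"name": "Office A", "lat": 40.89, "lon": -74.01},
    {"name": "Office B", "lat": 40.71, "lon": -74.00},
]


def nearest_office(lat: float, lon: float) -> dict:
    """Return the FD office closest to the coordinates resolved from a zip code."""
    return min(FD_OFFICES, key=lambda o: haversine_km(lat, lon, o["lat"], o["lon"]))


print(nearest_office(40.75, -73.99))  # -> Office B
```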
Technical Manager: January 2017 – August 2017
Cognizant Technology Solutions, Teaneck, NJ
- Analyzed the requirements, determined feasibility, identified technical milestones, and reviewed code.
- Tracked all project deliveries; planned, executed, monitored, and closed projects as per project lifecycle guidelines.
- Profiled data in IDQ and developed scorecards to showcase the quality of data.
- Developed IDQ routines to extract source data, validate addresses, match, and store the results in the data hub.
- Executed DQ plans on the Hadoop cluster using Ataccama BDE (Big Data Engine) for DQ on big data.
- Worked with BAs to finalize match/merge and threshold rules.
- Designed the data model for the PRM and MDM solutions.
- Developed the functional specification document, design document, and test scripts.
- Set up the development, QA, and production environments.
- Developed plans and components in the Ataccama IDE to support the MDM logic.
- Developed web services for applications to access data from MDM and RDM using the Ataccama IDE.
- Developed web services in IDQ for the real-time address validation required by client systems.
- End-to-end development of MDC processes: data modeling and plans for load, aggregate, merge, and unmerge.
- Established connections between Ataccama and Impala/Hive tables on the Hadoop cluster.
- Tuned the performance of the MDM and RDM systems to optimize and improve scalability.
- Performed system testing on all the applications developed.
- Developed workflows in MDM for data change management.
- Developed workflows for PRM to send mails when a change occurs.
- Developed IDQ mappings to extract complete and accurate address data for the Client Locator.
- Engaged Ataccama technical support to address product issues and improve performance and scalability.
Master Data Management: At KeyBank, IBM InfoSphere MDM was used to manage customer data and gather a holistic view of clients in order to engage them more effectively by assigning a dedicated primary office for all their financial needs. KeyBank's master data management also includes a web-based application used for data stewardship, to maintain data integrity whenever manual intervention is needed. Technologies used: IBM InfoSphere, IDQ, Java, J2EE, JSF, Oracle, SOAP web services, RMI, MQ, XML, etc.
Technical Manager: December 2015 – December 2016
Cognizant Technology Solutions, Teaneck, NJ
- Analyzed the requirements to determine the feasibility of the project and identify technical milestones.
- Determined the risks involved and derived mitigation plans for them.
- Set up and configured the development environment.
- Initiated and attended meetings with the business to understand the requirements and arrive at a roadmap.
- Developed the functional specification document and design document from the BRD.
- Analyzed the existing web services to integrate with the project.
- Profiled the source data using IDQ to identify data quality issues.
- Developed IDQ routines to improve the quality of data and stored the results in the MDM source tables.
- Developed MDM transactions, additions, and extensions.
- Performed system performance tuning.
- Was involved in system testing activities.
- Coordinated with the other teams involved in the project.
- Performed coding, development, and unit testing.
- Was responsible for code reviews and testing of the deliverables.
- Developed SQL queries to enable data access from the DB.
- Tracked all project deliveries; planned, executed, monitored, and closed projects as per project lifecycle guidelines.
- Coordinated with the offshore team to enhance the agile development process.
- Worked on a POC to understand how best BPM can be used in MDM and how to leverage its capabilities.
- Was involved in upgrading and scaling the hardware and InfoSphere MDM.
Retirement Planning: Retirement Planning (RHP) was an application for retirement planning. Although it was an existing project, it was not in production due to major defects in the application. It was a stateless session-management application in which I fixed all the defects to move it to production. It was built on Java and involved web services. I was also involved in creating and updating AutoSys jobs.
IOE: This is a new vendor application for insurance order entry that uses IML web services to retrieve data from UBS. It uses Java for development and ServiceMix as the EAI layer. I also worked on learning and understanding ServiceMix to support future development.
Annuity Repricing: This is an enhancement to an existing application called IFS. It is intended to normalize the commission amount received by financial advisors for brokering insurance for their clients. Initially, the commission to advisors varied based on the insurance company. With this enhancement, the commission for a given type of insurance is standardized regardless of the insurance company involved. The application was built using Java and the Struts framework; the back end is a centralized repository on DB2.
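The repricing rule itself can be pictured as a lookup keyed by insurance type rather than by company. The toy sketch below (rates and type names invented) illustrates the idea, not the actual Java/Struts implementation.

```python
# Hypothetical standardized rates: commission is keyed by insurance type only,
# so the issuing company no longer affects the advisor's commission.
STANDARD_COMMISSION_RATES = {
    "VARIABLE_ANNUITY": 0.040,
    "FIXED_ANNUITY": 0.025,
    "LIFE": 0.050,
}


def advisor_commission(insurance_type: str, premium: float) -> float:
    """Commission depends only on the insurance type, not the insurer."""
    return premium * STANDARD_COMMISSION_RATES[insurance_type]


# The same product type yields the same commission regardless of company:
print(advisor_commission("FIXED_ANNUITY", 100_000.0))  # 2500.0
```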
ECL: An existing application used by financial advisors to make alternative investments and fund transfers for clients. I was involved in enhancing this application to incorporate partial funds, upgrade the application's workflow, and integrate with other enterprise systems to check for client registrations. The application was built on Java using JSF, Struts, and Spring, and uses Adobe LiveCycle to generate documents. Its back-end layer is an Oracle instance, and it also communicates with the enterprise-level centralized repository on DB2.
IFactory and IML: These are existing applications that work as information service providers to other applications and were going through an end-of-life phase in which they were migrated from WebSphere Application Server 6.1 to 8.5, Java 1.6 to 1.7, and Oracle 9 to 11g. I was involved in the migration of these applications.
Financial Plan Billing: Existing financial planning tools help clients meet their life goals through proper management of their finances. Life goals can include buying a home, saving for a child's education, or planning for retirement. The planning tools gather relevant financial information, set goals, review the client's current financial status and tolerance for risk, and help formulate a plan to meet the life goals. UBS wanted to automate the process of billing the clients who required financial planning. This process involved interaction with various other systems, such as the generic billing system, the financial advisor compensation system, and the client and relationship system. The billing process involved a workflow, as it required frequent manual interactions such as client signatures and approvals from authorities. The design and development incorporated all of the above.
Architect/Lead Developer: August 2013 – November 2015
Trident Consulting Inc via HCL Americas, Dublin, CA
- Architected and developed projects at both the application level and the enterprise level.
- Initiated and handled discussions with various teams for enterprise application integration.
- Led a team on a complex project to bring the application to closure.
- Analyzed the requirements to determine the feasibility of the project and identify technical milestones.
- Determined the risks involved and derived mitigation plans for them.
- Estimated the effort required to develop the project using FPA and WBS techniques.
- Project planning, tracking, and documentation.
- Set up and configured the development environment.
- Set up the code integration environment using SVN for teams both onsite and offshore.
- Initiated and attended meetings with the business to understand the requirements and arrive at a roadmap.
- Was responsible for designing the project in accordance with the existing enterprise architecture.
- Developed the functional specification document and design document from the BRD.
- Developed the base code to support further development using RAD.
- Analyzed the existing web services to integrate with the project.
- Developed web service clients for various web services using Axis and CXF.
- Altered and enhanced Adobe LiveCycle processes as per business needs.
- Created and modified Adobe LiveCycle forms for dynamic document generation.
- Was responsible for application performance and system performance tuning.
- Was involved in system testing activities.
- Coordinated with the other teams involved in the project.
- Performed coding, development, and unit testing.
- Was responsible for code reviews and testing of the deliverables.
- Migrated the application from WebSphere Application Server 6.1 to 8.5.
- Planned and deployed all releases and change management.
- Developed SQL queries and stored procedures to enable data access from the DB.
- Was responsible for coordinating deployments in different environments such as QA/UAT/Stage.
- Coordinated with the offshore team to communicate requirements and tasks and to enhance the agile development process.
Home Depot runs an e-commerce website. I was involved in the dot-com checkout flow, extending from the cart to payment, which required commerce skills, and in the Business Control Dashboard, which required Java skills and business expertise. The dot-com website catered to multiple fulfillment models, such as buy online ship to store, buy online pick up in store, ship to home, and appliances. It also includes Close Quote, in which the user is given a quote for a customized product and payment is made online. The Business Control Dashboard provided a platform for business users to control the visibility of a product and to provide offers and clearances at the store-product level.
Onsite Coordinator: February 2013 – July 2013
Mindtree Limited, Bangalore, India
- Analyzed the current system and the enterprise architecture.
- Was involved in sizing and effort estimation for projects.
- Developed the functional specification document and design document from the BRD.
- Analyzed defects and provided design approaches for fixing them.
- Fixed and tested defects.
- Prepared and executed test cases.
- Coordinated with offshore team members.
- Created sequence diagrams, class diagrams, and use-case diagrams for the system using MS Visio.
I was an integral part of various projects for this client, which gave me complete exposure to various aspects of the software development life cycle for both development and maintenance projects. Below are some of the projects I was involved in with this client.
EDelivery: This is a service built using WebSphere Process Server to orchestrate across various services and enable invoices to be delivered electronically. The systems that generate invoice transactions are integrated with eDelivery to deliver the invoices electronically across Chartis Domestic and archive them in a common document repository.
NGPS: This service was built using WebSphere Process Server and Adobe LiveCycle. It enables the runtime creation of PDF documents and stores them in the common document repository, along with updating their metadata in a relational database. It orchestrates various web services to achieve this, and also supports fetching documents.
RatingService: This is a web service built using CGI Ratabase. It is a rating engine used by underwriting applications to calculate premiums for different insurance products based on a set of criteria requested by the client.
ESurety: This web-based application was built on Java/J2EE technology with Sybase as the database. A surety bond is a promise to pay one party (the obligee) a certain amount if a second party (the principal) fails to meet some obligation, such as fulfilling the terms of a contract. The project incorporates most insurance areas, such as surety, co-surety, reinsurance, contracts, PML, etc.
Module Lead: December 2012 – January 2013
Mindtree Limited, Bangalore, India
- Was involved in preparing the proposal for the eDelivery project.
- Analyzed the current system and the other services it interacts with.
- Analyzed functional requirements and prepared technical specifications for all modules.
- Designed the inputs and outputs to and from the orchestration layer.
- Designed and implemented the BPEL process flows and the ESB mediation layer.
- Invoked the ESB layer services from BPEL based on the request.
- Created SCA and web services components in WID.
- Implemented routing and transformation logic in the BPEL layer.
- Created sequence diagrams, class diagrams, and use-case diagrams for the system using MS Visio.
Senior Engineer: July 2010 – November 2012
Mindtree Limited, Bangalore, India
- Performed high-level business analysis and coordinated with the actuarial team, business analysts, and customers to prepare functional/technical specifications.
- Conducted training within AIG and Mindtree on customizing the products using the tools.
- Designed new products and interfaces in the Rating Service for underwriting systems using Ratabase® Product Builder and Ratabase® Calculator.
- Performed application performance tuning.
- Conducted feasibility studies and design for support and enhancements.
- Created and developed rates, rules, and fields in CGI Ratabase for the new products.
- Updated rates, rules, and fields in the CGI Ratabase tool for existing products as per the BRD.
- Maintained versions of rates, rules, and fields for the new and existing products.
- Configured the rates and rules for underwriting products.
- Reviewed the design documents and implementations created by the developers for various enhancements.
- Designed the interfaces to communicate with other AIU Holding underwriting systems.
- Deployed, integrated, and built J2EE components on WebSphere, and managed source code configuration using PVCS.
- Managed maintenance, production support, and user rollout.
Software Engineer: January 2008 – June 2010
Mindtree Limited, Bangalore, India
- Captured requirements for various enhancements and performed key SDLC activities such as development and testing; developed algorithms to calculate premium reviews.
- Was involved in system testing activities, including system test case preparation.
- Was involved in designing specification templates/guidelines and populating them.
- Analyzed business use cases provided by the business community.
- Gathered requirements from the various stakeholders who intended to showcase the tool.
- Provided solutions as per the requirements gathered.
- Clarified business requirements with the business and other interfacing teams involved in the project.
- Coordinated with the other teams involved in the project.
- Performed coding, development, and unit testing.
- Configured various environments.
- Coordinated integration and system performance activities.
- Planned and deployed all releases and change management.