Results-oriented Technical Solutions Architect with extensive expertise in AWS cloud infrastructure and Adobe Experience Platform suite. Proven track record of delivering innovative solutions that enhance campaign ecosystems and drive organizational success. Adept at transforming technical requirements into impactful strategies, fostering collaboration across teams to achieve business objectives. Committed to leveraging technology to create sustainable improvements in digital marketing initiatives.
Eager to apply and expand expertise in Adobe Experience Platform and Adobe Campaign Classic, specializing in audience creation, segmentation, and campaign development. Demonstrated success implementing efficient architectures that enhance marketing strategies. Strong commitment to collaboration and continuous improvement in fast-paced environments.
Techno-Functional Sr. Business Systems Analyst with 17 years of experience in the telecom sector, excelling as a Scrum Product Owner. Proven track record in analyzing market data and trends to develop effective business strategies. Expertise in various billing systems, data integration tools, and reporting software enhances the ability to drive operational efficiency and improve decision-making processes. Aiming to leverage extensive experience to contribute to innovative projects that shape the future of telecommunications.
Accomplished project manager with a proven track record in leading complex initiatives within the Telecom and Logistics sectors. Demonstrates strong leadership and technical prowess, driving successful project completion while enhancing operational efficiency. Recognized for problem-solving capabilities that contribute to achieving strategic business objectives.
Specializes in BSS/OSS, AWS technologies, and ETL-based data migration solutions.
Experienced in gathering, analyzing, and documenting business requirements, and in developing requirement documents and specifications for large Scrum teams across the organization in the form of Solution Wikis, JIRA tickets, BRDs, and LLDs.
Skilled data modeler for enterprise data warehouse systems, with exposure to various data marts and experience working on data quality initiatives.
Business Process Analyst with extensive experience in identifying and documenting existing business processes. Proficient in creating data models, behavior models, and process flow diagrams using tools such as PlantUML, MIRO, and Figma. Known for providing clear direction to delivery teams to enhance operational efficiency. Aiming to leverage analytical skills to drive process improvements within a dynamic organization.
Overview
18 years of professional experience
1 Certification
Work History
Solution Designer & Architect
Comcast
Philadelphia, PA
07.2022 - Current
Consumer Domain: Developed solution architectures for various Adobe Experience Platform (AEP) use cases on AWS cloud, with outbound and inbound data ingestion to and from the Xfinity Consumer Data Platform (XCDP) audience platform.
The architecture centers on personalization segments built in AEP with qualification rules targeting audiences by region, location, contact, ECID, etc., which are then used to display personalized content on outbound channels such as the xfinity.com UI and Xfinity app screens.
Built solutions under the XCDP profile-centric model for contact interaction event ingestion, used to build profiles in the XCDP audience ecosystem.
Ultimately, the profiles and audiences were ingested into the XCDP Campaign Hub (XCH), part of the Adobe Campaign Classic (ACC) suite; this tool builds scheduled or on-demand communications targeting omnichannel campaigns such as direct mail, email, and SMS.
Designed and architected solutions for AWS cloud-based and open systems/tools such as Databricks for data ingestion, and AWS S3 for data storage and migration.
Additionally, built designs utilizing AWS Lambda processors to perform the processing that brings data into the XCDP MarTech ecosystems for both XCDP Audience (AEP) and XCDP Campaign Hub (ACC); see the sketch after this domain's bullets.
Developed API-based solutions for a feedback loop in which all customer presentations on the digital platform, and the customer's actions, were captured via JavaScript and pushed through a Feedback Loop event API endpoint to feedback analytics, where they are read by an Enterprise Business Intelligence (EBI) decision engine. Built an automated ranking arbitrator for the content displayed to the same consumer as banners on their Home, Learn, and My Account pages.
Built solutions that read customer and consumer intent on the UI and app from engagements on each page visit, URL click, banner click, etc., via the Adobe Web SDK, which writes those events to Adobe Experience Platform (AEP) to enable personalized content and deal displays on the UI, upselling various Comcast lines of business (LOBs) such as Internet, XUMO, Xfinity Mobile, xFi Gateway, etc.
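For illustration, a minimal sketch of the Lambda processor pattern described above, assuming an SQS-style trigger and AEP's streaming-ingestion collection endpoint; the inlet ID, schema reference, and event field names are hypothetical placeholders, not the actual XCDP configuration.

```python
import json
import urllib.request

# Placeholder AEP streaming-ingestion inlet; {INLET_ID} is hypothetical.
AEP_INLET_URL = "https://dcs.adobedc.net/collection/{INLET_ID}"

def lambda_handler(event, context):
    """Reads interaction events (assumed SQS-style records) and forwards
    each one to an AEP inlet as an XDM-style JSON payload."""
    for record in event.get("Records", []):
        interaction = json.loads(record["body"])
        payload = {
            "header": {"schemaRef": {"id": "https://ns.example.com/schemas/interaction"}},
            "body": {
                "xdmEntity": {
                    "identityMap": {"ECID": [{"id": interaction["ecid"]}]},
                    "web": {"webPageDetails": {"URL": interaction["pageUrl"]}},
                    "eventType": interaction.get("eventType", "web.webpagedetails.pageViews"),
                }
            },
        }
        req = urllib.request.Request(
            AEP_INLET_URL,
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)  # fire-and-forget for this sketch
    return {"statusCode": 200}
```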
Commerce Domain (Learn > Plan Builder > Buy > Check Out > Order Confirmation): Designed solutions and built various use-case designs for both the front-end UI and the backend of the xfinity.com buy-flow process, spanning the Learn, Plan Builder, Buy, and Checkout modules, using various APIs and Databricks jobs.
Built solutions that interact with the Xfinity Consumer Data Platform (XCDP) consumer-domain personalization tool to retrieve personalized content for both customers and prospects, enabling qualified content display on xfinity.com Learn pages via XCDP account-, location-, and ECID-based on-prem APIs.
For the buy-flow process, architected design solutions enabling Plan Builder pages to consume the Enterprise Decision Engine (EDE) offer engine and rules engine, with various qualification modules, to present offers suited to the individual while they build their purchase plan in Plan Builder.
Designed solutions for the checkout flow and for identity creation, identity management, and identity consent modules, based on an in-house IAM DynamoDB solution, under the sales flows of the xfinity.com order checkout process.
Developed end-to-end solutions for various rules-based engines within the Enterprise Decision Engine (EDE) that use qualified audiences from Adobe Experience Platform (AEP); a sketch of this qualification pattern follows.
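A minimal sketch of the rules-based audience qualification pattern described in the EDE bullet above; rule fields, segment names, and offer IDs are hypothetical, not the production EDE rule model.

```python
from dataclasses import dataclass

@dataclass
class OfferRule:
    offer_id: str
    required_segments: set[str]   # AEP audience segments the profile must be in
    excluded_segments: set[str]   # segments that disqualify the profile

def qualify_offers(profile_segments: set[str], rules: list[OfferRule]) -> list[str]:
    """Return offer IDs whose rules the profile's segment memberships satisfy."""
    return [
        r.offer_id
        for r in rules
        if r.required_segments <= profile_segments
        and not (r.excluded_segments & profile_segments)
    ]

# Example: a broadband-eligible prospect without a gateway sees the gateway upsell.
rules = [OfferRule("xfi-gateway-upsell", {"broadband-eligible"}, {"existing-gateway"})]
print(qualify_offers({"broadband-eligible", "new-prospect"}, rules))  # ['xfi-gateway-upsell']
```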
Business System Analyst
Comcast
Philadelphia, Pennsylvania
01.2021 - 01.2022
Xfinity Mobile backend data architecture: Worked on the team that runs the data hub for reporting users and the supply chain/device team in Comcast's mobility organization, providing order status, order-placement capability, and other reporting functions
Translated business EPICs into technical sprint user stories for the development and test teams as part of the Scrum team/ART (Agile Release Train)
Worked with Solution Architects/Solution Designers to understand the business requirements and the proposed solution, created epics and user stories from them, and proposed or refined the solution where a better approach existed
Worked on defining JSON payload/contract structures for the business/domain/BPM events published for each transaction between front-end and backend interactions, and designed requirements/user stories for storing them in big data and AWS S3 platforms; an example contract appears after this list
Worked on Swagger UI following requirements design, defining JSON response structures for API use
Created a data flow model spanning Sales > Ordering > Provisioning > Billing > Service Assurance for telecom order processing
Worked with and mentored the Scrum team, planning story points and scrum schedules for each user story, design task, and test-plan task
Facilitated user story grooming for each sprint with the Dev/QA/ETE and SD teams and helped them size stories
Participated in planning sessions and shared guidelines for planning tickets into upcoming sprints based on defined business priority
Created data models on the database side to provide data storage for reporting and further user consumption
Created wireframes for the Xfinity Mobile data visualization and predictive analytics Tableau dashboard design
Created design requirements for data migration from Hadoop to the AWS S3 data store
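An example of the JSON event contract style described above; the event type, field names, and values are illustrative, not the actual Xfinity Mobile schema.

```python
import json

# Hypothetical domain/BPM event published per front-end-to-backend transaction,
# then landed in big data / AWS S3 for downstream reporting.
order_event = {
    "eventHeader": {
        "eventType": "ORDER_STATUS_CHANGED",
        "eventId": "evt-000123",
        "timestamp": "2021-06-15T14:32:00Z",
        "source": "order-management",
    },
    "eventBody": {
        "orderId": "XM-98765",
        "previousStatus": "SUBMITTED",
        "currentStatus": "SHIPPED",
        "lineItems": [{"sku": "PHONE-128GB", "quantity": 1}],
    },
}

print(json.dumps(order_event, indent=2))
```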
Business System Analyst
AT&T
Middletown, New Jersey
02.2016 - 01.2021
OSP (Order Status Platform): OSP is a reporting tool and set of APIs designed to give customers end-to-end order status reports
In the AT&T enterprise system, various services and offers are provided to enterprise customers and users across billions of transactions, yet there was no single place for visibility into order status across the order life cycle; this tool was the designated common target platform (aggregating 200+ underlying ordering and service delivery applications) providing a complete report of the order life cycle
This tool (API and UI layers) is also built for users and customers with extensive search capability by various key identifiers of an order, such as order number, circuit ID, asset ID, sales ID, address, customer name, etc. It also reports the delta view of every order when it is supped (supplemented) or moved in/out, plus MACD (Modify, Add, Change, Disconnect) process status for all orders. The tool also drives customer notifications, with digital notifications or touch-point letters about key milestones/checkpoints
This tool (API and UI layers) was also built to show color coding on Order Health and Order Milestone Health, plus a pizza-tracker view showing order movement status across the order life cycle
Single Customer View (360 degree): SCV is a 360-degree, billing-only reporting tool that reports MRC (Monthly Recurring Charges), NRC (Non-Recurring Charges), OTC (One-Time Charges), etc., for all AT&T enterprise customers
The tool is built to capture and integrate various billing application modules (150+) across diverse data forms, such as mainframe data, relational data, file systems, NoSQL database data, etc., and transform them into a relational data model to build a 360-degree report for the customer
360-degree reports cover billing status, fallouts, and compensation reports, captured in both web and mobile views
This also helps AT&T analytics build revenue-generation intelligence, winning more customers by offering new services feasible for them based on their billing trends
Transformed complex client business use cases into Business Requirements Documents (BRDs) and solution diagrams, and then into software technical documents, creating, modifying, and enhancing the client's business requirements into formats per current standard protocols, whether Agile tooling or Waterfall methodology structures (Word, Excel, etc.), for handoff to the development and testing teams
Conducted interviews with multiple stakeholders to capture requirements and create Business Requirements Documents (BRDs) and solution diagrams
Translated business EPICs into technical sprint user stories for the development and test teams as part of the Scrum team/ART (Agile Release Train)
Defined scope and documented BRDs for new enhancements based on requirements elicitation, operational constraints, information system architecture, and project risks in Waterfall projects
Elicited, derived, documented, prioritized, and gained approval of requirements with business stakeholders as well as system delivery teams
Documented the system-specific user role matrix in terms of CRUD (create, read, update, delete) and SoD (segregation of duties), plus state diagrams and data dictionaries, for the OSP project for the AT&T client
Created ETL data-load HLDs (High-Level Designs) with transformation logic and rules covering various underlying applications, with source databases such as Oracle, SQL Server, and Sybase and source file systems such as text, Excel, and XML files
Created rules requirements for rigorous order health and order milestone health color coding, based on underlying data fields feeding a Drools engine, to show a RED, YELLOW, GREEN, or BLUE pizza tracker indicating order status and the action needed from the order taker or service assurance personnel; a sketch of this pattern appears after this job's bullets
Created BSS requirement designs in discussion with the business, along with underlying OSS design documentation and UML diagrams produced with Erwin Data Modeler and the Enterprise Architect tool
Created user stories and MVPs (Minimum Viable Products) from business EPICs by conducting interactive scrum/grooming sessions with business users, product owners, Tier 1 architects, and epic owners
Worked with and mentored the Scrum team, planning story points and scrum schedules for each user story, design task, and test-plan task
Supported quality assurance efforts in software production and system testing environments with analysis, investigation, and resolution within strict timeframe targets, using investigative tools such as JDeveloper, SoapUI, Swagger, etc.
Worked on Swagger UI following requirements design, defining JSON response structures for API use
Provided IT application support for the Agile IT lifecycle process as application Scrum Master, with the flexibility to continuously support the Waterfall process
Delivered requirement designs for microservices using Lean Agile and SAFe processes
Provided application business design requirements for real-time REST APIs for order viewing, ordering, and provisioning
Created technical requirements for reading various file formats and source mainframe data and translating them into the desired SCV relational data model
Designed requirements and solutions for AT&T customers to view end-to-end order status using SOV (Single Order View), MOV (Multiple Order View), and mobile views
Advised on and reviewed system documents such as requirements and design specifications, risk assessment summaries, data migration plans, pre/post-execution scripts, traceability matrices, release-to-production statements, etc.
Provided data models in Oracle, using a star schema and a fact-and-dimension model
Experienced with CA Erwin Data Modeler and Erwin Model Manager, creating conceptual, logical, and physical data models and maintaining model versions in Model Manager for further enhancements
Designed logical models for project consumption, derived from the end-to-end data model over the entire data set, using the Enterprise Architect tool
Created AID (Application Interface Design) documents establishing mutual agreements between various source applications and the OSP and SXP applications, covering data transfer mechanisms, SLAs, environment configurations, etc.
Created JSON (JavaScript Object Notation) data response output for API responses presented to graphical UIs
Created DMaaP (Data Movement as a Platform) AIDs (Application Interface Design) for interfacing applications to consume OSP data services
Created REST (Representational State Transfer) API (Application Programming Interface) AIDs (Application Interface Design) for interfacing applications to consume OSP data services
Created a data flow model spanning Sales > Ordering > Provisioning > Billing > Service Assurance for telecom order processing
Performed business-level manual beta testing as a beta user to certify the reports before user GA (General Availability)
Built requirements for business intelligence reports with the Business Objects reporting tool and Looker
Assisted UI/UX developers in building wireframes for reporting tools
Supported UAT (User Acceptance Testing) and audits, and managed data stewardship
Strong database, data warehousing, business intelligence, and ETL process knowledge, including batch processes (FTP, SFTP, C:D, DMaaP)
Experienced in decomposing interdependent system interactions, ETL batch jobs, etc., and remapping them to functional requirements
Experienced in generating technical requirements and translating them into executable user stories
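A sketch of the order health color coding and JSON API response work described above; thresholds, field names, and the sample response are hypothetical (the production rules were requirements fed to a Drools engine, not Python).

```python
import json

def order_health_color(days_past_due: int, has_fallout: bool, completed: bool) -> str:
    """Hypothetical health rules for the pizza-tracker view."""
    if completed:
        return "BLUE"      # milestone complete
    if has_fallout:
        return "RED"       # fallout requires immediate action
    if days_past_due > 0:
        return "YELLOW"    # past committed date, at risk
    return "GREEN"         # on track

# A UI-facing JSON response might carry the computed health per milestone:
response = {
    "orderNumber": "ATT-0042",
    "milestones": [
        {"name": "Ordering", "health": order_health_color(0, False, True)},
        {"name": "Provisioning", "health": order_health_color(2, False, False)},
    ],
}
print(json.dumps(response))
```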
Systems Analyst
IBM
Bangalore, India
04.2007 - 02.2016
SXP (Services eXpress Platform): SXP is a backend data integration tool designed to integrate data from various AT&T applications into a complete data model for billing reports and compensation failures
It is designed to integrate order status and history into billing failure calculations, using ETL (Extract, Transform, Load) tools and mechanisms with both bulk and near-real-time loads into the underlying database, and to provide the information to reporting UIs (User Interfaces) and APIs (Application Programming Interfaces) as well as to the compensation engine that compensates customers
The tool also provides sales-order-to-service-delivery-order failure reports, known as order hand-off fallout, with various pending-order check methods and processes
The tool was also built to complete pending-order checks with many validation processes, such as site validation, address validation, asset validation, customer information validation, etc.
GCP (Global Computing Platform): GCP is a data platform for service assurance covering AT&T assets and site-related status and reports
It supports classic AT&T/BellSouth legacy customers for Frame Relay, Nodal, ATM, and Private Line data services
It provides statistics on the legacy networks and devices of AT&T classic customers, along with opportunities to reuse them
The application captures status and fault-management processes at the network layer for devices and assets, covering faults reported by technicians during site validation, asset activation, etc.
Around 290+ provisioning and service assurance platforms across AT&T globally provision placed orders through site validation, address validation, etc., and assure service for existing customers; GCP surfaces service assurance and fault issue data to those technicians on the GCP reporting UI
EDF (Enterprise Data Fabric): EDF is a data mart of transformed data, fed by various data integration tools and designed to integrate data from the full range of AT&T applications
It is the single, complete data holder for all 600+ AT&T applications, with heavily transformed data; this data mart feeds every other application that needs various data sets for various purposes
The data mart was built to eradicate data scattered across the 600+ applications' own data stores, storing it instead on the EDF data platform with business value applied
There are 165+ ordering platforms across AT&T globally ordering fiber-based, DSL-based, VoIP-based, broadband-based, and legacy nodal services; EDF centralizes these into the data mart
There are 290+ provisioning and service assurance platforms across AT&T globally that provision placed orders through site validation, address validation, etc.; because that data was scattered and not visible from any single data mart, EDF was built to store everything in one place and provide the necessary reports, or feed third-party applications for their own reporting
There are 150+ billing platforms across AT&T globally that handle all customer billing, fallouts, charging, etc., across various file systems and data technologies that were inaccessible for data analytics and failure reporting; the EDF data mart stores all of this data in readable form and provides the necessary APIs, UIs, etc., for analytics
Developed design requirements for the GCP data mart for cross-domain inventory and order data linkage and lineage reports
Developed supporting data model designs for VoIP- and broadband-related asset data structure models in MS Visio
Worked with the environment team to create documentation on server-specific FTP and SFTP file transfer mechanisms
Reverse-engineered legacy Perl scripts to document the logic involved and transform it into business and high-level design requirements
Decomposed, designed, documented, and recomposed business processes from end-of-service-life UNIX and PL/SQL code in the AT&T IT infrastructure into ETL DataStage batch jobs; a sketch of this kind of transformation appears after this job's bullets
Created BSS requirement designs in discussion with the business, along with underlying OSS design documentation and UML diagrams produced with Erwin Data Modeler and the Enterprise Architect tool
Documented the system-specific user role matrix in terms of CRUD (create, read, update, delete) and SoD (segregation of duties), plus state diagrams and data dictionaries, for the GCP project for the AT&T client
Designed logical models for project consumption, derived from the end-to-end data model over the entire data set, using the Enterprise Architect tool
Created AID (Application Interface Design) documents establishing mutual agreements between various source applications and the GCP, SXP, and EDF applications, covering data transfer mechanisms, SLAs, environment configurations, etc.
Strong database, data warehousing, business intelligence, and ETL process knowledge
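A minimal sketch of the extract-transform-load pattern behind the DataStage batch-job requirements described above; source columns, code mappings, and sample rows are hypothetical stand-ins for the documented transformation rules.

```python
SAMPLE_EXTRACT = [  # e.g., rows parsed from a mainframe or flat-file extract
    {"CUST_NO": " 1001 ", "CHG_CD": "M", "AMT": "49.990"},
    {"CUST_NO": " 1002 ", "CHG_CD": "O", "AMT": "9.95"},
]

def transform(row: dict) -> dict:
    """Apply documented rules: normalize keys, map charge codes, cast types."""
    return {
        "customer_id": row["CUST_NO"].strip(),
        "charge_type": {"M": "MRC", "N": "NRC", "O": "OTC"}.get(row["CHG_CD"], "UNKNOWN"),
        "amount_usd": round(float(row["AMT"]), 2),
    }

def run_batch(extract_rows: list[dict]) -> list[dict]:
    """The load step would write to the conformed relational model; here we collect."""
    return [transform(r) for r in extract_rows]

print(run_batch(SAMPLE_EXTRACT))
```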