Hi, I’m

CHIRANJIB DAS

Cloud Data Solutions Architect
Rocky Hill, Connecticut

Summary

As a Data & Analytics Cloud Solutions Architect at Cognizant, I have over 17 years of experience in data projects, business intelligence, cloud migration, and AI/ML projects in the Healthcare and Retail sectors. I have extensive experience with data insights in Healthcare clinical analytics and the CPG supply chain and sales domains. I hold GCP and AWS certifications and have worked onsite in the USA and Australia.

Core competencies include developing comprehensive migration strategies, reverse engineering existing systems, designing and implementing cloud solutions using various cloud platforms and tools, defining IT roadmaps and architecture, and working with cloud security and DevOps teams. I am passionate about delivering seamless, scalable solutions that meet clients' business needs and goals, and enthusiastic about learning and applying new technologies such as AI and ML to enhance data and analytics capabilities.

Focused project manager adept at planning, directing, and maintaining continuous operations across departments, keeping delivery efficient and in accordance with quality standards. Applies a creative, analytical approach to operations for continuous process improvement. Skilled at identifying or anticipating problems and providing solutions, and excels at mentoring, training, and empowering teams to perform at their best.

Overview

17
years of professional experience
6
certifications

Work History

Cognizant Technology Solutions US Corp

Manager- Projects
2013.11 - Current (10 years & 10 months)

Job overview

Cloud Data Solutions Architect

Tata Consultancy Services LTD

IT Analyst
2011.03 - 2013.11 (2 years & 8 months)

Job overview

Senior BI Consultant

Cap Gemini India Pvt LTD

Consultant
2010.06 - 2011.03 (9 months)

Job overview

Teradata Developer

Amdocs

Senior Subject Matter Expert
2010.01 - 2010.06 (5 months)

Job overview

Teradata Developer

Tech Mahindra Ltd (Formerly Satyam Computer Services Ltd)

Software Engineer
2006.11 - 2010.01 (3 years & 2 months)

Job overview

Teradata Developer

Education

West Bengal University of Technology, Kolkata, India

Bachelor of Technology in Computer Science and Engineering
08.2006


Skills

  • Cloud architecture
  • Data warehouse
  • Big Data Tools
  • Project Management
  • Data Migration
  • Technical Support
  • Software Development
  • Data architecture
  • Client Relationship
  • Product Development
  • Business Development
  • AI/ML business insights

Accomplishments

  • Received best-performance recognition from Kellogg and PepsiCo clients for AWS and Azure Delta Lake implementations.
  • Completed the PepsiCo cloud migration project on time and $1.2M under budget.
  • Supervised a team of 20+ staff members.
  • Designed and launched the Kellogg Kortex cloud transformation project, resulting in a 43% decrease in Total Cost of Ownership (TCO) for the company.
  • Achieved 60% time savings for internal users at Aetna by creating a chatbot with a ChatGPT LLM that integrates internal GitHub and ServiceNow.
  • Reduced GCP monthly billing costs by 38% by introducing dedicated Compute projects for apps and ML platforms, and tuned bottlenecks by analyzing GCP billing cost graphs, contributing substantially to TCO reduction.

Certification

  • Google Cloud, Google Certified Professional Machine Learning Engineer, 2024
  • Databricks, Accredited Generative AI Fundamentals, 2024
  • Google Cloud, Google Cloud Architect, uZ4kLh, 2022
  • Google Cloud, Associate Cloud Engineer, N1D2vu, 2022
  • Amazon Web Services, AWS Certified Big Data - Specialty, 1PNEGGKC3FEQ1CW4, 2020
  • Teradata, Teradata Certified Master (V2R5), 2010

Current Project


CVS Health (Aetna) Cloud Transformation (on-prem to GCP migration)

  • As a Cloud Solutions Architect, actively involved in all phases (discovery, pilot, ready state) of migrating existing mission-critical business applications into a GCP cloud-based environment. Activities required to re-host an application into the cloud include architecture modifications, database and/or application server re-hosting, and potentially recoding existing capabilities to take advantage of cloud platform services.
  • Reverse engineered 40+ applications from the existing Hadoop system and built the roadmap for GCP BigQuery migration. Planning the pilot-phase data migration for 3 applications, collecting data sets that meet functional and non-functional business requirements, and working with data and analytics experts to drive greater functionality in the data system.
  • Guiding a team of 20+ members across 3 tracks covering reverse engineering, mapping creation, and data pipeline creation. Building the strategy for historical data migration (using existing data in Hadoop and SQL Server) with minimal changes to preserve existing KPIs, plus new data pipelines for ongoing (delta) data ingestion.
  • Introduced an internal chatbot using a ChatGPT LLM, integrating internal GitHub and ServiceNow behind a user prompt API, which reduced the time users spend searching the internal knowledge base by 40%.
  • Working closely with customer Directors and VPs, assisting with continuous improvement of program delivery and with cost optimization by introducing AI-powered solutions.
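For illustration, the chatbot integration described above follows a retrieval-then-prompt pattern. The sketch below is a hypothetical, simplified stand-in: the function names (`search_github`, `search_servicenow`) and the prompt shape are invented placeholders, not the actual Aetna integration or APIs.

```python
# Hypothetical sketch of a knowledge-base chatbot that gathers context
# from internal sources before handing the question to an LLM.
# All source/function names are illustrative, not the real system.

def search_github(query):
    # Stand-in for an internal GitHub code/wiki search call.
    return [f"github hit for '{query}'"]

def search_servicenow(query):
    # Stand-in for a ServiceNow knowledge-base API call.
    return [f"servicenow article for '{query}'"]

def build_llm_prompt(query, documents):
    """Combine retrieved documents with the user question so the LLM
    answers from internal context (retrieval-augmented style)."""
    context = "\n".join(documents)
    return (f"Context:\n{context}\n\n"
            f"Question: {query}\nAnswer using only the context.")

def answer(query):
    # Retrieve from both internal sources, then build the LLM prompt.
    docs = search_github(query) + search_servicenow(query)
    # In the real system this prompt would be sent to an LLM API;
    # here we simply return it for inspection.
    return build_llm_prompt(query, docs)
```

The time savings come from the retrieval step: users ask one question instead of searching two systems by hand.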

Previous Experience


Project: PepsiCo Teradata to Synapse Migration

  • As Cloud Solutions Architect, actively involved in a program geared towards migrating existing mission-critical business applications into a cloud-based environment. Activities required to re-host an application into the cloud may include architecture modifications, database and/or application server re-hosting, and potentially recoding existing capabilities to take advantage of cloud platform services.
  • Worked as Azure architect and data management techno-functional lead from offshore; finalized the approach for historical and ongoing incremental data loads; reverse engineered existing Teradata jobs to build the inventory for application migration.
  • Created detailed design documents for pilot apps and developed pilot-phase solutions to migrate the 'Power Of One' Sales and Marketing KPI application from Teradata to Azure Synapse, converting existing Teradata BTEQ logic into Azure Databricks PySpark code.
  • Moved Teradata ACQ-layer data into Bronze using the S&T framework, then built the Silver and Gold layers inside Delta Lake, finally moving the data into Synapse using stored procedures. Worked on production go-live/cutover for pilot apps; mentored a team of 10+ offshore members for seamless delivery; collected data sets meeting functional and non-functional business requirements; worked with data and analytics experts to drive greater functionality in the data system; prepared discovery and analysis of future applications for upcoming sprints.
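As a rough illustration, the Bronze/Silver/Gold (medallion) flow used in this project can be sketched in plain Python. This is a hypothetical stand-in for the actual PySpark/Delta Lake jobs; the field names (`region`, `sales`) and the sample data are invented for the example.

```python
# Illustrative medallion-architecture flow: raw records land in Bronze,
# are cleaned into Silver, and aggregated into Gold (KPI-ready).
# Plain-Python stand-in for the PySpark/Delta Lake jobs; field names
# are hypothetical.

def to_bronze(raw_rows):
    # Bronze: land source data as-is, tagging lineage.
    return [dict(row, _source="teradata_acq") for row in raw_rows]

def to_silver(bronze_rows):
    # Silver: drop malformed rows and normalize types/values.
    return [
        {"region": r["region"].strip().upper(), "sales": float(r["sales"])}
        for r in bronze_rows
        if r.get("region") and r.get("sales") is not None
    ]

def to_gold(silver_rows):
    # Gold: aggregate to the KPI grain (here, sales by region).
    totals = {}
    for r in silver_rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["sales"]
    return totals

raw = [
    {"region": " east ", "sales": "10.5"},
    {"region": "east", "sales": "4.5"},
    {"region": None, "sales": "99"},   # malformed; filtered out in Silver
]
gold = to_gold(to_silver(to_bronze(raw)))
# gold == {"EAST": 15.0}
```

Each layer only reads from the one before it, which is what lets historical loads and ongoing delta ingestion share the same downstream logic.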

Project: Kellogg Kortex Migration

  • As Cloud Solutions Architect, strategist, and AWS lead, re-engineered applications to migrate from on-premises and external file systems to AWS cloud services, which involved rebuilding the data warehouse on AWS for access by all reporting services and the analytics team.
  • Classified the existing tables into domain groups (primarily Master Data and Sales data) while re-engineering and grouping tables to build a domain granular model. Involved in Kellogg-specific data model activities, restructuring tables by domain and region. Architected, developed, and prototyped frameworks to bring data from on premises to the cloud using Spark jobs (written in Python) with AWS Glue for non-SAP sources and Confluence file types.
  • Created end-to-end designs for applications hosted on AWS (S3 Raw -> S3 Cleansed/Processed -> Redshift target DB), building the domain granular layer. Created STTM documents for Cleansed-to-Redshift target tables using the transformation logic from the on-premises SQL. Worked extensively with the Cognizant accelerator framework (DIF tool) to move data from the raw to the cleansed layer, handling file formats including csv/txt/xls/xlsb/json/dat.
  • Worked extensively on analysis and mapping of SAP global tables (primarily ECC/SCM modules) to build the domain granular data model. Successfully migrated Master data and Sales data in the domain granular model (Material, Customer, Location, Foundation Reference, Sales Execution, Strategy, Performance, Order, etc.) into the AWS environment. Advised all tracks at Kellogg on fixing and debugging data-related issues. Worked with AWS Support to resolve Glue job failures caused by VPC/subnet configuration.
  • Managed a team of 20+ members in an Agile POD execution model.
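The multi-format ingestion into the cleansed layer described above amounts to dispatching each file to a format-specific parser that emits one normalized row shape. The sketch below is a pure-Python illustration; the real pipeline used Spark jobs with AWS Glue and the DIF tool, and the parser names here are placeholders covering only a subset of the formats.

```python
import csv, io, json

# Illustrative extension-to-parser dispatch: normalize csv/txt/json
# inputs into one cleansed representation (a list of dict rows).
# Placeholder logic only; the actual pipeline ran Spark on AWS Glue
# and also handled xls/xlsb/dat formats.

def parse_csv(text):
    return list(csv.DictReader(io.StringIO(text)))

def parse_json(text):
    data = json.loads(text)
    return data if isinstance(data, list) else [data]

PARSERS = {
    "csv": parse_csv,
    "txt": parse_csv,   # assume delimited text files
    "json": parse_json,
}

def to_cleansed(filename, text):
    # Route on file extension; unknown formats fail loudly so they
    # surface during discovery rather than corrupt the cleansed layer.
    ext = filename.rsplit(".", 1)[-1].lower()
    if ext not in PARSERS:
        raise ValueError(f"unsupported format: {ext}")
    return PARSERS[ext](text)

rows = to_cleansed("sales.csv", "region,sales\neast,10\nwest,4\n")
# rows == [{"region": "east", "sales": "10"}, {"region": "west", "sales": "4"}]
```

Keeping the output shape identical across parsers is what lets every downstream layer (Cleansed to Redshift) stay format-agnostic.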

Timeline

Manager- Projects
Cognizant Technology Solutions US Corp
2013.11 - Current (10 years & 10 months)
IT Analyst
Tata Consultancy Services LTD
2011.03 - 2013.11 (2 years & 8 months)
Consultant
Cap Gemini India Pvt LTD
2010.06 - 2011.03 (9 months)
Senior Subject Matter Expert
Amdocs
2010.01 - 2010.06 (5 months)
Software Engineer
Tech Mahindra Ltd (Formerly Satyam Computer Services Ltd)
2006.11 - 2010.01 (3 years & 2 months)
West Bengal University of Technology
Bachelor of Technology in Computer Science and Engineering
  • Google Cloud, Google Certified Professional Machine Learning Engineer, 2024
  • Databricks, Accredited Generative AI Fundamentals, 2024
  • Google Cloud, Google Cloud Architect, uZ4kLh, 2022
  • Google Cloud, Associate Cloud Engineer, N1D2vu, 2022
  • Amazon Web Services, AWS Certified Big Data - Specialty, 1PNEGGKC3FEQ1CW4, 2020
  • Teradata, Teradata Certified Master (V2R5), 2010