Kalpana is a results-driven Lead Technical Architect and Zero Trust Cyber Security Lead at the Department of Homeland Security, Washington, DC. With 27 years of experience in federal government and federal contractor IT roles, she specializes in cloud engineering, big data, analytics, generative AI, and Zero Trust solutions. Kalpana has received multiple awards for quality service and high performance. Previously, she spent 25 years as a federal contractor for organizations including Deloitte & Touche LLP, L3 NSS, CACI, ManTech International, GIS Federal, Northrop Grumman, and SAIC (Leidos), serving federal agencies that include the U.S. Bureau of the Census; the National Cancer Institute, Frederick, MD; the Dept. of Education, Washington, DC; the Department of Defense, including the U.S. Army (Alexandria and Arlington, VA), U.S. Navy (Quantico and McLean, VA), DISA (Fort Meade, MD), and the Defense Health Agency (Arlington, VA); and the U.S. Dept. of Homeland Security, Washington, DC.
Overview
27 years of professional experience
1 Certification
Work History
Lead Technical Architect and Zero Trust Cyber Lead
US Dept of Homeland Security, Washington, DC
01.2023 - Current
Delivered technical briefings with essential technical feedback, and prepared analyses of alternatives, cost estimates, and demonstrations to support strategic decision-making for the management team and the Chief Data Officer (CDO) leadership team responsible for technology decisions. Recommended technology cost reductions of up to $500K by identifying underutilized technologies and directing efforts toward technology modernization, advancing the maturity of the environment within the planned budget while prioritizing the needs of mission users in data strategy initiatives.
Architected and implemented the Zero Trust framework across twenty-one products backed by AWS, Azure, and on-premises system architectures, leading a team of contractors and developers, managing and completing agile tasks, and establishing metrics for tracking progress.
Served as an architect and led the design, development, and deployment of AWS services, including AWS Glue, AWS OpenSearch, Step Functions, and API Gateway, enabling real-time data pipelines for critical missions and achieving significant cost savings.
Leveraged JIRA, Confluence, and SAFe Agile principles to automate workflows, streamline communication, and enhance team collaboration. Designed and developed 650 tickets across thirty-one essential Zero Trust technical capabilities spanning seven layers of the Zero Trust Architecture, enabling efficient task management and fostering alignment across teams.
Fostered productive collaboration and served as a technical liaison to multidisciplinary teams, including Data Science, Engineering, Cloud, stakeholder, vendor, and DHS Components community teams, ensuring seamless solution integration and timely execution.
Analyzed various Gen AI solutions, such as Claude 3.7 Sonnet on AWS Bedrock, Named Entity Recognition (NER), and Retrieval-Augmented Generation (RAG), that provide user-centric analytic search, text generation, and summarization based on user activity and/or query terms to support project use cases.
Achieved $1.89K in cloud cost savings by eliminating redundancies and transitioning to a two-environment deployment model while maintaining high security standards.
Excelled in optimizing resources in the federal government to achieve the best outcomes for customers.
Led API Gateway integration and comprehensive system testing for large datasets, driving process improvements, improving AWS ETL data pipeline performance, estimating cloud costs, and identifying software defects while optimizing performance.
Established auditing and logging mechanisms using AWS CloudWatch, providing valuable insights into system behavior, and ensuring issue resolution and enhancing operational reliability.
Developed GitLab-based CI/CD pipelines and infrastructure as code (IaC) with Terraform, reducing manual effort, improving software delivery efficiency, and enabling secure code management.
Built robust logs to trace user activity and performance metrics, enabling better system monitoring and early issue detection.
Successfully deployed AWS OpenSearch, Glue, Step Functions, and API Gateway services in production environments, enabling real-time data ingestion and search capabilities for mission critical requirements.
Full time 40 Hrs./Week
Designed scalable architecture solutions to enhance system performance and security compliance.
Collaborated with cross-functional teams to integrate advanced technologies into existing systems.
Led architecture review sessions, ensuring alignment with organizational goals and best practices.
Developed comprehensive technical documentation for system designs and implementation processes.
Evaluated emerging technologies, providing strategic recommendations for future initiatives.
Spearheaded process improvements that streamlined project delivery timelines and reduced risks.
Performed quality code review and removed technical debt and security vulnerabilities.
Provided technical leadership to team members during system design.
Contributed innovative ideas during brainstorming sessions that led to the successful execution of key initiatives.
Provided current best practices and third-party solution alternatives when necessary for functional design documentation.
Collaborated with clients to determine project specifications and scope.
Developed network and system architecture according to business needs.
Improved business productivity for clients by 85% by re-engineering and designing infrastructures.
Engineered architecture and infrastructure for 300+ users.
Collaborated with cross-functional teams to design and implement effective software solutions, resulting in increased customer satisfaction.
Unified disparate systems through careful implementation of APIs, promoting interoperability across platforms.
Proactively monitored system performance trends to anticipate potential issues before they escalated into larger problems.
Provided 2nd and 3rd level technical support and troubleshooting to internal and external clients.
Wrote and maintained custom scripts to increase system efficiency and performance time.
Designed and implemented system security and data assurance.
Project Delivery Manager II
Deloitte & Touche LLP
10.2019 - 12.2023
Designed and delivered a secure Collibra Data Governance Product Architecture, achieving compliance with stringent cybersecurity standards and passing security risk assessments within challenging project timelines.
Architected and deployed MongoDB replica set systems, optimizing document-type data migration and enhancing data accessibility in high-security environments, and evaluated Node.js APIs for data tasks.
Coordinated Zero Trust approaches, protecting essential resources through sophisticated PKI deployments and credential management utilities such as OpenSSL, keytool, and MMC, and mitigated risks by resolving and remediating STIG control findings.
Evaluated and recommended Kubernetes tools, such as Helm and Pulumi, optimizing multi-cloud operational efficiency and enabling seamless containerization.
Developed Collibra workflows to visualize end-to-end data management and data lineage, deploying innovative solutions to improve governance and productivity.
Fostered cross-disciplinary collaboration, bringing together DHS teams, vendors, and engineers to resolve integration challenges and drive impactful results.
Authored comprehensive deployment guides for Collibra and MongoDB, ensuring clarity and usability for future implementations.
Applied Zero Trust principles and secured sensitive systems for the user pillar by implementing PKI, certificate management tools, the Collibra Data Governance platform, and MongoDB architectures.
Applied advanced knowledge in configuring Single Sign-On frameworks with SAML 2.0 for enterprise-level systems such as Collibra.
Applied cloud portability solutions using tools like Helm and Pulumi for multi-cloud environments.
Performed deep dives into cloud-native architecture and API specifications to enhance infrastructure performance and data analytics capabilities.
Developed technical briefings and cost analyses for 40+ AWS technologies, enabling informed decisions on data lake solutions, multi-cloud compatibility, and visualization tools. Created comprehensive installation, configuration, and workflow documentation to support system deployment and user training.
Led and coordinated efforts across diverse teams to achieve project milestones efficiently.
Delivered mission-critical solutions while adhering to aggressive deadlines and regulatory compliance.
Achieved data accessibility and visualization through modular dashboard designs and reliable data migration strategies.
Achieved operational excellence and cost savings in cloud deployments by eliminating redundancies and optimizing technical workflows.
Developed and delivered a Python-based application to automate SIMP installation and configuration, establishing a fully operational Puppet Master and configuring clusters to orchestrate a suite of vital services, such as Zookeeper, HDFS, Kafka, Accumulo, Elasticsearch, Kibana, Spark, Storm, Kronos bulk ingest servers, and the RDA deployer. These advanced configurations enhanced data processing efficiency, reinforcing platform scalability and operational efficiency. Performed deep dives into the Big Data Platform architecture to explain the latest developments of the native BDP architecture, including R packages, various network APIs, and datastore interaction.
Developed R Shiny interactive dashboards and modular code, improving data visualization and search capabilities and enabling actionable insights from complex datasets.
Migrated Python scripts to Ansible playbooks, facilitating configuration automation and ensuring seamless integration of over 35 services on the Big Data Platform, reducing manual effort and accelerating service deployment.
Provided technical support and consulting for the automated cloud infrastructure, resulting in a scalable and efficient platform tailored to meet diverse mission-specific requirements.
Designed Docusaurus website-based coding lessons that empowered data scientists, analysts, and software engineers to understand Hadoop-based architecture and advanced R APIs, fostering skill development and innovation.
Researched and documented critical differences between Python 2 and Python 3, identifying security implications and implementing robust solutions to enhance system integrity.
Delivered impactful client-facing demonstrations covering system architecture, code architecture, and code deployment, ensuring clear communication of development features in an agile environment.
Developed data flow diagrams and analyses for systems such as Kronos and RDA, facilitating efficient bulk data processing to meet operational demands.
Investigated and documented Kubernetes containerization scripts for native cloud infrastructure, contributing to the modernization and scalability of Big Data Platform architecture.
Transitioned Python scripts to Ansible playbooks, automating critical configuration tasks including SIMP-required setups, bootstrapping, and seamless service integration. This automation streamlined the deployment of over 35 services on the Big Data Platform, incorporating RBAC automation to dynamically assign roles essential for robust cloud infrastructure.
Full time 40 Hrs./Week
Client: US Army, Arlington VA
Lead Software Engineer /Hadoop Engineer
CACI
10.2015 - 09.2019
Leveraged dynamic schema designs and hierarchical partitioning techniques within Teradata Massively Parallel Processing (MPP) environments to improve data distribution and support seamless periodic updates.
Delivered advanced technical support and consulting services, integrating Teradata big data technologies such as Teradata, Aster R, and the Hortonworks Hadoop Data Platform, and leveraging Informatica BDM 10.2.2 to streamline data processing and analytics for high-impact decision-making.
Executed robust R programming for decision tree analysis and regression modeling, providing predictive insights and actionable intelligence derived from complex datasets. Leveraged SPARK and Apache Drill on MAPR for real-time analytics on diverse file formats such as JSON, Parquet, and MAPR DB.
Designed and developed Informatica mappings on Informatica Big Data Management Platform 10.2, facilitating efficient ingestion of data from diverse sources such as flat files, relational databases, and Hadoop systems into Teradata Databases.
Architected object-oriented models using MVC principles, ensuring precise and scalable data visualization within a massively parallel processing (MPP) environment.
Delivered impactful demonstrations on system architecture and code deployment in an agile environment, ensuring clear communication of development features.
Designed disaster recovery protocols and implemented data partitioning using Teradata partition techniques, ensuring optimal data distribution and resilience against logical volume failures.
Provided precise configuration and tuning for Teradata, AsterR, and Hortonworks Hadoop MPP Big Data platforms.
Developed UML diagrams, System Design Documents, and comprehensive System Design Reviews for development and architectural teams.
Created PySpark code for generating data frames, executed in the Zeppelin environment, with robust schema design.
Elicited requirements from stakeholders through strategic questioning to shape innovative system architectures and designs.
Full time 40 Hrs./Week
Client: Defense Health Agency
Senior Software/Hadoop Engineer
ManTech International Corporation
03.2012 - 10.2015
Served as a key member in establishing a high-performance distributed search architecture using a SolrCloud prototype, enhancing fault tolerance, performance, and search efficiency in data-driven analytical applications.
Designed and implemented RESTful APIs for SolrCloud NoSQL databases, leveraging Java programming to seamlessly integrate JSON data models for robust web interactions.
Architected ETL processes utilizing Hadoop, Zookeeper, and Accumulo to ingest and process disparate datasets, enabling scalable data integration in cloud-based systems.
Executed full-stack development in MVC web frameworks with Ext JS 4, JavaScript, HTML, and CSS, delivering dynamic and interactive UI components.
Developed RESTful APIs for SolrCloud migration, highlighting improved data ingestion and query performance.
Automated HDFS data ingestion and ETL operations via shell scripting, streamlining data workflows for distributed systems.
Designed disaster recovery protocols addressing Logical Volume Failures, ensuring system resilience and uninterrupted data operations.
Implemented a private Maven repository for automated code builds and release management using Tortoise Git Shell.
Established a Puppet Enterprise 3.5 Prototype for infrastructure management, integrating Host-Only and Bridged Networking concepts for efficient system setups.
Full time 40 Hrs./Week
Client: DOD (Navy)
Senior Software Engineer
GIS Federal
03.2012 - 09.2012
Delivered scalable widget software leveraging Ext JS 4, HTML5, RESTful web services, and JSON Store for advanced data analysis in supercomputing environments.
Prototyped and analyzed MapReduce jobs within Hadoop ecosystems, utilizing HDFS for input/output operations and ensuring seamless integration with cloud-based NOSQL databases.
Conducted intensive troubleshooting and recovery operations for virtual machine crashes, enabling rapid cloud instance recreation and database restoration.
Established automated release management workflows using Subversion and Hudson, enhancing continuous integration, and minimizing deployment errors.
Implemented shell scripting for efficient HDFS data transfers and ETL processes, ensuring high performance and reliability in distributed file systems.
Full time 40 Hrs./Week
Client: DOD, US Army, Fort Belvoir VA
Software Engineer
Northrop Grumman
12.2006 - 03.2012
Coordinated meetings with government functional managers for requirement analysis and synthesis of software functional requirements, and documented the related requirements with respect to use cases.
Successfully implemented J2EE, J2SE, MVC, and JSF frameworks for data search web development processes, improving system responsiveness.
Designed normalized database structures, DAO and DTO objects, and Hibernate annotations, ensuring data integrity and facilitating efficient object-relational mapping.
Led the migration of J2EE applications from Tomcat to WebLogic 10.3, resolving intricate clustering and configuration challenges, while achieving improved scalability and performance.
Automated Web Service testing using JUnit 4.x, reducing manual errors, and enhancing the accuracy of regression tests.
Utilized Lucene APIs for data indexing and querying, enhancing search capabilities and optimizing data accessibility within complex systems.
Full time 40 Hrs./Week
Client: DOD, US Army, Arlington, VA
Senior Java, Web, and Database Developer
SAIC
11.1998 - 12.2006
Designed and optimized database schemas to enhance performance and storage efficiency.
Developed complex SQL queries for data retrieval, reporting, and analysis tasks.
Implemented data migration processes to ensure seamless transitions during system upgrades.
Collaborated with cross-functional teams to define database requirements and specifications.
Education
MS - Computer Science
Johns Hopkins University
Bachelor of Science - CMIS
University of Maryland University College
USA
08-1998
Bachelor of Science - Computer Management, Information Science
University of Gulbarga
India
08-1984
Skills
Leadership: Federal Systems Architecture and Design Lead, Federal Zero Trust Cyber Security Mission Lead, Data Enablement Lead, Lead for GenAI implementation in development
Technical Expertise: AWS Cloud Serverless Architectures, MS Azure, Big Data Architectures, ETL Solutions, AI/ML including Generative AI and LLMs, Graph Databases, Cloud-Native and Serverless Product Architectures, Lambda, AWS Glue, SNS, OpenSearch, Kinesis Firehose, Redshift, SageMaker, Aurora RDS, AWS Bedrock, AWS Neptune, API Gateway, Analysis of Alternatives for technical architectures, Teradata, MongoDB, PostgreSQL, Hadoop, Accumulo, Cloudera Data Platform, NiFi, DynamoDB, Informatica Big Data Platform, Data Engineering with PySpark, Databricks Platform
Cybersecurity: Zero Trust, Cell-Level Security, API Security, Cloud security for user, device, network, data, and application workload layers, Security+ CE Certification, IDP Integration, LDAP Integration, Role-Based Access Control, Attribute-Based Access Control, Public Key Infrastructure, SAML, Immuta product research
Cloud and Data Platforms: AWS Glue, AWS OpenSearch, Redshift, SageMaker, Aurora RDS, Databricks, AWS Lake Formation, Data Lake Architectures, Vector Databases, Graph Databases, AWS Bedrock
Agile Project Management: Atlassian JIRA, Confluence, Use Cases, Mission Need Analysis, Requirements Traceability Matrix, SAFe Program Increment Planning, Agile Planned Value, Business Value Assessments, Task Prioritization, Cloud Cost Estimates and Budgeting
Programming Languages and Tools: Python, PySpark, Node.js, Java, YAML, R, SQL, JavaScript, Ext JS, Linux
Data Formats: JSON, XML, Parquet, Avro, CSV, PNG, PDF, structured and unstructured datasets
Data Migration & Modernization: Legacy Architectural Migration, Data Migration from SQL to NoSQL, Elasticsearch and AWS OpenSearch, MSSQL to Teradata, On-Premises to AWS Cloud
Data Analytics & AI: AWS Bedrock, Optical Character Recognition (OCR), AI/ML Solutions with SageMaker, Predictive Analytics with ASTER-R, Observability, CloudWatch
DevSecOps: Terraform, Ansible, Pulumi, Kubernetes, Docker, Infrastructure as Code, IAC, Linux Shell scripts
Data Science and Gen AI: KNN Algorithms, Decision Trees, Bayesian Methods, AWS Neptune, AWS Bedrock, Chatbots, Copilot
Other Tools and Frameworks: GitLab, Puppet, Informatica Data Catalog, RESTful APIs, Apache Hadoop Ecosystem (Hive, YARN, MapReduce), Collibra Governance Platform, J2EE Architectures, Well-Architected Framework, Swagger UI, Oracle
Federal Policies and Frameworks: Office of Management and Budget (OMB) guidance on Responsible AI and Use of Artificial Intelligence, National Security Memorandum (NSM-8), NIST 800-207, Information Sharing Environment Technology Program Directives
Technical Documentation: Troubleshooting guides, Standard Operating Procedures, Configuration guides, API documentation, OpenSearch interoperability documentation, automated workflows and flowcharts
Certification
Security+ 501 Certified, CompTIA, 2029-01
Linux+ Certified, CompTIA/LPI
MCHA (MapR Certified Hadoop Administrator)
AWARDS
'Above and Beyond' Service Award 2025 recognizing exemplary contributions and performance, Dept. of Homeland Security; 'Quality of Service' Award 2024 for being a model of efficient service standards and optimizing resources; Performance and Shout-Out Award 2022, Deloitte; 3-Year Dedicated Service Award, ManTech International Corporation; 5-Year Service and Excellence Award, Northrop Grumman; Outstanding Performance Award and SAIC Service Award
SPECIALIZED TRAINING
Attended the Gartner Data and Analytics Summit in 2023 and 2024, receiving training and insights on leadership, artificial intelligence, and data and analytics from industry leaders. https://www.gartner.com/en/conferences/na/data-analytics-us
Received Training for AWS AI/ML Specialty track courses
Completed SAIC offered Systems Engineering Course work
LIMS Conference, April 25-27, 2001: Attended the Laboratory Information Management System Conference organized by the International Institute of Research at Caesars Hotel, Las Vegas. The conference addressed the business, technological, and regulatory issues of LIMS systems.
MuleSoft SOA and Cloud Computing Conference, New York, September 13, 2011
Participated in Federal Big Data Summit 2014 http://www.fedsummits.com/big-data/
Fusion Development Experience Conference organized by Oracle Corporation, February 2009
Received training in preparation for the MCSE, June 1999 to November 1999 (NetCenter Education Center, Virginia), and passed the Windows NT Workstation 4.0 exam (Sylvan Prometric) from Microsoft.
Core Competencies
Excelled in technical proficiency, critical thinking, written and oral communication, customer service, teamwork and collaboration, assigning and monitoring work, and problem solving.
Timeline
Lead Technical Architect and Zero Trust Cyber Lead
US Dept of Homeland Security, Washington, DC
01.2023 - Current
Project Delivery Manager II
Deloitte & Touche LLP
10.2019 - 12.2023
Lead Software Engineer /Hadoop Engineer
CACI
10.2015 - 09.2019
Senior Software/Hadoop Engineer
ManTech International Corporation
03.2012 - 10.2015
Senior Software Engineer
GIS Federal
03.2012 - 09.2012
Software Engineer
Northrop Grumman
12.2006 - 03.2012
Senior Java, Web, and Database Developer
SAIC
11.1998 - 12.2006
MS - Computer Science
Johns Hopkins University
Bachelor of Science - CMIS
University of Maryland University College
Bachelor of Science - Computer Management, Information Science
Program Analyst at Dept. of Homeland Security | Office of the Chief Security Officer