Dynamic software developer with extensive experience designing, developing, and deploying high-quality software solutions that meet and exceed client expectations. Skilled at collaborating within cross-functional teams to deliver impactful results in fast-paced environments under tight deadlines. Strong programming and problem-solving expertise, with a proven ability to adapt to evolving project requirements and apply technical skills to drive innovation. A reliable team player who communicates effectively and contributes to project success through technical depth and strategic insight.
Overview
13 years of professional experience
Work History
Senior Software Developer
SLG Innovations: Department of Child and Family Services
09.2024 - 09.2025
Developed and customized modules in Microsoft Dynamics 365 Customer Engagement, including Investigation, Case Management, and Facility areas for the Child Protection domain.
Configured Security Matrix, Teams, and Role-Based Access Control (RBAC) to ensure compliance with DCFS data and user access standards.
Created automation workflows using North52 formulas, Power Automate, and Dataverse integrations to streamline system processes.
Worked closely with Shared Services and Core teams to align Dynamics configurations with SACWIS/CCWIS system requirements.
Participated in PI planning, JIRA tracking, and testing cycles, supporting sprint deliverables and deployment activities.
Designed and developed ETL processes using Informatica, Talend, and Azure Data Factory for data migration and system integration between legacy SACWIS systems and Dynamics 365 Dataverse.
Built and optimized data pipelines for incremental and full data loads, ensuring accurate transformation and validation across environments (see the sketch below).
Implemented error handling, logging, and performance tuning techniques to improve ETL job reliability and efficiency.
Collaborated with data analysts and business teams to maintain data quality, referential integrity, and reporting accuracy in Snowflake and SQL Server environments.
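The incremental/full load pattern above, sketched in Python; illustrative only, with hypothetical object names (ETL_WATERMARKS, STAGE.CASES_RAW, DW.CASES) rather than actual DCFS tables:

```python
# Illustrative sketch only: watermark-driven incremental load with row-count
# validation. All object and credential names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",  # placeholders
    warehouse="ETL_WH", database="EXAMPLE_DB",
)
cur = conn.cursor()

# 1. Fetch the high-water mark recorded by the previous run.
cur.execute("SELECT last_ts FROM ETL_WATERMARKS WHERE table_name = 'CASES'")
last_ts = cur.fetchone()[0]

# 2. Merge only rows modified since the watermark (incremental load);
#    a full load would simply omit the filter.
cur.execute(
    """
    MERGE INTO DW.CASES t
    USING (SELECT * FROM STAGE.CASES_RAW WHERE modified_ts > %s) s
    ON t.case_id = s.case_id
    WHEN MATCHED THEN UPDATE SET t.status = s.status, t.modified_ts = s.modified_ts
    WHEN NOT MATCHED THEN INSERT (case_id, status, modified_ts)
        VALUES (s.case_id, s.status, s.modified_ts)
    """,
    (last_ts,),
)

# 3. Validate: post-watermark row counts must match between source and target.
cur.execute("SELECT COUNT(*) FROM STAGE.CASES_RAW WHERE modified_ts > %s", (last_ts,))
src_count = cur.fetchone()[0]
cur.execute("SELECT COUNT(*) FROM DW.CASES WHERE modified_ts > %s", (last_ts,))
tgt_count = cur.fetchone()[0]
assert src_count == tgt_count, f"validation failed: {src_count} != {tgt_count}"

# 4. Advance the watermark only after validation succeeds.
cur.execute(
    "UPDATE ETL_WATERMARKS SET last_ts = (SELECT MAX(modified_ts) FROM DW.CASES) "
    "WHERE table_name = 'CASES'"
)
conn.commit()
```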
Lead Software Engineer
Virtusa Consulting Services Private Ltd
01.2020 - 03.2024
Extracted and loaded data from AWS S3 into Snowflake using Talend and Python automation scripts (see the sketch below).
Developed ETL pipelines using AWS Glue for data cataloging and transformation, enabling downstream reporting and analytics.
Configured and managed AWS EC2 instances for hosting development and testing environments, ensuring optimized resource utilization.
Utilized AWS Lambda functions for lightweight data validation, automation, and event-driven processing.
Implemented AWS CloudWatch monitoring for job health checks, alerts, and performance metrics tracking.
Deployed secure data integration through AWS PrivateLink and VPC peering, ensuring compliance with enterprise data standards.
Leveraged AWS Redshift for analytical workloads, optimizing query performance and storage cost.
Coordinated cross-platform data flow between AWS, Azure, and on-prem systems, integrating Snowflake as a unified data layer.
Parsed mapping design specifications into straightforward ETL code.
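A minimal sketch of the S3-to-Snowflake load pattern from this role, assuming an external stage (@S3_STAGE) over the landing bucket; the bucket, prefix, stage, and table names are hypothetical placeholders:

```python
# Illustrative sketch only: land-and-copy pattern from S3 into Snowflake.
import boto3
import snowflake.connector

s3 = boto3.client("s3")
# Discover newly arrived files under the landing prefix.
resp = s3.list_objects_v2(Bucket="example-landing-bucket", Prefix="daily/")
keys = [obj["Key"] for obj in resp.get("Contents", [])]

conn = snowflake.connector.connect(account="...", user="...", password="***")
cur = conn.cursor()
for key in keys:
    # @S3_STAGE is assumed to be an external stage pointing at the bucket root.
    cur.execute(
        f"COPY INTO RAW.ORDERS FROM @S3_STAGE/{key} "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1) ON_ERROR = 'ABORT_STATEMENT'"
    )
conn.commit()
```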
ETL Tech Lead
Citi Group (Virtusa)
02.2021 - 10.2023
Developed Talend jobs, tested them, and migrated them to higher environments.
Designed and developed ETL processes using SAP Data Services to extract, transform, and load data from various sources into target systems and data warehouses.
Developed complex ETL processes in Ab Initio to extract, transform, and load data into Hive data warehouses.
Tuned Hive queries for improved performance by optimizing query execution plans and indexing strategies.
Created tables, views, secure views, user defined functions in Snowflake Cloud Data Warehouse.
Collaborated closely with data analysts and business stakeholders to gather requirements and ensure alignment with business objectives; implemented data quality checks and cleansing procedures.
Created comprehensive documentation for ETL processes, data models, and Hive queries for knowledge transfer and onboarding of new team members.
Utilized Snowflake features such as Snowpipe and native connectors for efficient data ingestion.
Provided expert consulting services in SAP Data Services, assisting clients in architecting, designing, and implementing ETL solutions tailored to their specific needs.
Experienced in end-to-end development of data warehousing projects, from requirement analysis, HLD/LLD preparation, and environment setup through design analysis, downstream analysis, coding, testing, documentation, and design optimization.
Team player and effective communicator with excellent relationship-building and interpersonal skills; coordinated with the team to build shared understanding and deliver better results.
Set up and configure development, testing, and production environments to support the organization’s data integration and analytics needs.
Set up Connect:Direct in SIT, UAT, PROD, and COB environments.
Tested connectivity between partner systems through NDM and SFTP.
Maintained desired working level of Ab Initio operational documentation and procedures.
Involved in creating AutoSys jobs.
Troubleshot and supported production issues.
Set up the CyberArk password module in all environments.
Manage user access controls, authentication mechanisms, encryption, and auditing to safeguard Ab Initio metadata and configurations.
Able to deliver projects within stipulated timelines.
Provided technical support to STE and SIT teams, helping to achieve project deadlines.
Utilized Snowflake's user-defined functions (UDFs) and stored procedures for complex transformations (see the sketch below).
Coded and designed graphs with optimal components, in compliance with coding standards.
Troubleshot and resolved technical issues related to Talend installations and configurations.
Installed and configured Talend software and its components.
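A minimal sketch of the Snowflake UDF and secure-view work noted above; the object names and masking logic are illustrative assumptions, not actual Citi artifacts:

```python
# Illustrative sketch only: a SQL UDF plus a secure view of the kind
# described in this role. All names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(account="...", user="...", password="***")
cur = conn.cursor()

# Reusable transformation packaged as a SQL UDF.
cur.execute("""
    CREATE OR REPLACE FUNCTION DW.MASK_SSN(ssn STRING)
    RETURNS STRING
    AS $$ CONCAT('XXX-XX-', RIGHT(ssn, 4)) $$
""")

# A secure view applies the UDF so consumers never see raw values,
# and hides the view definition from non-owners.
cur.execute("""
    CREATE OR REPLACE SECURE VIEW DW.V_CUSTOMERS AS
    SELECT customer_id, DW.MASK_SSN(ssn) AS ssn_masked
    FROM RAW.CUSTOMERS
""")
```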
Ab Initio Developer
Citi Group (Virtusa)
01.2020 - 02.2021
Played a pivotal role in the success of data warehousing and ETL projects through a comprehensive skill set and proactive approach.
Contributed to High-Level Design (HLD) preparation and participated in application dependency check meetings to ensure seamless integration of new features.
Delivered knowledge transfer (KT) sessions to the production team, enabling smooth adoption of new functionalities.
Created Mapping Design Documents (MDD) and Technical Design Documents (TDD) to guide implementations.
Experienced in release activities and rapid deployment.
Interfaced with Ab Initio Support; responsible for the installation, configuration, and maintenance of all Ab Initio technologies.
Prepared custom scripts to generate psets/DMLs, reducing overall development effort and enforcing standardization.
Fine-tuned Snowflake configurations, such as warehouse sizes, clustering keys, and storage options, to enhance ETL performance.
Performed ETL application development, ensuring that system and integration test plans were developed and executed.
Extracted and loaded CSV and JSON file data from AWS S3 into the Snowflake Cloud Data Warehouse.
Actively promoted and participated in process improvement; served on the architecture review team and assisted in developing processes and applications.
Optimized Spark jobs for performance and scalability, considering factors such as partitioning, caching, and data locality (see the sketch below).
Queried and analyzed data stored in Snowflake to derive insights and support data-driven decisions.
Developed and executed SQL queries, analytical functions, and data manipulation operations.
Built and maintained data models and visualizations for reporting and analytics.
Collaborated with stakeholders to understand business requirements and deliver actionable insights.
Collaborated with data scientists and analysts to understand data requirements and implement data processing logic accordingly.
Monitored job execution and troubleshot issues related to data quality, performance, and resource utilization.
Executed effective processes for administration and maintenance of Ab Initio Key Server.
Managed users, roles, permissions, and access controls.
Configured connections to data sources and targets.
Automated deployment, monitoring, and management of PySpark applications using CI/CD pipelines and orchestration tools.
Checked projects/objects in and out of the source code control system; promoted source code between environments (DEVL, ACPT, PROD).
Installed Ab Initio software, upgraded the Co>Operating System, installed the key bundle and key files, and set up the help server.
Provided critical thinking to develop creative and innovative solutions to business problems.
Worked in an onsite/offshore team model and guided other ETL developers.
Knowledge of Oracle databases, the UNIX environment, and UNIX shell scripts.
Environment: Ab Initio (GDE V3.0.4.2, GDE V4.0.2.0), Co-Op V3.0.2, Snowflake, Oracle 12c to 19c, Unix, Korn Shell scripting, Ab Initio Control Center, SQL, PL/SQL, XML, XSD, AutoSys, Jira, RLM, Python.
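A minimal PySpark sketch of the tuning levers noted in this role (partitioning, caching); the paths and column names are hypothetical placeholders:

```python
# Illustrative sketch only: common Spark tuning levers.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-tuning-sketch").getOrCreate()

df = spark.read.parquet("s3://example-bucket/transactions/")

# Repartition on the join/aggregation key so shuffles line up downstream.
df = df.repartition(200, "account_id")

# Cache a dataset reused by multiple actions to avoid recomputation.
df.cache()

daily = df.groupBy("account_id", F.to_date("event_ts").alias("day")).count()
daily.write.mode("overwrite").partitionBy("day").parquet("s3://example-bucket/daily/")
```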
Senior Software Engineer
Donato Technologies
12.2018 - 12.2019
Interacted with marketing and business stakeholders, gathered requirements, and analyzed and managed delivery across the entire data warehouse development lifecycle.
Assisted in installation and configuration of Ab Initio software including patches and server consolidation processes.
Used components of Ab Initio to extract, transform and load data from multiple data sources like Flat files, XMLs etc. to target data marts.
Handled the configuration, management, and monitoring of Azure Data Factory resources, including access control and resource optimization.
Defined requirements and leveraged data insights from Azure Data Factory to guide informed business decisions and offer strategic feedback.
Cleansed the data using various Ab Initio components like Join, Rollup, Dedup-Sort, Scan, Filter by Expression, Gather and Merge.
Monitored system performance, diagnosed and troubleshot issues, and implemented solutions to ensure the availability, reliability, and performance of Ab Initio infrastructure.
Leveraged Snowflake's SQL capabilities to perform complex transformations and calculations efficiently.
Wrote Unix scripts within Ab Initio graphs.
Well versed in the sandbox concept and the EME check-in/checkout process.
Modified Ab Initio component parameters and introduced phases and checkpoints to avoid deadlocks, utilizing data parallelism to improve overall performance and fine-tune execution times.
Designed and Built Ab Initio Graphs for unloading data from different source systems.
Good experience with DML utilities such as m_db and cobol-to-dml.
Migrated Oracle database table data into the Snowflake Cloud Data Warehouse.
Extensively used EME for Version control and Code Promotion.
Involved in tuning of SQL queries to improve performance of the Graphs.
Tested Ab initio graphs in development and Test environments using test data, fine tuning the graphs for better performance and migrated them to the Production environment.
Extensively used the Teradata utilities like BTEQ, Fast load, Multiload, DDL Commands and DML Commands (SQL).
Designed enterprise-level applications using the Snowflake platform.
Helped entry-level resources understand the workflow and onboard into the team.
Developed Ab Initio Graphs with complex transformation rules through GDE.
Developed Complex Ab Initio XFRs to derive new fields and solve various business requirements.
Developed several Unix wrapper scripts to schedule Ab Initio graphs and implemented restartability.
Develop and implement backup and recovery procedures to safeguard Ab Initio metadata and configurations, ensuring business continuity in case of system failures or disasters.
Test backup and recovery processes regularly to verify their effectiveness and reliability.
Conducted audits and assessments of the IICS environment for compliance.
Reviewed access controls, permissions, and security configurations.
Analyzed system logs, user activity, and data usage patterns.
Identified risks, vulnerabilities, and areas for improvement.
Implemented ETL pipelines within and outside the data warehouse using Python and Snowflake's SnowSQL (see the migration sketch below).
Wrote Unix scripts to automate Ab Initio processes.
Created test data to help the QA team test Ab Initio graphs.
Wrote various DML and DBC files. Environment: Ab Initio (GDE V3.0.4.2), Co-Op V3.0.3, Snowflake, Oracle 10g, Teradata, Unix, Korn Shell Scripting, Control-M, Python.
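One plausible shape of the Oracle-to-Snowflake table migration described in this role; the oracledb driver and write_pandas helper are assumptions, and all connection details and table names are hypothetical placeholders:

```python
# Illustrative sketch only: move an Oracle table into Snowflake in one pass.
import oracledb
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

ora = oracledb.connect(user="...", password="***", dsn="orahost/ORCLPDB1")
df = pd.read_sql("SELECT * FROM APP.CUSTOMERS", ora)  # extract source table

sf = snowflake.connector.connect(account="...", user="...", password="***",
                                 database="DW", schema="MIGRATED")
# write_pandas bulk-loads the DataFrame through an internal stage.
write_pandas(sf, df, table_name="CUSTOMERS", auto_create_table=True)
```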
Software Engineer
VSNL International, INC
09.2016 - 11.2018
Involved in high-level and detailed-level design.
Created mappings, transformations, and workflows using IICS tools.
Implemented data cleansing, transformation, and enrichment logic.
Collaborated with data stewards to ensure data quality and governance.
Extracted data from various sources like databases, delimited flat files and XMLs.
Worked on continuous graphs in one module, consuming Ab Initio queues fed from a TIBCO source and processing them through JMS Subscribe and JMS Publish.
Developed various Ab Initio graphs based on business requirements using a range of components.
Involved in decreasing application run times significantly by using performance tuning procedures of Ab initio.
Used Ab Initio Plan>It to manage job dependency workflows.
Tested Ab Initio graphs in development and migration environments using test data, fine-tuned them for better performance, and migrated them to the production environment.
Extensively used lookups in place of joins to increase graph performance.
Extensively used air commands for Ab Initio code migrations, check-ins, and checkouts.
Wrote shell scripts to schedule Ab Initio graphs by supplying parameters (a wrapper sketch follows below).
Extensively used Ab Initio Parallelism feature of Component, Data and Pipeline parallelism.
Reviewed test case results with the team and reported them to the business.
Tested graphs once development was complete, preparing unit test case and code review documents.
Performed source data profiling and analysis; reviewing data content and metadata facilitated data mapping and validated assumptions made in the business requirements.
Used Partition components like partition by expression, partition by key, etc., to run the middle layer processing parallel.
Effectively used the multifile system (MFS) to execute graphs in parallel.
Extensively worked in the UNIX environment using Shell Scripts. Environment: Ab Initio (GDE 2.14, Co>Op 1.14), Oracle 10g, SQL, Linux, HTML, Python.
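A minimal sketch of the graph-scheduling wrapper described in this role, shown in Python rather than shell for consistency; the deployed script path and parameter convention are assumptions:

```python
# Illustrative sketch only: run a deployed Ab Initio graph (a .ksh script)
# with a parameter and surface failures to the scheduler.
import subprocess
import sys

def run_graph(graph_script: str, run_date: str) -> int:
    result = subprocess.run(
        [graph_script, run_date],  # deployed graphs take CLI parameters
        capture_output=True, text=True,
    )
    if result.returncode != 0:
        # A non-zero exit lets the scheduler alert and supports restartability.
        print(result.stderr, file=sys.stderr)
    return result.returncode

if __name__ == "__main__":
    sys.exit(run_graph("/apps/abinitio/run/load_customers.ksh", "2018-01-31"))
```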
Sonata Software Corporation
02.2013 - 08.2015
Involved in the meeting with Business Analysts to finalize the requirements and documented them.
Created Design documents for the implementation of the project.
Responsible for Performance-tuning of Ab Initio graphs.
Used command-line check-ins and checkouts.
Used the Ab Initio Web Interface to Navigate the EME to view graphs, files and datasets and examine the dependencies among objects.
Used a top-down approach for the creation of the data marts.
Tested Ab Initio graphs in development and migration environments using test data, fine-tuned them for better performance, and migrated them to the production environment.
Created Ab Initio Scripts to Extract, Transform and Load data from various source systems like flat files, XML files etc. to Oracle Database.
Involved in development of complex and efficient SQL scripts in the process of debugging various Ab Initio graphs.
Performed Lookups, lookup local, In-Memory Joins and rollups to speed up various Ab Initio Graphs.
Utilized Oracle Database in the development of reports for the business users.
Led a small team of three to develop and deliver an efficient application using Ab Initio architecture.
Used various multistage and single-stage components such as Rollup, Normalize, Dedup Sort, and Reformat to create graphs.
Involved in unit testing the application and deployment.
Experienced in writing JIL scripts to submit AutoSys jobs (sketched below).
Involved in Design documentation of the whole project. Environment: Ab Initio (GDE 1.14.30), Co-Op 2.14.74, Oracle 10g, Unix, Linux, AutoSys, HTML.
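A minimal sketch of JIL generation for the AutoSys jobs noted above; the job name, machine, owner, and paths are hypothetical placeholders, and only a small common attribute set is shown:

```python
# Illustrative sketch only: emit an AutoSys JIL definition for an ETL wrapper.
JIL_TEMPLATE = """\
insert_job: {job}
job_type: cmd
command: {command}
machine: {machine}
owner: etluser
std_out_file: /logs/{job}.out
std_err_file: /logs/{job}.err
"""

def make_jil(job: str, command: str, machine: str) -> str:
    return JIL_TEMPLATE.format(job=job, command=command, machine=machine)

if __name__ == "__main__":
    print(make_jil("LOAD_CUSTOMERS_DLY", "/apps/run/load_customers.ksh", "etl-host-01"))
```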
Education
Master of Science - Computer and Information Systems
California University of Management and Sciences
Anaheim, CA
01.2017
Bachelor of Technology - Information Technology
St. Peter Engineering College
Hyderabad, India
01.2012
Skills
Microservices architecture
API integration
Object-oriented programming
Design patterns
Code fixes
Application development
Software architecture design
Application design
Performance tuning
Testing and debugging
Software development
Problem-solving
Agile development methodologies
Technical support
Technical consulting
Cybersecurity best practices
Application security
Database optimization
Automated testing
Agile software development
Languages
Hindi
Full Professional
English
Full Professional
Telugu
Native or Bilingual
Timeline
Senior Software Developer
SLG Innovations: Department of Child and Family Services
09.2024 - 09.2025
ETL Tech Lead
Citi Group (Virtusa)
02.2021 - 10.2023
Lead Software Engineer
Virtusa Consulting Services Private Ltd
01.2020 - 03.2024
Ab Initio Developer
Citi Group (Virtusa)
01.2020 - 02.2021
Senior Software Engineer
Donato Technologies
12.2018 - 12.2019
Software Engineer
VSNL International, INC
09.2016 - 11.2018
Sonata Software Corporation
02.2013 - 08.2015
Bachelor of Technology - Information Technology
St. Peter Engineering College
Master of Science - Computer and Information Systems