
Sindhu Reddy

Software Engineer

Summary

  • Professional with 10+ years of experience as a Software Engineer and AI/Web Application Developer, with analytical programming in Python, Django, Flask, AWS, SQL, GenAI, and ML.
  • Proficient in Python, with expertise in libraries such as NumPy, SciPy, and Pandas.
  • Developed RESTful APIs, JSON-based services, and automation scripts using PowerShell scripting.
  • Extensive experience with AWS and Azure cloud services, including EC2 instance provisioning, VPC, RDS, S3, IAM, and Lambda.
  • Expertise in CI/CD pipelines using Jenkins, Hudson, and Azure DevOps; hands-on with Terraform, CloudFormation, and IaC practices.
  • Managed source control using Git, GitHub, and Bitbucket.
  • Proficient in PySpark, BigQuery, and other data integration/ETL tools.
  • Developed automated regulatory reporting solutions using Python, streamlining data extraction, validation, and report generation processes.
  • Developed data-driven applications using Axiom in Python for efficient data extraction, transformation, and analysis.
  • Customized Jupyter Notebook environments to meet project-specific needs, installing and configuring plugins, kernels, and themes; managed notebook version control and backup strategies, ensuring data integrity and recoverability.
  • Deployed and managed containerized applications using EKS, ensuring high availability and scalability.
  • Implemented LLM models using popular libraries such as Hugging Face Transformers, PyTorch, and TensorFlow.
  • Experience with configuration management tools such as Puppet, Chef, and Ansible, and orchestration using Jenkins and Rundeck.
  • Skilled in both NoSQL and relational databases, including DynamoDB, AWS RDS, Oracle, SQLite, PostgreSQL, and MySQL; experienced in SQL modifications and database migrations.
  • Strong background in Agile, Scrum, and Python-based development with a focus on data analytics.
  • Expertise in developing consumer-facing features and collaborating with cross-functional teams to ensure project success.
  • Strong written and verbal communication skills, adept at engaging with diverse stakeholders, and integrating development processes with design and automation.

Overview

10+ years of professional experience

Work History

Sr. Software Engineer

Capital One
McLean, Virginia
05.2024 - 10.2024
  • Technical Environment: Python 3.x, Java 8, AWS S3, Lambda, API Gateway, RESTful APIs, Jenkins, Docker, Web Services, Splunk, CloudWatch, CloudFormation, Jupyter Notebook, EC2, SDK, Kubernetes, ML/AI, Kinesis, Regulatory reporting, Oracle, Axiom, RDS, DynamoDB, pytest, asyncio, ATDD, LLM, BDD, Microservices, REST, PyCharm, VSCode, IntelliJ, GitHub, ServiceNow, Pydantic, Basel
  • Responsibilities:
  • Responsible for new feature development, end-to-end testing, and maintenance of existing infrastructure and code base
  • Enhanced existing applications and APIs to fit current needs
  • Leveraged AWS Bedrock to build and deploy generative AI models, enabling text generation and summarization capabilities
  • Developed and fine-tuned LLAMA3 models for various natural language processing tasks, improving accuracy
  • Collaborated with product managers to define project requirements and develop ML-based solutions
  • Designed and implemented complex Directed Acyclic Graphs (DAGs) in Apache Airflow to automate data processing workflows, ensuring efficient task orchestration
  • Developed custom Airflow operators and hooks to extend functionality and integrate with various data sources and services
  • Utilized monitoring tools such as Prometheus and Grafana to track application performance and troubleshoot issues in EKS environments
  • Developed and implemented machine learning models using scikit-learn and TensorFlow, including data preprocessing, model training, and model deployment (AI/ML)
  • Fine-tuned pre-trained LLM models for specific tasks and domains
  • Worked on designing and developing new features based on business requirements
  • Implemented natural language processing using NLTK and spaCy, including text preprocessing, sentiment analysis, and topic modeling (AI/ML)
  • Worked with version control tools like GitHub, Jenkins for CI/CD, and Jira for tracking work
  • Optimized read and write operations in DynamoDB to ensure low-latency access to data for machine learning models
  • Designed and developed Flask-based web applications, enabling secure and reliable data exchange between systems
  • Maintained and tested new changes and features end to end with unit, feature/ATDD, and integration testing
  • Worked on designing an LLM model for a newly built Sentiment Analysis tool for Credit risk analysis reporting
  • Worked as a part of Card tech management for a subscription management app which maintains the payment card information of customers (PCI)
  • Developed applications that utilize Retrieval Augmented Generation (RAG) techniques to enhance the accuracy of AI responses using AWS Bedrock
  • Integrated AWS Bedrock with other AWS services, such as Lambda and S3, to create a seamless AI application ecosystem
  • Implemented RAG models using popular libraries such as LangChain and LlamaIndex
  • Conducted experiments to evaluate the performance of LLM models and identified areas for improvement
  • Utilized Python libraries such as Pandas, NumPy, and SQLAlchemy for data manipulation, validation, and transformation in regulatory reporting pipelines
  • Developed and optimized search algorithms to enhance data retrieval for AI applications using OpenSearch
  • Created new APIs from scratch, setting up repos, onboarding business components, and setting up automated pipelines using BogieFiles (YAML) to maintain CI/CD structure via Jenkins
  • Implemented advanced query techniques, including full-text search and filtering, to improve search accuracy and performance using OpenSearch
  • Conducted performance tuning and monitoring of OpenSearch to ensure optimal query response times
  • Developed Restful APIs using AWS Lambda API gateway as per business requirements for new features
  • Registered the APIs as per version and followed OpenAPI design standards
  • Developed and maintained Python scripts to interact with Oracle databases, leveraging libraries like cx_Oracle for seamless integration
  • Automated logging and monitoring of events using Axiom in Python, enabling real-time insights and debugging
  • Collaborated with product teams of internal and external lines of business on new features to effectively streamline the onboarding of new applications and business components being developed
  • Maintained existing business components and kept them in sync with changes on the Technology components
  • Used libraries like Pydantic and AWS Lambda Powertools for required functionalities
  • Worked on macOS and maintained multiple Python versions (3.9, 3.11, and 3.12) on the same system for different APIs
  • Provided incident support for existing applications through monitoring and logging mechanisms using Splunk and CloudWatch Logs
  • Configured CloudWatch alarms at various levels to alert the required teams
  • Implemented Basel reporting tool's advanced analytics capabilities, including risk analysis and regulatory reporting
  • Worked on existing microservice (Java) to enhance and update the functionality as per new business requirements
  • Conducted performance tuning and optimization of Mistral workflows
  • Collaborated with stakeholders to gather requirements and translate them into Mistral workflows
  • Optimized SQL queries executed from Python to improve performance and reduce query execution times in Oracle environments
  • Modified Kinesis data events/flows and data sinks to adjust output for the above enhancements
  • Implemented ATDD (acceptance-test-driven development) and BDD (behavior-driven development) testing frameworks using the Python behave library
  • Worked in an Agile environment using Jira for product and work tracking, GitHub for source and version control, and Jenkins for CI/CD integrations
  • Developed scalable applications using agile methodologies for timely project delivery.
  • Managed multiple projects simultaneously while maintaining strict deadlines and high-quality standards.
  • Enhanced software functionality by identifying and resolving complex technical issues.
  • Maintained comprehensive documentation of development work, facilitating knowledge sharing among team members.
  • Streamlined development workflows to increase team efficiency and reduce time spent on repetitive tasks.
  • Proactively identified areas for process improvement, implementing changes that led to significant time savings for the team.
  • Regularly reviewed peers' code contributions, offering constructive feedback to enhance overall product quality.
  • Mentored junior developers, fostering professional growth and enhancing team productivity.
  • Delivered exceptional client support by promptly addressing concerns and implementing requested changes or enhancements to software solutions.
  • Collaborated with cross-functional teams to design innovative software solutions.
  • Analyzed proposed technical solutions based on customer requirements.
  • Tested methodology with writing and execution of test plans, debugging and testing scripts and tools.
  • Collaborated with management, internal and development partners regarding software application design status and project progress.
  • Collaborated with fellow engineers to evaluate software and hardware interfaces.
  • Developed robust, scalable, modular and API-centric infrastructures.
  • Coordinated with other engineers to evaluate and improve software and hardware interfaces.
  • Estimated work hours and tracked progress using Scrum methodology.
  • Coordinated deployments of new software, feature updates and fixes.
  • Corrected, modified and upgraded software to improve performance.
  • Designed and developed forward-thinking systems that meet user needs and improve productivity.
  • Authored code fixes and enhancements for inclusion in future code releases and patches.
  • Analyzed work to generate logic for new systems, procedures and tests.
  • Conducted data modeling, performance and integration testing.
  • Created proofs of concept for innovative new solutions.
  • Tested and deployed scalable and highly available software products.
  • Documented software development methodologies in technical manuals to be used by IT personnel in future projects.
  • Designed and implemented scalable applications for data extraction and analysis.
  • Developed next generation integration platform for internal applications.
  • Rapidly prototyped new data processing capabilities to confirm integration feasibility into existing systems.
  • Supervised work of programmers, designers and technicians, assigned tasks and monitored performance against targets.
  • Tested functional compliance of company products.
  • Developed conversion and system implementation plans.
  • Translated technical concepts and information into terms parties could easily comprehend.
  • Improved software efficiency by troubleshooting and resolving coding issues.
  • Saved time and resources by identifying and fixing bugs before product deployment.
  • Collaborated with cross-functional teams to deliver high-quality products on tight deadlines.
  • Updated old code bases to modern development standards, improving functionality.
  • Enhanced user experience through designing and implementing user-friendly interfaces.
  • Optimized application performance by conducting regular code reviews and refactoring when necessary.
  • Contributed to a positive team environment through effective communication, problem-solving, and collaboration skills.
  • Developed customized software solutions for diverse clients, resulting in increased satisfaction and repeat business.
  • Streamlined workflows by creating reusable code libraries for common functions and features across multiple projects.
  • Mentored junior developers to improve their technical skills, fostering a culture of continuous learning within the team.
  • Achieved faster development cycles using Agile methodologies, including Scrum or Kanban processes.
  • Collaborated on stages of systems development lifecycle from requirement gathering to production releases.
  • Pioneered use of machine learning algorithms to automate and improve decision-making processes within applications.
  • Increased code efficiency by implementing rigorous code review practices, which improved overall software performance.
  • Optimized database queries for enhanced performance, enabling faster data retrieval and processing.
  • Built databases and table structures for web applications.
  • Participated in regular code sprints, contributing to rapid development and iteration of software products.
  • Conducted in-depth market research to guide development of new software features that addressed unmet user needs.
  • Boosted team productivity through introduction of pair programming, fostering culture of knowledge sharing and collaboration.
  • Tailored software solutions to meet specific client needs, ensuring high levels of customer satisfaction and repeat business.

Sr Software Engineer

Freddie Mac
Mclean, Virginia
08.2023 - 04.2024
  • Technical Environment: Python 3.x, NLP, XML, QML Web Server, Regulatory reporting, wxPython, Bootstrap, Django REST v3.2.23/v4.2, OpenShift, Jupyter Notebook, Postman, PostgreSQL, ML/AI, Web Services, REST, Flask, Axiom, PyCharm, Oracle, Windows, Linux, Bitbucket, Jira, LLM, Heroku, Jenkins, Kubernetes, Docker, Tableau, Dash, Basel
  • Responsibilities:
  • Working on developing a sub-app for a reporting tool/Microservice to analyze, transform, and deliver data metrics for both internal and customer/business use
  • Designed a scalable approach to perform ETL operations and migrated the existing Denodo approach to the new tool functionality
  • Successfully deployed multiple Python web applications on Heroku
  • Implemented LLAMA3 in real-time applications, enhancing user interaction through advanced conversational AI capabilities
  • Collaborated with cross-functional teams to design and implement search features that enhance user experience in AI applications
  • Managed and configured OpenSearch clusters, ensuring high availability and scalability for large datasets
  • Created and managed Databricks jobs using Airflow to automate data workflows, ensuring timely execution and monitoring of job runs
  • Deployed Airflow in various environments, including local setups and cloud-based solutions, ensuring scalability and reliability
  • Integrated Databricks with business intelligence tools like Tableau or Power BI for data visualization and reporting, facilitating data-driven decision-making
  • Integrated OpenSearch with machine learning models to provide real-time insights and analytics
  • Collaborated with cross-functional teams to integrate LLAMA3 into existing software solutions
  • Developed data pipelines to preprocess and prepare data for LLM model training
  • Developed custom REST API services to extract data from various data sources using Django REST framework
  • Implemented dimensionality reduction using PCA and t-SNE, including data preprocessing, model training, and model deployment (AI/ML)
  • Experimented with different model parameters in AWS Bedrock to optimize performance and output quality
  • Implemented document summarization features using AWS Bedrock, improving the efficiency of information retrieval
  • Designed and implemented scalable data models in DynamoDB to support high-traffic AI applications
  • Utilized Heroku Command Line Interface (CLI) for managing applications, databases, and add-ons
  • Integrated various Python libraries (e.g., Pandas, NumPy) via Jupyter Notebook to perform complex data transformations and analyses seamlessly
  • Designed and developed a RAG-based Virtual Assistant application for an internal use case using LlamaIndex
  • Developed data retrieval and ranking systems to support RAG model training
  • Utilized Pandas and NumPy in Python to extract meaningful insights, perform data processing, and execute data transformations for ETL operations
  • Developed reproducible research workflows by combining code, visualizations, and narrative in a single document using Jupyter Notebook
  • Managed environment variables and configuration settings for secure and efficient application deployment on Heroku
  • Built NLP models using NLTK, spaCy, and gensim (GenAI)
  • Collaborated with compliance and finance teams to adapt Python scripts to evolving regulatory frameworks and reporting standards
  • Designed appropriate refresh strategies to deal with changing metrics
  • Performed data filtering and transformations after extraction using materialized views
  • Coordinated and documented the deployment process end-to-end and provided documentation for future ease of use by other developers
  • Enhanced data extraction and mitigated data load overheads for Tableau dashboards
  • Implemented data solutions for Bitbucket, working on extracting and filtering data for various metrics with varied data sizes and requirements
  • Proposed and implemented improvements in design decisions for effective data filtering and cadence rates for staging tables based on priority metrics in dashboards
  • Assisted in end-to-end deployment via Jenkins pipelines and OCP clusters
  • Utilized AWS SDK for Python (Boto3) to interact with AWS services programmatically, automating infrastructure provisioning, configuration, and management tasks
  • Deployed and managed containerized applications using Kubernetes, orchestrating clusters and ensuring high availability and scalability
  • Assisted in effective changes in data modeling strategies for data sources by providing context through data sampling and sample analysis
  • Worked on a PostgreSQL database to update existing models and schemas, creating materialized views
  • Working as part of the reporting microservice team to handle the backend component
  • Working on data migration for delivery metrics data from on-the-fly to an SQL database for storage purposes and future scalability when dealing with increasing data size.

Sr. Software Engineer Specialist

FIS Global
Atlanta
02.2022 - 07.2023
  • Technical Environment: Python, Groovy, XML, QML, JavaScript, Go, AJAX, Web server, wxPython, Bootstrap, Flask, OpenShift, NLP, Postman, Jupyter Notebook, MySQL, MS SQL, ML/AI, LLM, Web Services, SOAP, REST, VSCode, Windows, Linux, AWS, Heroku, OCP, GCP, Jenkins, Kubernetes, Docker, Azure cloud-native, Basel
  • Responsibilities:
  • Developed Python scripts for automation, leveraging libraries like Pandas, NumPy, and SQLAlchemy to streamline data pipelines and improve efficiency
  • Developed text classification, sentiment analysis, and topic modeling algorithms using Python (GenAI)
  • Configured and maintained cloud services including AWS Glue, Athena, EMR, and Azure DevOps, enhancing scalability, cost-efficiency, and automation
  • Conducted extensive testing and validation of LLAMA3 outputs
  • Created comprehensive documentation and training materials for LLAMA3
  • Managed multiple kernels for different projects, ensuring compatibility with various Python versions and packages via Jupyter Notebook
  • Implemented data partitioning strategies in DynamoDB to improve performance and manage large datasets effectively
  • Utilized DynamoDB Streams to trigger real-time processing of data changes, enhancing the responsiveness of AI applications
  • Developed and implemented deep learning models using Keras and PyTorch, including data preprocessing, model training, and model deployment (AI/ML)
  • Utilized Phi models to enhance machine learning algorithms
  • Integrated various Heroku add-ons (e.g., PostgreSQL, Redis) to enhance application functionality and performance
  • Utilized Jupyter extensions to enhance functionality, such as code folding and variable inspection (JupyterHub)
  • Maintained pipeline repos across multiple regions and provided Platform-as-a-Service support as part of daily responsibilities
  • Developed data pipelines to preprocess and prepare data for ML model training
  • Implemented scaling strategies to handle increased traffic, including dyno management and load balancing via Heroku
  • Designed and developed chatbots and recommendation systems for multiple customer facing ML-APIs using LangChain and RAGAS
  • Set up continuous deployment pipelines using GitHub integration for Heroku hosted applications
  • Shared notebooks via GitHub and JupyterHub, facilitating collaboration with team members and stakeholders
  • Worked on incident support management for multi-region deployments, both VM-based and cloud-based
  • Implemented Axiom Controller View's advanced analytics capabilities, including data visualization and reporting for improved financial reporting
  • Employed Jupyter for rapid prototyping of machine learning models, allowing for iterative experimentation
  • Worked with data scientists to develop and implement data pipelines and ML models
  • Implemented JavaScript libraries like jQuery and React to improve interactivity and dynamic behavior for a financial institution
  • Managed deployment pipelines using Azure DevOps, automating CI/CD processes for seamless updates and rollback capabilities
  • Designed and developed web applications using Django and FastAPI, integrating front-end technologies like React, AngularJS, and jQuery for enhanced user experiences
  • Collaborated with data scientists to refine Phi's training datasets
  • Developed and maintained performance metrics for Phi models
  • Containerized applications using Docker, ensuring consistency across development and production environments
  • Developed a Flask-based web application, leveraging Flask-Login and Flask-Principal to ensure secure user authentication and authorization
  • Utilized tools like Beautiful Soup for web scraping and Tableau for data analysis and visualization, extracting meaningful insights from complex datasets
  • Streamlined data processing tasks using AWS Batch, optimizing batch workloads and resource allocation.

Software Developer

Ford
MI
03.2019 - 02.2022
  • Technical Environment: Python, Django, HTML, CSS, NLP, XML, QML, JavaScript, Go, AJAX, Web server, matplotlib, NumPy, PyDev, PostgreSQL, Jupyter, Heroku, Apache, Bootstrap, Flask, Oracle, Okta, PL/SQL, MySQL, MS SQL, Web Services, SOAP, REST, PyCharm, Windows, Linux
  • Responsibilities:
  • Developed entire frontend and backend modules using Python on Django and Flask frameworks, creating RESTful web services and microservices
  • Architected and developed serverless applications on AWS, leveraging Lambda functions and event-driven architecture (Amazon S3, Amazon EventBridge, Amazon Kinesis)
  • Utilized Heroku's logging and monitoring tools to track application performance and troubleshoot issues
  • Conducted research and analysis on Phi's architecture
  • Conducted database migrations (MySQL to AWS RDS) and performance tuning for AWS services, including DynamoDB and Lambda functions
  • Designed and developed web services using both SOAP and REST protocols to interact with various business sectors
  • Configured custom domains and SSL certificates for secure and branded application access of Heroku applications
  • Deployed Jupyter Notebooks on cloud platforms (e.g., Google Colab) for scalable computing resources
  • Converted Jupyter notebooks to various formats (HTML, PDF) for reporting and presentation purposes
  • Developed a Dash-based web application, leveraging Dash-Plotly to improve data visualization and analytics via Jupyter notebook
  • Managed database migrations and backups using Heroku’s built-in tools for seamless data handling
  • Designed and implemented complex data transformations using SQL, PySpark within Databricks, and AWS Glue for ETL processes
  • Implemented Dash-Cytoscape to improve network visualization and analysis for a financial institution
  • Automated data processing workflows and analysis using Python (Pandas, Numpy) and AWS Glue, improving operational efficiency
  • Developed and maintained multiple data pipelines and data models using Oracle for efficient data processing and storage
  • Integrated Git into CI/CD workflows for efficient collaboration and code review, ensuring seamless project development.

Python Developer

CBRE
01.2018 - 03.2019
  • Technical Environment: Python 3.6/2.7, Django 2.0/1.6, Django Tastypie, HTML5, CSS3, CSS Bootstrap, XML, JavaScript, Angular JS, Backbone JS, jQuery, Pandas, MySQL, PostgreSQL, MongoDB, MS SQL Server, T-SQL, SQLAlchemy, SQL, AWS, Heroku, Apache Web Server, Eclipse, Git, GitHub, UNIX, Linux, Windows, Shell Scripting
  • Responsibilities:
  • Migrated Django database from SQLite to MySQL to PostgreSQL, ensuring data integrity
  • Implemented dynamic inventory scripts for Python-based assets, reducing manual intervention
  • Managed Python source code version control using GitHub and automated build deployment with Jenkins and Docker containers in Mesos
  • Collaborated with team members on Heroku projects, utilizing shared access and role-based permissions
  • Worked with multi-dimensional arrays in Numpy to process and analyze structured data effectively using Jupyter Hub
  • Merged and concatenated datasets, handled missing values, and performed group-by operations to summarize data using Python Pandas via Jupyter Notebook
  • Collaborated with infrastructure teams to transition manual provisioning processes to Infrastructure as Code (IaC), reducing provisioning time by 40%
  • Utilized Hudson/Jenkins for continuous integration (CI) to automate building, testing, and deploying Python applications
  • Troubleshot Heroku application issues, using Heroku support resources and debugging tools to resolve problems efficiently
  • Implemented security best practices for Heroku applications, including secure API keys and data encryption
  • Utilized automation with Jenkins for CI/CD on Amazon EC2 and AWS S3 for data storage
  • Designed and developed framework to consume web services hosted on Amazon EC2
  • Configured monitors, alarms, and notifications for EC2 hosts using CloudWatch
  • Automated performance calculations with NumPy, SciPy, and SQLAlchemy
  • Critiqued and developed ETL mappings for source data from various sources (Oracle, DB2, XML, Flat files).

Python Developer

Cobham
Cobham, CA
03.2017 - 01.2018
  • Technical Environment: Python 3.6/2.7, Django 2.0/1.6, Django Tastypie, HTML5, CSS3, CSS Bootstrap, XML, JavaScript, Angular JS, Backbone JS, jQuery, Pandas, MySQL, PostgreSQL, MongoDB, MS SQL Server, T-SQL, SQLAlchemy, SQL, AWS, Apache Web Server, Eclipse, Git, GitHub, UNIX, Linux, Windows, Shell Scripting
  • Responsibilities:
  • Developed an internal testing tool framework in Python
  • Designed and implemented frontend and backend modules using Python on Django, incorporating the Tastypie web framework, Git, Node.js, underscore.js, Angular.js, CSS, and JavaScript
  • Created APIs, database models, and views using Python for building responsive web applications
  • Utilized GitHub for Python source code version control, Jenkins for automating builds, Docker containers, and deployment on Mesos
  • Designed and Developed RESTful Web services for interaction with various business sectors, incorporating SOAP protocol for web services communication
  • Integrated Python with web development tools and web services, including consuming Restful Web services transmitting data in JSON format
  • Developed and executed various MySQL database queries from Python using the Python MySQL connector and MySQLdb package
  • Wrote Python scripts for data extraction from HTML files
  • Standardized toolsets for web development, transitioning from Eclipse to Git for source control
  • Automated existing scripts for performance calculations using NumPy, SciPy, and SQLAlchemy
  • Evaluated source data from various sources (Oracle, DB2, XML, Flat files) and developed ETL mappings
  • Enhanced unit tests and fixed existing ones
  • Conducted code reviews and implemented best Pythonic programming practices
  • Demonstrated strong analytical and problem-solving skills, contributing as both an independent worker and a valuable team player

Python Application Developer

INDMAX
09.2015 - 10.2016
  • Maintained program libraries, users' manuals and technical documentation
  • Managed large datasets using Pandas data frames and MySQL, executing various MySQL queries from Python using the Python-MySQL connector
  • Contributed to the development and review of requirements, architecture documents, test plans, design documents, and quality analysis
  • Developed PowerShell scripts for continuous monitoring of Exchange messaging infrastructure health, facilitating timely issue detection
  • Developed and maintained Hadoop ecosystem components, including working with HDFS for storing and accessing large datasets
  • Engaged in full-stack development utilizing Python technologies for building web-based applications, including database modeling, APIs, and views.

Software Developer

Metaminds
Hyderabad
04.2015 - 09.2015
  • Technical Environment: Python, HTML, XHTML, CSS, JavaScript, JQuery, Eclipse, MS SQL, Windows OS
  • Responsibilities:
  • Worked for gathering requirements, system analysis, design, development, testing and deployment
  • Developed rich user interface using CSS, HTML, JavaScript and jQuery
  • Used JQuery for selecting particular DOM elements when parsing HTML
  • Wrote Python modules to extract/load asset data from the MySQL source database
  • Created database using MySQL, wrote several queries to extract/store data from database
  • Set up automated cron jobs to upload data into the database, generate graphs and bar charts, upload these charts to the wiki, and back up the database
  • Effectively communicated with the external vendors to resolve queries
  • Used Git for version control
  • Actively participated in system testing, production support and maintenance/patch deployments.

Skills

  • Languages: Python 3.x/2.7, Java, Golang, C, SQL, Shell Scripting, Groovy scripting
  • Frameworks: Django, Flask, FastAPI
  • Databases: Oracle, SQL, SQL Server, DynamoDB, PostgreSQL, MySQL, AWS RDS
  • Web Technologies: AJAX, JSON, JavaScript, jQuery, HTML, XML, CSS, Bootstrap
  • Web Services: SOAP, RESTful
  • IDEs/Tools: Jupyter Notebook, PyCharm, Eclipse, VSCode, Postman
  • Cloud Platforms: AWS, Azure, OpenShift, Docker, Kubernetes
  • Operating Systems: Linux, Unix, Windows 11/10/8/7/XP, macOS
  • Python Libraries: NumPy, Pandas, pytest, unittest, asyncio, FastAPI
  • VCS: Git, GitHub, Bitbucket
  • SDLC: Agile methodologies, Scrum framework
  • Programming languages
  • Web application development
  • Amazon web services
  • Microservices architecture
  • API development experience
  • Software development
  • Development lifecycles
  • Build releases
  • Oral and written communications
  • Continuous integration and deployment
  • Database programming
  • Database design
  • Problem-solving mindset
  • Testing and debugging
  • Proficient in English, Hindi, and Telugu; beginner Spanish
  • DevOps best practices
  • Design reviews
  • Project planning
  • Analytics
  • Project documentation
  • Research and development
  • Scope development
  • Application release maintenance
  • Performance improvements
  • Time management expertise
  • Quality assurance
  • Troubleshooting
  • Performance optimization
  • Virtualization technologies
  • Mobile application development
  • Team reporting
  • Workflows and queries
  • Agile methodologies expert
  • Configuration management
  • Software development lifecycle
  • Advanced data structures
  • Requirements gathering
  • Technical support escalations
  • Software applications
  • Computer engineering
  • Machine learning proficiency
  • Big data analysis
  • Expert database management
  • Pipeline maintenance
  • Software documentation
  • Best practices
  • Performance tuning
  • Deep learning expertise
  • Cross-platform development
  • Performance optimization techniques
  • Critical thinking capacity
  • Team leadership
  • End-to-end testing
  • Infrastructure as Code
  • Microservices deployment
  • Cloud computing
  • Continuous delivery
  • Dependency management
  • RESTful API design
  • Software solution building
  • Web applications
  • Strong debugging
  • Data extraction
  • Design and development

Education

Bachelor's - Computer Science

Jawaharlal Nehru Technological University, Hyderabad

Master Of Computer Applications - Computer Science

Wright State University
Dayton, OH

Timeline

Sr. Software Engineer

Capital One
05.2024 - 10.2024

Sr Software Engineer

Freddie Mac
08.2023 - 04.2024

Sr. Software Engineer Specialist

FIS Global
02.2022 - 07.2023

Software Developer

Ford
03.2019 - 02.2022

Python Developer

CBRE
01.2018 - 03.2019

Python Developer

Cobham
03.2017 - 01.2018

Python Application Developer

INDMAX
09.2015 - 10.2016

Software Developer

Metaminds
04.2015 - 09.2015

Master Of Computer Applications - Computer Science

Wright State University

Bachelor's - Computer Science

Jawaharlal Nehru Technological University, Hyderabad