
Krupali Patel

Houston, TX

Summary

Seasoned IT professional with over 9 years of experience designing, developing, testing, and implementing enterprise applications in Python across various domains. Skilled in web development using HTML5, CSS3, jQuery, Angular (multiple versions), AJAX, XML, and JSON while adhering to W3C standards. Proficient with Amazon Web Services such as VPC, EC2, S3, ELB, and EKS to enhance application performance and scalability. Experienced in implementing caching strategies with AWS Lambda layers and external services to optimize response times and reduce reliance on external resources. Reduced operational costs by 40% by migrating applications to AWS Lambda and implementing real-time data processing pipelines.

Overview

  • 9 years of professional experience
  • 1 certification

Work History

Python Developer

Bank of America
07.2022 - Current
  • Involved in various phases of the Software Development Life Cycle of the application, including requirement gathering, design, analysis, and code development.
  • Developed RESTful APIs using Python and Django REST Framework.
  • Refactored a large existing Django/Python code base, maintaining PEP 8 code standards and fixing bugs.
  • Implemented CRUD operations for data manipulation.
  • Worked on SQL and NoSQL databases for data storage, retrieval, and management in API-driven applications.
  • Worked on database schemas and queries for API transactions.
  • Configured permissions for API endpoints in accordance with security policies.
  • Developed applications using AWS Lambda, using Python and Node.js to create microservices.
  • Implemented APIs using AWS API Gateway and Lambda functions.
  • Worked on handling RESTful HTTP requests and integrating with backend systems for data exchange.
  • Responsible for deploying code to Amazon Web Services using the Boto3 API.
  • Developed Python functions in Lambda to automate tasks and integrate various AWS services.
  • Used Amazon EC2 along with Amazon SQS to upload and retrieve project history.
  • Worked with AWS services such as VPC, EC2, S3, ELB, EKS, Auto Scaling Groups (ASG), EBS, RDS, IAM, and CloudWatch.
  • Managed AWS S3 buckets and instances using Python's boto3 library.
  • Implemented event-driven architectures using message brokers like Kafka and RabbitMQ.
  • Used AWS ECR for managing Docker images and created ECR lifecycle policies.
  • Used Docker Hub for pulling public Docker images.
  • Used Kubernetes for container orchestration, ensuring scalability and high availability.
  • Involved in developing RESTful web services to send and receive data from external interfaces in JSON format.
  • Used shell scripts for system configuration and automating project tasks.
  • Automated system maintenance tasks, including log rotation, backup procedures, and cleanup processes.
  • Used packages such as Mock, patch, and Beautiful Soup (bs4) to extract data during the development phase and perform unit testing.
  • Developed custom PySpark libraries and reusable components to streamline data processing workflows.
  • Implemented AWS Glue data processing workflows for efficient extraction, transformation, and loading (ETL) of large-scale datasets.
  • Implemented and executed comprehensive testing strategies using Pytest.
  • Worked on multiple containers and managed the load balancing between all the containers using NGINX.
  • Worked with MongoDB database concepts such as locking, transactions, indexes, sharding, replication, and schema design.
  • Proficient in scikit-learn, a leading machine learning library for Python.
  • Used scikit-learn for data preprocessing, feature engineering, and model selection.
  • Successfully built machine learning models, including regression, classification, and clustering algorithms.
  • Utilized scikit-learn's hyperparameter tuning and cross-validation tools to optimize model performance.
  • Created data visualizations using libraries such as Matplotlib to communicate insights effectively.
  • Implemented ETL processes to extract, transform, and load data from diverse sources into Spark-based data lakes and warehouses.
  • Troubleshot issues related to Spark applications.
  • Collaborated with data scientists to integrate machine learning models into Spark workflows for advanced analytics.
  • Stayed current with industry trends and best practices in big data processing and Spark technology.
  • Environment: Python, Django, Docker, Spark, NumPy, Matplotlib, Beautiful Soup (bs4), Node.js, AWS, Boto3, HTTP, JSON, AJAX, MongoDB, GitHub, XML, Jira, HTML5, CSS3, Bootstrap, Agile, Windows
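The Mock/patch-based unit testing mentioned above can be sketched as follows (the function and URL are illustrative, not taken from any actual codebase):

```python
from unittest.mock import Mock

def fetch_status(client, url):
    """Return the HTTP status code of a backend call.

    `client` is any object exposing a requests-style .get() method.
    """
    response = client.get(url)
    return response.status_code

# In tests, the real HTTP client is replaced with a Mock so no network call is made.
client = Mock()
client.get.return_value = Mock(status_code=200)
assert fetch_status(client, "https://example.com/health") == 200
client.get.assert_called_once_with("https://example.com/health")
```

Injecting the client this way keeps the function testable without patching module globals; `unittest.mock.patch` covers the cases where the dependency cannot be injected.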

Python Developer

All America Bank, Oklahoma City, OK
04.2022 - 06.2022
  • Participated in Agile and Scrum meetings which included Sprint planning, Daily Scrums or Stand-ups and involved in designing, developing and testing the application.
  • Developed RESTful APIs using Python and Flask/Django frameworks, adhering to REST principles for scalability and interoperability.
  • Used Amazon Web Services (AWS) for efficient storage and fast access.
  • Worked with web APIs to call web services via URLs, performing GET, PUT, POST, and DELETE operations on the server.
  • Used Lambda functions for Amazon S3, DynamoDB, Amazon SQS.
  • Implemented architecture using AWS Lambda and Amazon S3.
  • Optimized Lambda functions for memory allocation.
  • Applied AWS Lambda extensions and custom runtimes to optimize performance and meet specific application requirements.
  • Created clear API specifications, including endpoints, request/response formats, authentication methods, and error handling mechanisms.
  • Deployed microservices using Docker, created Dockerfiles and deployed using Kubernetes.
  • Used the Pandas API to organize data in time-series and tabular formats for easy timestamp-based data manipulation and retrieval.
  • Implemented integration tests using Pytest.
  • Used the Beautiful Soup Python library for web scraping to extract data from HTML and XML tags.
  • Handled HTTP request/response calls using Node.js, Express.js, and the Angular Router module, and developed a single-page application.
  • Used Python libraries such as PyMongo and PyMySQL to connect to databases and manipulate data.
  • Used Jenkins for continuous integration and deployment.
  • Implemented the automation of repetitive tasks and system administration processes using shell scripts.
  • Implemented the migration of legacy ETL processes to Spark, resulting in a significant reduction in processing errors and increased data accuracy.
  • Implemented Spark streaming for real-time data processing.
  • Implemented data partitioning and caching techniques in PySpark to optimize query performance on large datasets.
  • Worked closely with cross-functional teams to identify and resolve performance bottlenecks, improving overall system efficiency.
  • Implemented data quality checks and validations in AWS Glue jobs to ensure data integrity and accuracy.
  • Implemented unit testing and participated in quality assurance processes.
  • Worked on data modeling and database design to support application functionality.
  • Collaborated with DevOps teams to automate deployment processes, ensuring continuous integration and delivery.
  • Worked with AWS Glue to perform schema evolution and data type mapping for different data sources.
  • Environment: Python, Django, Docker, Angular, Angular CLI, HTML5, CSS3, Bootstrap, AWS, Beautiful Soup, XML, PyMongo, Jenkins, GitHub, Jira, urllib2, DOM, Agile, Windows

Python Developer

Acute Informatics, Gujarat, India
01.2019 - 03.2022
  • Developed a GUI using Python and Django to dynamically display test block documentation and other features of the Python code in a web browser.
  • Implemented CRUD operations for the applications using the MVC architecture of Django framework and conducted code reviews.
  • Managed data persistence using Docker volumes, ensuring data integrity and separation from container lifecycle.
  • Implemented volume mounts for sharing data between host and containers.
  • Implemented and customized Web Scraping Framework using Python’s Scrapy Framework.
  • Developed tools using Python, Shell scripting, XML to automate some of the menial tasks.
  • Designed and implemented serverless IoT applications using AWS IoT Core, AWS Lambda, and AWS Greengrass, enabling real-time device data processing and edge computing capabilities.
  • Configured AWS Identity and Access Management (IAM) Groups and Users for improved login authentication.
  • Successfully migrated the database from SQLite to MySQL and then to PostgreSQL with complete data integrity.
  • Involved in database-driven web application development using frameworks such as Django on Python.
  • Utilized PyQt to provide GUI for the user to create, modify and view reports based on client data.
  • Worked with the Jenkins continuous integration tool for project deployment.
  • Participated in unit testing by using the Python Unit Test framework.
  • Implemented robust error handling mechanisms within shell scripts, ensuring graceful degradation in case of unexpected issues.
  • Configured logging functionalities to maintain detailed records of script executions and error events for troubleshooting.
  • Involved in debugging the applications monitored on JIRA using agile methodology.
  • Developed and maintained PySpark libraries and reusable components to streamline data processing workflows and promote code reusability.
  • Participated in data pipeline monitoring and troubleshooting, identifying and resolving issues to ensure data pipeline reliability.
  • Expertise in developing and optimizing Spark applications for large-scale data processing.
  • Proven experience in designing and implementing ETL processes using Spark.
  • Optimized AWS Glue jobs and transformations to improve performance and reduce processing time.
  • Integrated AWS Glue with various AWS services (such as Amazon S3, Amazon Redshift, and Amazon Athena) to enable seamless data integration and analysis.
  • Configured and managed AWS Glue data catalog and databases to support efficient data organization and retrieval.
  • Developed custom transformations and extractors in AWS Glue using PySpark and Apache Spark libraries.
  • Created and managed AWS Glue workflows using AWS Step Functions to orchestrate complex ETL processes.
  • Implemented data encryption and security measures in AWS Glue to protect sensitive data during ETL processes.
  • Environment: Python, Django, Docker, HTML5, CSS3, Shell Scripting, XML, PostgreSQL, PyQt, Jenkins, Git, Jira, Agile, Windows
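The error-handling and logging pattern described above (graceful degradation plus detailed execution records) can be sketched in Python for self-containment; the log file name and commands are illustrative, and the original work used shell scripts:

```python
import logging
import subprocess
import sys

# Keep a detailed record of script executions and error events (file name is illustrative).
logging.basicConfig(filename="automation.log", level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")

def run_step(cmd):
    """Run one automation command, logging success or failure instead of crashing."""
    try:
        subprocess.run(cmd, check=True, capture_output=True, text=True)
        logging.info("step succeeded: %s", " ".join(cmd))
        return True
    except (subprocess.CalledProcessError, FileNotFoundError) as exc:
        logging.error("step failed: %s (%s)", " ".join(cmd), exc)
        return False

run_step([sys.executable, "-c", "pass"])  # succeeds; logged at INFO level
```

Returning a status instead of raising lets a maintenance script log the failure and continue with its remaining steps, which is the "graceful degradation" behavior the bullet describes.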

Python Developer

SPEC India, India
04.2016 - 12.2018
  • Used AJAX in the UI to update small portions of the web page, avoiding full page reloads.
  • Created Django dashboard with custom look and feel for end user after a careful study of Django admin site and dashboard.
  • Implemented security best practices in Docker, including image scanning for vulnerabilities.
  • Configured container security parameters, access controls, and isolation measures.
  • Managed, developed, and designed a dashboard control panel for customers and administrators using Django, HTML, CSS, Bootstrap, jQuery, and REST API calls.
  • Used the built APIs and modules along with Python text-parsing modules to cleanse and load partner data files into the application database.
  • Worked on building out page views, templates, and CSS layouts for the complete site within the Django framework.
  • Implemented automated performance tuning and optimization for AWS Lambda functions using AWS Lambda Provisioned Concurrency, reserved concurrency, and concurrency limits management.
  • Used Python scripts to update content in the database and manipulate files.
  • Optimized PySpark code and queries to enhance processing speed and reduce resource consumption.
  • Implemented data validation and quality checks in PySpark pipelines to ensure data integrity and accuracy.
  • Utilized PySpark Streaming for real-time data processing and analysis, enabling timely insights and decision-making.
  • Worked with cloud-based technologies (e.g., AWS, Azure) to deploy and manage PySpark applications and clusters.
  • Wrote and executed MySQL database queries from Python using the Python-MySQL connector and the MySQL database package.
  • Continuous improvement in integration workflow, project testing, and implementation of continuous integration pipeline with Jenkins. Implemented code coverage and unit test plug-ins with Maven in Jenkins.
  • Used Git for version controlling and regularly pushed the code to GitHub.
  • Created Business Logic using Python to create Planning and Tracking functions.
  • Involved in sprint planning and backlog grooming with the sprint master and project architect.
  • Environment: Python, Django, Docker, HTML5, CSS3, AJAX, REST API, Bootstrap, jQuery, MySQL, Maven, Jira, GitHub, Jenkins, Agile, Windows
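The DB-API query style behind the MySQL work above can be sketched as follows; sqlite3 stands in for MySQL so the example is self-contained, and the table and column names are illustrative:

```python
import sqlite3

# The original queries ran against MySQL via the Python-MySQL connector; the
# parameterized-query pattern is identical across DB-API drivers.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tasks (id INTEGER PRIMARY KEY, name TEXT, done INTEGER)")
conn.executemany("INSERT INTO tasks (name, done) VALUES (?, ?)",
                 [("plan", 1), ("track", 0)])

def pending_tasks(conn):
    """Return names of tasks not yet done, using a parameterized query."""
    rows = conn.execute("SELECT name FROM tasks WHERE done = ?", (0,)).fetchall()
    return [name for (name,) in rows]
```

Passing values as parameters rather than formatting them into the SQL string avoids injection and lets the driver handle quoting and type conversion.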

Education

Bachelor's

Sardar Patel University
01.2012

Master's

Gujarat Technological University
01.2015

Skills

  • Skilled in Python, TypeScript, SQL, and infrastructure as code with Terraform
  • Web Development: Angular, Angular CLI, NodeJS, Express, Django, Flask, JSON, XML, HTML, Bootstrap, CSS, REST and Ajax, JavaScript
  • AWS and Terraform expertise
  • Web Services: REST, SOAP, REST API
  • Databases & Big Data Technologies: SQL, MySQL, MongoDB, Oracle, Apache Spark (PySpark, Scala), Hadoop
  • IDE Tools: NetBeans, PyCharm, PyScripter, Spyder, PyStudio, PyDev, Sublime Text
  • Experienced in managing Windows and Linux environments
  • Deep learning expertise
  • API integration and management

Certification

  • AWS Certified Solutions Architect – Associate
  • Certified Entry-Level Python Programmer [PCEP-30-01]
  • Certified Associate in Python Programming [PCAP31-03]
  • Microsoft Certified Azure Data Scientist Associate
  • Certified SAFe 5 Product Owner/Product Manager
