Venkata Ramana Suryadevara

Software Engineer

Buffalo, New York

Summary

Highly skilled and experienced data engineer with a proven track record of delivering exceptional results on complex projects in fast-paced environments. Highly motivated to take on new challenges, with a strong work ethic, adaptability, and exceptional interpersonal skills; adept at working effectively without supervision and quickly mastering new skills. Practical database engineer with in-depth knowledge of data manipulation techniques and computer programming, paired with expertise in integrating and implementing new software packages and products into existing systems. Offers a three-year background managing the development, design, and delivery of database solutions. Tech-savvy, independent professional with outstanding communication and organizational abilities.

Overview

4 years of professional experience
1 certification

Work History

Data Engineer

Kroger
Cincinnati, Ohio
10.2022 - Current
  • Kafka Data Processing: Implemented multiple Kafka ingestion jobs for real-time and batch data processing
  • Managed workflow processes for Metrics, ensuring accurate processing and publication to Kafka
  • Optimized workflow efficiency by implementing a time zone trigger check
  • Developed a function to streamline data retrieval based on query results
  • Successfully retrieved and analyzed data within defined time ranges
  • Conducted data collection from existing sources and organized it into a DataFrame with appropriate columns for analysis
  • Facilitated payload generation and sent encoded payloads to the Event Hub for further processing
  • Contributed to the successful processing and publication of payloads within Kafka
  • Azure Event Hubs and Python-Based Event Processing: Developed a Python-based event processing system for real-time event data from Azure Event Hubs
  • Implemented a modular architecture with specialized event processors for various event types
  • Utilized Azure Functions for event handling and integration
  • Applied data masking and filtering techniques to ensure data privacy and security
  • Leveraged Apache Kafka serialization for efficient event processing
  • Implemented multithreading to process events concurrently, enhancing system performance
  • Conducted unit testing and integration testing with the pytest framework, achieving 95% and 90% code coverage, respectively
  • Spark Jobs and Data Processing: Constructed complex SELECT statements and worked with different data types
  • Employed advanced query techniques, including aggregations, joins, and filtering, for effective data processing
  • Developed and maintained complex Spark jobs within the Spark framework for efficient data processing, transformation, and analysis
  • Leveraged Spark's distributed processing capabilities to optimize job performance and scalability
  • Designed data pipelines for both batch and stream processing, adapting to various data formats and sources
  • Worked on data ingestion and integration from multiple sources, including Cosmos DB, Event Hubs, and Azure Data Explorer (ADX)
  • Created dynamic job configurations to support incremental data processing, providing flexibility in defining data range filters and types
  • Utilized Delta Lake, Parquet, JSON, CSV, and other data formats for data storage and processing
  • Conducted comprehensive testing, validation, and error handling within Spark jobs to ensure data quality and reliability
  • Provided Confluence documentation and knowledge transfer to team members for seamless integration of Spark jobs into data pipelines

Data Engineering Intern

PSQUARE TECHNOLOGIES
Chicago, IL
06.2022 - 10.2022
  • Power BI Development: Created Power BI visualizations and dashboards per requirements
  • Developed dashboards and visualizations to help business users analyze data and provide insights to upper management
  • Created conditional filters and action links to enhance data interaction within Power BI dashboards
  • Created and modified data visualizations in Power BI reports and dashboards based on client requests
  • Requirements Gathering and Training: Gathered requirements for the Power BI dashboards to align with user needs
  • Assisted end users in understanding the feasibility of the reports

Data Engineer

CROZ
Hyderabad, India
07.2019 - 06.2021
  • API Development: Developed and maintained APIs, enabling real-time data access for specific data
  • Demonstrated proficiency in developing Python functions for serverless computing
  • Implemented robust input parameter validation to ensure data accuracy and reliability
  • Implemented error-handling mechanisms to maintain data integrity and enhance reliability
  • Collaborated on response generation, adhering to API format standards
  • SQL and Data Manipulation: Proficient in SQL querying for data extraction and manipulation
  • Manipulated date and time data for accurate data retrieval and presentation
  • Utilized data retrieval techniques to access real-time information efficiently
  • Calculated hashes for data integrity and security purposes
  • Azure and Terraform: Implemented various configuration changes, including enabling permissions for secure access to the Consumption API and optimizing configurations to reduce costs
  • Configured environment variables and application settings for Azure Function Apps, facilitating seamless integration
  • Upgraded Terraform versions for AzureRM and AzureAD, recreating Azure Function Apps
  • Collaborated on bug fixes, including subnet configurations and app settings
  • Set up alerts and dashboards for monitoring system health and performance in production
  • Collaborated on access control and permissions, including providing access to Data Scientists for data exploration.
  • Developed Python scripts for extracting data from web service APIs and loading the data into databases
  • Analyzed user requirements, then designed and developed ETL processes to load enterprise data into the data warehouse

Education

MS in Computer Science

University of Central Missouri

Bachelor of Technology, Computer Science

VLITS (Vignan's Lara Institute of Technology & Science)

Skills

  • Python
  • Apache Spark
  • Cosmos DB
  • Git
  • Microsoft Azure
  • JSON
  • YAML
  • JSON Schema
  • Spark Schema
  • Visual Studio Code
  • PyCharm
  • Agile
  • Scrum
  • Delta Lake
  • PySpark
  • Synapse and ADX
  • DAX Queries
  • Data Transformation
  • Synapse Studio Integration
  • Azure Function Development
  • Monitoring and Alerting
  • Automation Workflows
  • Data Storage Optimization
  • Script Management
  • SQL
  • KSQL
  • Power BI
  • SQL and Databases
  • Data Migration

Certification

  • Triplebyte certified in Python/SQL

Work Availability

Available Monday through Sunday: mornings, afternoons, and evenings

Quote

You only have to do a few things right in your life so long as you don’t do too many things wrong.
Warren Buffett
