LAKSHMI NARAYANA LATCHIREDDI

Boca Raton, FL

Summary

Detail-oriented data analyst with three years of experience and a strong background in statistical analysis, data visualization, and data-driven decision-making. Proficient in Python, SQL, and data visualization tools, with a track record of extracting actionable insights from complex datasets. Experienced in building and maintaining data pipelines and dashboards that have delivered significant improvements in operational efficiency and decision-making. Proven ability to communicate complex findings clearly to both technical and non-technical stakeholders. Passionate about leveraging data to drive business growth and support strategic initiatives.

Overview

  • 4 years of professional experience
  • 1 certification

Work History

Graduate Teaching Assistant

Florida Atlantic University
08.2023 - Current
  • Assisted in teaching a course focused on deep learning and its applications.
  • Conducted weekly tutorial sessions and workshops for 150 students, emphasizing practical aspects of deep learning algorithms and techniques.
  • Provided one-on-one guidance to students on course material, assignments, and projects related to deep learning frameworks and tools.
  • Collaborated with faculty members in creating project work that reinforced deep learning concepts.
  • Utilized tools such as TensorFlow, PyTorch, and Jupyter Notebooks to demonstrate and explain deep learning models.

Graduate Trainee

Florida Atlantic University
01.2023 - 06.2023
  • Project: Fake News Detection
  • Technologies Used: Python, NLP, Machine Learning, Scikit-learn, CountVectorizer, TF-IDF.

  • Developed a fake news detection system employing Natural Language Processing (NLP) techniques and machine learning algorithms.
  • Assembled and curated a comprehensive dataset of both fake and real news articles to train and validate the model.
  • Implemented CountVectorizer and TF-IDF (Term Frequency-Inverse Document Frequency) methods for feature extraction from text data.
  • Experimented with various machine learning algorithms (e.g., Naive Bayes, SVM, Decision Trees) to classify articles as 'fake' or 'real' based on word frequency and contextual analysis.
  • Analyzed and compared the performance of different models using accuracy metrics, achieving % accuracy in correctly classifying news articles.
  • Enhanced the accuracy of fake news detection by %, contributing to the reduction of misinformation spread on social media.
  • Developed a scalable model capable of adapting to new datasets and evolving news content.

Software Developer

Capgemini Technology Services
10.2019 - 09.2022

1. Smart Workflows

  • Created a knowledge graph based on requirements after performing data cleaning with Python and pandas and ingesting the data into Neo4j.
  • Interacted frequently with clients to understand business requirements and made changes accordingly.
  • Moved data into Azure and used PuTTY and WinSCP to access Docker containers.
  • Documented APIs for simple cognitive search functionality built with Python and Flask.

2. Twitter Sentiment Analysis Project

  • Developed a comprehensive Twitter Sentiment Analysis system to extract and analyze customer sentiments from social media interactions, using Python to enhance data processing capabilities.
  • Used the Neo4j graph database to handle complex data relationships and support data analysis.
  • Employed advanced sentiment analysis techniques with Python's natural language processing (NLP) libraries to interpret and categorize emotions in customer tweets and extract meaningful insights from textual data.
  • Leveraged Neo4j's graph database features in conjunction with Python to recognize patterns and make complex data connections, transforming unstructured data into comprehensive customer insights.
  • Harnessed Python and Neo4j for their data processing and relationship-handling capabilities, significantly improving the efficiency and accuracy of sentiment analysis.

3. Global Clustering

  • Managed all coding aspects in Scala, creating robust and efficient code in IntelliJ IDEA.
  • Pushed code to the development environment for in-depth output optimization and analysis.
  • Designed and maintained complex tables in Oracle, ensuring efficient data organization and accessibility.
  • Conducted in-depth data analysis using Apache Spark, focusing on interpreting log data for meaningful insights.
  • Thoroughly tested every coding component under a variety of conditions to ensure functionality and dependability.
  • Increased the project's analytical capacity by using Scala and Apache Spark to manage big datasets efficiently.
  • Contributed significantly to the project's success by ensuring high standards of code quality and data-analysis precision.

4. End-to-End Dataflow Pipeline Development

  • Developed an end-to-end dataflow pipeline using Azure Data Factory (ADF), significantly enhancing data integration and management processes in a Microsoft Azure environment.
  • Combined Azure Data Factory with SQL databases, Azure Databricks, and Azure Data Lake to create a cohesive and effective data processing workflow.
  • Created dynamic, insightful data visualizations in Superset that effectively informed stakeholders about data-driven insights.
  • Showcased advanced skills in managing Azure resources, playing a crucial role in the seamless development and execution of ADF pipelines.
  • Converted XML files to JSON format and used Neo4j for sophisticated data analysis and insights, making data visualization and interpretation easier.

5. SAS to BigQuery & PySpark Migration

  • Transformed complex SAS programs, including various data blocks and procedures, into BigQuery scripts on the Google Cloud Platform.
  • Prioritized accuracy and data integrity in the migration process, carefully translating the SAS programs into corresponding BigQuery code and testing them with mock data to guarantee fidelity.
  • Converted SAS programs into PySpark scripts, utilizing the advanced functionality of Databricks and Google Colab for efficient data processing and analytics.
  • Developed, validated, and reviewed both the newly converted PySpark scripts and the original SAS scripts, guaranteeing a smooth and error-free migration.
  • Played a pivotal role in the end-to-end migration process, from initial translation to final implementation, contributing significantly to the enhancement of the data processing framework within the Google Cloud environment.

6. Xylinx

  • Engaged in the development of a Machine Learning-based solution for detecting fraudulent transactions in credit card operations for banking institutions, using advanced data analytics techniques.
  • Played a key role in leveraging TigerGraph, a graph database, for data management. Responsible for designing an effective graph schema and meticulously mapping data attributes to this schema, ensuring data integrity and relevance.
  • Actively involved in the implementation of the Louvain algorithm within TigerGraph. Collaborated with the team to create communities, a critical step in the process of identifying and analyzing patterns indicative of fraudulent transactions.
  • Contributed to the project's success by combining expertise in graph databases with machine learning algorithms, leading to more efficient and accurate fraud detection mechanisms.

Education

Master of Science - Data Science and Analytics

Florida Atlantic University
Boca Raton, FL
05.2024

Skills

  • Data Science
  • Deep Learning
  • Machine Learning
  • Reinforcement Learning
  • Python
  • SQL
  • Tableau
  • Azure Data Factory
  • Azure Cosmos DB
  • Neo4j
  • Front-End Development
  • Power BI

Certification

  • DP-900: Microsoft Azure Data Fundamentals
  • AZ-900: Microsoft Azure Fundamentals
  • AWS Cloud Practitioner Essentials
  • Neo4j Certified Professional

Awards

  • Won 3rd place in the Insights and Data Gems Olympiad at Capgemini.
  • Earned a Diamond Star and Certificate of Recognition in Sectorthon 2020, organized by Capgemini.
