
Dr. Supreeth Mysore

Summary

Highly skilled and experienced AI/ML Engineer with a Ph.D. in Computer Science and over 8 years of experience, seeking a position that requires expertise in C++ and Python programming, parallel programming, and strong knowledge of computer architecture and operating systems.

RESEARCH INTERESTS: Hardware modeling, Bayesian statistics, PPA optimization, data analysis, machine learning, computer architecture, and linear algebra.

PROFESSIONAL SKILLS: Highly strategic and inquisitive Senior AI/ML Engineer and Data Scientist with a Ph.D. in Computer Science from the University of Kentucky, with broad experience in data-centered projects across various industries combined with hardware architecture. Demonstrated ability to leverage advanced statistical techniques to obtain actionable insights from data. Proficient in Python, PySpark, Hive datasets, SQL, NoSQL, and machine learning principles, with experience in cloud environments such as AWS (including SageMaker). Strong C/C++/Python programming skills, with performance tuning and debugging experience. Excellent 2D and 3D computer vision and machine learning skills. Strong oral and written communication; detail-oriented team player with strong organizational skills and the ability to handle multiple projects simultaneously with a high degree of accuracy.

Overview

8 years of professional experience
1 certification

Work History

Computer Vision Engineer

SAMSUNG TECHNOLOGY, SARL
06.2019 - Current
  • Worked on performance optimization of the graphics and AI accelerator; key highlights below:
  • Developed novel real-time 3D scene reconstruction techniques and delivered image-based rendering systems
  • Collated and cleaned data from various entities, ensuring data quality and accuracy
  • Selected and employed advanced statistical procedures, such as supervised and unsupervised machine learning algorithms, to obtain actionable insights
  • Developed DSP, video-processing, and image-processing software on FPGAs using MATLAB simulation, C, and C++
  • Collaborated with hardware design engineers on FPGAs and other hardware platforms
  • Took algorithms, developed software, and ported them into real-time operation on embedded hardware
  • Worked with mechanical engineers with dynamic modeling skills on sensor integration into the automotive ecosystem
  • Worked with optoelectronics packaging engineers on design, modeling, transfer to manufacturing, and materials
  • High proficiency in C++
  • Competence with computer vision libraries such as OpenCV, MATLAB, and PCL
  • Presented talks on specific projects, problems currently under consideration, the importance of projects, and breakdowns of outcomes versus objectives
  • Distributed AI, deep learning, and computer vision solutions
  • Designed, implemented, and evaluated new models and rapid software prototypes to solve problems in machine learning and systems engineering
  • Researched, designed, and implemented machine learning applications to solve business problems affecting 700 users
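As a minimal illustration of the supervised-learning work described in this role (an exposition sketch, not project code; the data and function names are hypothetical), fitting a simple linear regression by closed-form least squares:

```python
# Illustrative sketch: closed-form least-squares fit of y = a*x + b.
def fit_line(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]   # exactly y = 2x + 1
a, b = fit_line(xs, ys)
print(round(a, 6), round(b, 6))  # -> 2.0 1.0
```

On noisy real data the same formula gives the least-squares best fit rather than an exact recovery.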

COMPUTER VISION ENGINEER / COMPUTER ENGINEER / R&D ENGINEER

UNIVERSITY OF KENTUCKY
08.2020 - 01.2022
  • ECE, UKY
  • Worked on optimizing computer architecture to improve the PPA of machine learning applications
  • Research interests: computer architecture, ML hardware accelerators, system performance optimization, non-volatile memory, and stochastic computing
  • Extended research to RNN and ANN accelerator optimization using stochastic computing with NVM memories
  • Modified ISA instructions to accommodate add-on logic of the proposed accelerator in C++ and Python 3
  • Tools used: PT-PX, VCS, CACTI, and PyCharm, with the Keras, NumPy, Matplotlib, SciPy, pandas, and TensorFlow libraries
  • Implemented the optimized ISA in a traditional architecture, which resulted in 400X higher FPS on the MNIST dataset compared to a state-of-the-art CNN algorithm
  • Further, achieved 150X faster image processing on the ImageNet dataset with a 70% reduction in energy consumption
  • Applied supervised and unsupervised machine learning algorithms, including Linear Regression, Logistic Regression, K-Nearest Neighbors, Naïve Bayes, Decision Trees, SVM, MLP, PCA, and K-means, to obtain actionable insights
  • Conducted exploratory data analysis, including hypothesis testing and feature engineering, to uncover hidden patterns and relationships in data
  • Hands-on experience with object detection, tracking, and optical flow
  • Vast experience solving analytical problems using quantitative approaches
  • Comfortable manipulating and analyzing complex, high-volume, high-dimensionality data from varied sources
  • Algorithm design, creative problem solving, and performance optimization
  • Knowledgeable in computational geometry, linear algebra, and statistics
  • Effectively used tools such as OpenCV, Eigen, and TensorFlow
  • Conducted feasibility studies for proposed projects using potential ROI and risk management
  • Presented talks on specific projects, problems currently under consideration, the importance of projects, and breakdowns of outcomes versus objectives
  • Distributed AI, deep learning, and computer vision solutions
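The stochastic-computing research above rests on encoding values as random bitstreams; as a minimal illustrative sketch (an exposition aid, not the accelerator's actual design), unipolar stochastic multiplication reduces to a bitwise AND of two streams:

```python
import random

# Unipolar stochastic arithmetic sketch: a value in [0, 1] is encoded
# as the probability of a 1 in a random bitstream; multiplying two
# values is then just ANDing their streams bit by bit.
def to_stream(p, n, rng):
    return [1 if rng.random() < p else 0 for _ in range(n)]

def from_stream(bits):
    return sum(bits) / len(bits)

rng = random.Random(42)          # seeded for reproducibility
n = 20000                        # longer streams -> lower variance
a = to_stream(0.4, n, rng)
b = to_stream(0.3, n, rng)
product = [x & y for x, y in zip(a, b)]
est = from_stream(product)       # estimates 0.4 * 0.3 = 0.12
print(abs(est - 0.12) < 0.02)    # -> True
```

The appeal for hardware is that the multiplier shrinks to a single AND gate per bit, trading silicon area for stream length and statistical precision.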

RESEARCH INTERIM ENGINEER, COMPUTER VISION TEAM

QUALCOMM TECHNOLOGY PVT LTD
05.2020 - 08.2020
  • JOB ROLE (performance evaluation for computer architecture in a limits-management system)
  • During this internship, completed the full mapping and flow of converting generic C++ and Python programs into the MATLAB Simulink environment with ML techniques
  • Leveraged the Simulink environment for the existing computer architecture and S-model parameters
  • This made it possible to investigate the internal connectivity of the blocks and the system performance
  • Built a complete block-level MATLAB flow and integrated it into the chip-top design; follow-up work includes system evaluation and scenario testing to evaluate trade-offs against silicon chip measurements
  • Designed power-management IP with subcomponents such as bandgap, current references, voltage detectors, SoC biasing regulator, and voltage regulators
  • Working knowledge and recent experience with modern software tools, testing frameworks, and best practices
  • Experience designing scalable software, debugging, and performance-tuning applications
  • Developed creative computer vision software for a variety of Oculus products
  • Developed novel real-time 3D scene reconstruction techniques and delivered image-based rendering systems
  • Implemented GPU-optimized high-performance algorithms
  • Background in 3D reconstruction and image-based rendering

RESEARCH ASSISTANT

UNIVERSITY OF KENTUCKY
08.2019 - 08.2020
  • R&D Computer Architecture Team, UKY
  • Worked in domains such as processing-in-memory computing architecture, stochastic-arithmetic-based computer architecture, high performance computing, and emerging non-volatile devices
  • Tools used: gem5, NVMain, NVSim, PIMSim, PIN tool (Intel), and Cadence Virtuoso

Research Interim Engineer

QUALCOMM TECHNOLOGY PVT LTD
05.2019 - 08.2019
  • R&D wireless team, in collaboration with UvA (machine learning)
  • JOB ROLE (low-power modeling in C, machine learning)
  • Worked on modeling of the wireless protocol on the RXPCU and data-packet handling
  • Had the opportunity to collaborate with Prof. Max Welling from the University of Amsterdam as well as a Qualcomm VP of Technology
  • Implemented spiking neural networks on MNIST (temporal data), backpropagation, and basic CNNs
  • During this internship, completed many courses from online sources such as Udacity, Coursera, and deeplearning.ai
  • PyTorch and Python 3 practice and algorithm implementation
  • Completed a data analysis course from the University of Michigan
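The backpropagation work mentioned above can be sketched minimally (an illustrative toy, not the internship code): a single sigmoid neuron trained by gradient descent to learn logical AND.

```python
import math

# Single sigmoid neuron trained with backpropagation on logical AND.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [0.0, 0.0]
bias = 0.0
lr = 1.0
for _ in range(5000):
    for (x1, x2), t in data:
        y = sigmoid(w[0] * x1 + w[1] * x2 + bias)
        grad = y - t                  # d(cross-entropy)/dz for sigmoid
        w[0] -= lr * grad * x1        # backpropagate through each weight
        w[1] -= lr * grad * x2
        bias -= lr * grad

preds = [round(sigmoid(w[0] * x1 + w[1] * x2 + bias))
         for (x1, x2), _ in data]
print(preds)  # -> [0, 0, 0, 1]
```

A spiking network replaces this continuous activation with timed binary spikes, but the gradient-driven weight-update loop is the shared core idea.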

TEACHING ASSISTANT

University of Kentucky
08.2018 - 05.2019
  • R&D Computer Architecture Team,UKY
  • Worked as a supporting instructor for a graduate lab, assisting classes related to machine learning and computer architecture for graduate students

CPU Design Engineer

INTEL CORPORATION PVT. LTD
06.2015 - 07.2018
  • R&D, Power and Library Methodology Group
  • JOB ROLE (Power Optimization, PPA, and Power Integrity): standard-cell library and AMS (analog and mixed-signal) design characterization across PVT corners with SPICE analysis at 10 nm and 16 nm technology nodes
  • Drain-current characterization and tabulation of dynamic and leakage power computation
  • RTL power optimization using the RedHawk/PowerArtist methodology
  • Package interconnect reduction techniques and optimization
  • Chip-level package analysis with transient and impedance analysis using the Ansys RedHawk and Cadence Allegro package analysis tools for PCB design
  • Power mesh creation and thermal-aware flow methodology
  • Instantaneous Voltage Drop (IVD) and reliability checks (EM and SI)
  • Full-chip power estimation and leakage optimization techniques
  • Power Delivery Network analysis with a chip-package model using SPICE
  • Automation and maintenance of the power IVD flow using TCL, Perl, and CGI scripting
  • Analog custom layout and DRC/LVS/thermal hotspot fixes
  • Conducted engineering studies on product designs, associated subsystem components, and structures
  • Skilled at working independently and collaboratively in a team environment
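The dynamic and leakage characterization above follows the standard CMOS power model (a textbook relation stated here for context, not an Intel-specific formula):

```latex
P_{\text{total}} =
\underbrace{\alpha\, C_{\text{eff}}\, V_{DD}^{2}\, f}_{\text{dynamic (switching)}}
+ \underbrace{V_{DD}\, I_{\text{leak}}}_{\text{static (leakage)}}
```

Here \(\alpha\) is the switching activity factor, \(C_{\text{eff}}\) the effective switched capacitance, \(V_{DD}\) the supply voltage, \(f\) the clock frequency, and \(I_{\text{leak}}\) the aggregate leakage current; the quadratic dependence on \(V_{DD}\) is why voltage scaling dominates power optimization.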

Education

Ph.D. - Computer Science

University of Kentucky
12.2021

Master of Science - Computer Engineering

Vellore Institute of Technology
India
01.2015

Bachelor of Engineering (B.E.) - E.C.E.

V.I.T University, V.T.U
2012

Skills

TECHNICAL SKILLS

  • Machine Learning/Deep Learning: PyTorch, Jupyter, PyCharm, Anaconda, TensorFlow, OpenCV, Rust
  • VLSI Tools: VCS, Cadence Spectre, Cadence Allegro, Cadence Encounter, TCAD, Tanner EDA, Cadence (AMS Liberate, Virtuoso, Encounter, Conformal-LP), Xilinx ISE, Synopsys (VCS, Verdi, Design Compiler, Formality, ICC 1, ICC 2, PT-PX), Atrenta (SpyGlass-LP), Ansys (RedHawk, PowerArtist)
  • Hardware Description Languages: Verilog, VHDL
  • Programming Languages: C, C++, HTML
  • Assembly-Level: 8051 microcontroller, 8085 microprocessor
  • Simulation Tools: SPICE, gem5, NVSim, NVMain, Sniper, VCS, SpyGlass
  • Scripting and Computing Languages: Python, MATLAB, TCL, Perl, CGI scripting, SKILL, R (data mining)
  • Computer Vision
  • Computer Architecture

Accomplishments

  • “AGNI: Iso-latency, in-memory DRAM-based StoB generator for in-deep learning,” ISQED 2023 (Best Paper nominated finalist)
  • “ATRIA: A Bit-Parallel Stochastic Arithmetic Based Accelerator for In-DRAM CNN Processing,” ISVLSI 2021 (Best Paper nominated finalist)
  • “ODIN: A Bit-Parallel Stochastic Arithmetic Based Accelerator for In-Situ Neural Network Processing in Phase Change RAM,” arXiv 2021
  • “A Scalable Stochastic Number Generator for Phase Change Memory based In-Memory Stochastic Processing,” IEEE ESWEEK 2019, NY
  • Presented a conference paper at the Synopsys Users Group conference (SNUG 2017): “At-Speed Noise Fixing with Next Generation Tools,” June 12, 2017
  • Presented a conference paper at the Ansys conference (AESE 2017): “Faster PDN Convergence Methodology for Large Designs,” August 9, 2017
  • ACADEMIC Ph.D. PROJECTS
  • Stochastic in-memory compute architecture for deep learning applications (Jan '19 - present)
  • Low-power adiabatic high-speed hardware cryptographic circuit (Aug '18 - Jan '19)
  • ACADEMIC P.G. PROJECTS
  • Power-aware design methodology front-end validation tool kit (Oct '14 - Jun '15)
  • Sense-amplified gated differential adiabatic clock distribution (Jun '14 - Oct '14)
  • Self-gated resonant energy recovery flip-flop (Jan '14 - May '14)
  • Low-power energy recovery clock distribution and generation (Jul '13 - Jan '14)
  • MINI P.G. PROJECTS
  • TCAD 3D-CMOS simulation and analysis at the 45 nm node (GF) (Jun '13 - Aug '13)
  • Tunnel FET drain-current characterization and simulation (Jun '14 - Aug '14)
  • 16-bit CISC processor architecture creation from RTL to GDSII validation using Cadence CAD tools (Jan '14 - Mar '14)
  • U.G. PROJECT
  • ARM-based mobile as a medical aid using Android (Aug '11 - Apr '12)
  • MINI U.G. PROJECT
  • C8051-based wireless messaging via mobile handset with LCD annunciation system (Feb '09 - Apr '09)
  • Python learning and mathematical modelling (2021)
  • Data Analysis (Coursera, 2021)
  • Udacity: Deep Learning for AI (nanodegree, 2019); AI Algorithms Using Python (nanodegree, 2019)
  • Coursera: Machine Learning (Stanford University); Deep Learning concepts (deeplearning.ai); Bayesian Statistics (University of Santa Cruz); Data Science (University of Michigan)
  • Cleanroom workshop and basic concepts at IIT, Chennai, India
  • edX online courses: Nanotechnology; Graphene Analysis
  • Coursera online courses: VLSI CAD: Logic to Layout (University of Illinois); Learning How to Learn: Powerful Mental Tools to Help You Master Tough Subjects (University of California); Python Interactive Scripting (University of Michigan); R Language for Data Mining (courtesy of IBM)
  • Alison online course: Interactive Perl Lectures
  • Verification Academy (courtesy of Mentor Graphics): Power-Aware Verification; SoC Metrics; Basic OVM Concepts; SystemVerilog Assertions; Testbench Accelerator
  • Udemy Academy: SoC Verification Using SystemVerilog Assertions, with in-depth study of SystemVerilog assertions and coverage; Testbench Writing for OVM/UVM from Scratch; UPF 2.0 (courtesy of Accellera.org)

Certification

Intel Technology Corporation patent disclosure filed on “Power-aware IR hot spot in the early stage of the design for large designs.” Published the IEEE paper “Sense Amplified Differential Gated Flip-flop for Power Saving & Signal Integrity” in the International Journal of Research Science and Management, ISSN: 2349-5197, dated September 27, 2014. Published the paper “Self-Gated Resonant-Clocked Flip-Flop Optimized for Power Efficiency and Signal Integrity” in IET Circuits, Devices & Systems, CDS-2014-0282, dated September 1, 2014. Published the paper “Low Power Energy Recovery Clock Distribution” at the International Conference on Communication and Signal Processing (ICCSP '14), IEEE Xplore, paper ID 891, dated April 4, 2014. https://sites.google.com/view/supreethms3/home

Timeline

COMPUTER VISION ENGINEER / COMPUTER ENGINEER / R&D ENGINEER

UNIVERSITY OF KENTUCKY
08.2020 - 01.2022

RESEARCH INTERIM ENGINEER, COMPUTER VISION TEAM

QUALCOMM TECHNOLOGY PVT LTD
05.2020 - 08.2020

RESEARCH ASSISTANT

UNIVERSITY OF KENTUCKY
08.2019 - 08.2020

Computer Vision Engineer

SAMSUNG TECHNOLOGY, SARL
06.2019 - Current

Research Interim Engineer

QUALCOMM TECHNOLOGY PVT LTD
05.2019 - 08.2019

TEACHING ASSISTANT

University of Kentucky
08.2018 - 05.2019

CPU Design Engineer

INTEL CORPORATION PVT. LTD
06.2015 - 07.2018

Ph.D. - Computer Science

University of Kentucky

Master of Science - Computer Engineering

Vellore Institute of Technology

Bachelor of Engineering (B.E.) - E.C.E.

V.I.T University, V.T.U