Tharun Kumar Aduri

Lake Dallas, TX

Summary

With over 3 years of experience in Information Technology, I possess a robust background in Business Intelligence and Data Science.

My expertise includes:

BI Skills: Proficient in Power BI, Power Query, DAX, Tableau, Data Visualization, and Reporting.

Data Science: Skilled in Python, Pandas, R, and Natural Language Processing (NLP).

Data Warehousing & ETL: Experienced with Informatica Power Center 10.x/9.x, Oracle 11G, Autosys, HP ALM, JIRA, and IRIS. Proficient in the ETL process, data extraction, transformation, and loading, with a solid understanding of data warehousing concepts and data modeling principles, including Star Schema, Snowflake, SCD Types, and normalization/denormalization.

Database Integration: Adept at integrating data from various relational databases like Oracle and SQL Server, and from flat files (fixed width and delimited).

Informatica Expertise: Well-versed in using Informatica Designer Components such as Source Analyzer, Transformation Developer, Mapplet, and Mapping Designer. Extensive experience in creating complex mappings with transformations like Source Qualifiers, Expressions, Filters, Joiners, Routers, Union, Lookups (connected/unconnected), Aggregators, and Normalizers. Proficient in developing sessions/tasks and workflows using Workflow Manager Tools, and utilizing Informatica command line utilities (pmcmd) for executing workflows in non-Windows environments.
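
As a rough illustration of the pmcmd-based workflow execution mentioned above, the Python sketch below wraps a startworkflow call; every name in it (integration service, domain, user, folder, workflow) is a placeholder rather than a value from any real environment, and exact options can vary by PowerCenter version and site setup.

    import subprocess

    # Minimal sketch of launching a PowerCenter workflow from a non-Windows host.
    # All names below are placeholders, not values from any real environment.
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "IS_EXAMPLE",      # integration service (placeholder)
        "-d", "Domain_Example",   # PowerCenter domain (placeholder)
        "-u", "etl_user",         # repository user (placeholder)
        "-p", "********",         # password (placeholder)
        "-f", "EXAMPLE_FOLDER",   # repository folder (placeholder)
        "-wait",                  # block until the workflow completes
        "wf_example_load",        # workflow name (placeholder)
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.returncode)
    print(result.stdout)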

Debugging & Documentation: Strong capabilities in debugging mappings, identifying bugs by analyzing data flows, and understanding session logs. Skilled in creating technical documentation, including design documents, mapping documents, and unit test documents.

Soft Skills: Excellent analytical and problem-solving skills, with strong communication abilities. A motivated team player who quickly grasps new concepts and technologies.

My comprehensive technical, oral, and written communication skills enable me to solve client issues effectively and contribute to team success. My background includes data mining, data warehousing, and analytics, and I am proficient in machine learning and deep learning. I am a quality-driven, hardworking ETL developer with strong communication and project management skills, experienced in helping companies through diverse transitions, including sensitive data and large-scale big data installations. I promote extensive simulation and testing to ensure smooth ETL execution and am known for delivering quick, effective tools that automate and optimize database management tasks.

Overview

4 years of professional experience

Work History

Data Analyst

Assurant
Crewe, United Kingdom
06.2022 - 09.2022
  • Modified existing code to improve performance and minimize load time when importing data from MySQL servers
  • Created adaptable visualizations from the processed data to meet business needs
  • Gathered and compiled relevant data from various sources, such as databases, spreadsheets, and APIs
  • Ensured data accuracy and completeness by performing data validation and cleaning tasks (a brief sketch follows this list)
  • Assisted in conducting data analysis to identify trends, patterns, and insights
  • Performed exploratory data analysis (EDA) to understand the data's characteristics
  • Created visualizations, charts, and graphs to communicate findings effectively
  • Used data visualization tools such as Tableau, Power BI, and Python libraries (e.g., Matplotlib, Seaborn) to create informative dashboards and reports
  • Designed visually appealing and insightful data visualizations
  • Ensured compliance with data privacy regulations (e.g., GDPR, HIPAA) when handling sensitive data
  • Maintained data security and confidentiality
  • Involved in the design, development, and testing of scripts in all environments (DEV, UAT, SIT, and PROD)
  • Collaborated with cross-functional teams, including data engineers, data scientists, and business stakeholders
  • Communicated effectively and shared insights with team members.
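
A minimal sketch of the validation, cleaning, and visualization steps above, assuming a hypothetical claims extract (claims.csv) with claim_id, claim_date, and status columns; none of these names come from the actual engagement.

    import pandas as pd
    import seaborn as sns
    import matplotlib.pyplot as plt

    # Hypothetical extract; in practice the data came from MySQL servers,
    # spreadsheets, or APIs.
    df = pd.read_csv("claims.csv", parse_dates=["claim_date"])

    # Validation: completeness and duplicate checks.
    print("Missing values per column:\n", df.isna().sum())
    print("Duplicate claim_id rows:", df.duplicated(subset=["claim_id"]).sum())

    # Cleaning: drop duplicates and standardize a categorical column.
    df = df.drop_duplicates(subset=["claim_id"])
    df["status"] = df["status"].str.strip().str.upper()

    # Quick EDA visualization: monthly record volume.
    monthly = df.set_index("claim_date").resample("MS")["claim_id"].count()
    sns.lineplot(x=monthly.index, y=monthly.values)
    plt.title("Monthly record volume")
    plt.tight_layout()
    plt.savefig("monthly_volume.png")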

ETL Developer

Legato Healthcare Technologies
Hyderabad, India
10.2020 - 01.2022

Domain: Insurance

  • Client: Anthem, a US medical insurance giant that currently ranks 3rd by market share in the USA
  • Anthem operates across the USA
  • Its operations include treatment, diagnosis, and pharmacy services for its customers
  • Responsibilities:
  • Interacted with customer business users to understand business requirements
  • Reviewed project requests describing database user needs to estimate time and cost required to accomplish projects.
  • Increased data accuracy by implementing thorough validation checks and error-handling mechanisms.
  • Reduced processing time for ETL tasks by implementing parallel execution strategies.
  • Conducted regular code reviews to ensure adherence to coding standards and maintain overall code quality across projects.
  • Developed and delivered business information solutions.
  • Developed and modified Informatica Mappings, Mapplets, Workflows, and Worklets
  • Extracted data from different sources such as Oracle and flat files
  • Created connected and unconnected Lookup transformations to look up data from the source and target tables
  • Worked on application decommissioning activities
  • Involved in designing Slowly Changing Dimension (SCD) Types 1 and 2 (a conceptual sketch follows this list)
  • Extensively used ETL to load data from Oracle and flat files into the data warehouse
  • Performed Unit testing and Integration testing of Informatica mappings
  • Implemented various Performance Tuning techniques
  • Developed custom reports for business stakeholders, providing valuable insights into key performance metrics.
  • Deployed scalable solutions capable of handling large volumes of data while maintaining high standards of reliability and performance.
  • Streamlined data flow between multiple systems by creating efficient ETL pipelines and APIs.
  • Designed integration tools to combine data from multiple, varied data sources such as RDBMS, SQL, and big data installations.
  • Enhanced ETL processes by optimizing complex SQL queries and streamlining data extraction procedures.
  • Collaborated with business intelligence staff at customer facilities to produce customized ETL solutions for specific goals.
  • Optimized broken and inefficient assets through effectively troubleshooting technical processes and workflows.
  • Managed data quality issues during ETL processes, directing qualitative failures to team lead for amelioration.
  • Automated repetitive tasks using Python scripts, increasing productivity and reducing manual intervention.
  • Documented technical specifications and designs, facilitating knowledge sharing among team members and supporting future development efforts.
  • Created data visualizations based on the analysis using Power BI
  • Environment: Informatica PowerCenter 10.x, SQL Developer, JIRA, UNIX, Shell scripts, Agile, Power BI
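
The SCD Type 2 work noted above was implemented in Informatica mappings; purely as a conceptual sketch, the pandas snippet below shows the expire-and-insert pattern using made-up member and plan columns.

    import pandas as pd

    # Illustrative dimension and incoming change; column names are made up.
    dim = pd.DataFrame({
        "member_id": [101],
        "plan": ["GOLD"],
        "eff_date": [pd.Timestamp("2020-01-01")],
        "end_date": [pd.NaT],
        "is_current": [True],
    })
    incoming = pd.DataFrame({"member_id": [101], "plan": ["SILVER"]})

    today = pd.Timestamp.today().normalize()

    # Find members whose attributes changed versus the current dimension row.
    merged = incoming.merge(dim[dim["is_current"]], on="member_id",
                            how="left", suffixes=("", "_old"))
    changed = merged[merged["plan"] != merged["plan_old"]]

    # Type 2: expire the old row, then insert a new current row.
    expire = dim["member_id"].isin(changed["member_id"]) & dim["is_current"]
    dim.loc[expire, "end_date"] = today
    dim.loc[expire, "is_current"] = False
    new_rows = changed[["member_id", "plan"]].assign(
        eff_date=today, end_date=pd.NaT, is_current=True)
    dim = pd.concat([dim, new_rows], ignore_index=True)
    print(dim)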

ETL Developer

Steric Infotech
Hyderabad, India
10.2019 - 10.2020
  • Project: Banking
    This application mainly targets the ICF 7.4 to TDP integration, i.e., checking whether the new source systems implemented in ICF 7.4 handle and serve all the business logic properly, and whether the data populated in the EDW target has the correct data types coming from the new ICF source systems (views). It also covers downstream outputs such as the Tax, Liquidity, and Cash-on-Hand reports; the Sigma feed will no longer be needed
    Responsibilities:
    Gathered the suite of business requirements and prepared Source-to-Target Mapping specifications and transformation rules
    Created Source to Target Mapping Specification Document
    Involved in system study, analyzing the requirements by meeting the client and designing the system
    Developed mappings, reusable objects, transformations, and mapplets using the Mapping Designer, Transformation Developer, and Mapplet Designer
    Extracted data from different sources such as Oracle and flat files
    Designed and developed complex aggregate, join, and look-up transformation rules (business rules) to generate consolidated (fact/summary) data identified by dimensions using the Informatica ETL tool
    Used the Update Strategy Transformation to update the Target Dimension tables
    Created connected and unconnected Lookup transformations to look up the data from the source and target tables
    Involved in Performance tuning for sources, targets, mappings, sessions, and server
    Used PL/SQL and UNIX Shell Scripts for scheduling the sessions in Informatica
    Wrote SQL, PL/SQL for implementing business rules and transformations
    Developed batch file to automate the task of executing the different workflows and sessions associated with the mappings on the development server
    The PeopleSoft Application Engine was used to load the data marts
    Created test cases and completed unit, integration, and system tests for the Data warehouse
    Joined tables originating from Oracle
    Wrote test cases and test conditions for various derivatives and subject areas
    Actively Participated in Team meetings and discussions to propose solutions to the problems
    Prepared the QA Test Plan, test cases, and QA sign-off documents
    Prepared test cases and test plans in HP ALM
    Validated the DB data in ICF 7.4 as part of the QA team
    Analyzed the business requirements according to DMS logic
    Analyzed the users' tasks and developed a model of the functions and the flow of work between the tasks
    Validated the end-to-end flow from source table columns to target table columns, covering business requirements, hard-coded values, and straight copies (a validation sketch follows this list)
    Validated the workflow dependencies
    Validated the trigger file functionality
    Validated the Command task and Event Wait task
    Verified that the logic used to pull cash pool accounts is BANKACCOUNTS.BANKACCTTYPEID='INT'
    Monitored project progress, identified risks, and took corrective action as needed.
    Verified the quality of deliverables and conformed to specifications before submitting them to clients.
    Produced quality standards, checklists, report templates, and processes.
    Negotiated contracts with vendors to secure favorable terms while maintaining project cost efficiency.
    Coordinated resources effectively, optimizing workload distribution among team members.
    Implemented change management processes, addressing scope changes and their impact on project goals.
    Maintained tactical control of project budgets and timelines to keep teams on task and achieve schedule targets.
    Analyzed financial reporting systems and project schedules to address potential problems proactively.
    Provided client and team members with clear communication of project updates and progress.
    Conducted thorough risk assessments to identify potential issues and develop mitigation strategies.
    Enhanced project efficiency by streamlining processes and implementing time-saving strategies.
    Facilitated workshops to collect project requirements and user feedback.
    Scheduled and facilitated meetings between project stakeholders to discuss deliverables, schedules, and conflicts.
    Adapted quickly to unexpected challenges during project execution by developing innovative solutions that ensured objectives were met.
    Planned, executed, and controlled assigned projects, ensuring work complied with contractual requirements.
    Facilitated regular meetings with stakeholders to review progress, address concerns, and adjust plans as needed for success.
    Assisted in developing project budgets and closely monitoring expenses throughout the project lifecycle.
    Elevated client satisfaction levels through consistently delivering high-quality results on time and within budget parameters.
    Tracked project and team member performance closely to intervene in mistakes or delays quickly.
    Updated customers and senior leaders on progress and roadblocks.
    Recruited, hired, and supervised resources for the staff project team.
    Set up and managed team meetings.
    Reported regularly to managers on project budget, progress, and technical problems.
    Ensured all documentation was accurate, up-to-date, and adhered to company standards throughout each project phase.
    Developed comprehensive project plans, outlining key milestones and deliverables for stakeholders.
    Analyzed competitors' approaches to gain insight into industry trends, enhancing overall performance in subsequent projects.
    Managed time efficiently to complete all tasks within deadlines.
    Assisted with day-to-day operations, working efficiently and productively with all team members.
    Worked effectively in fast-paced environments.
    Tools Used: Informatica 9.5.1, PL/SQL, Oracle 11g, HP ALM, UNIX, Shell scripting.
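
A hedged sketch of the source-to-target validation described in this project: the BANKACCOUNTS.BANKACCTTYPEID='INT' rule is quoted from the requirement above, while the connection details, the BANKACCOUNTID key column, and the EDW_CASH_POOL_ACCOUNTS target table are hypothetical stand-ins.

    import pandas as pd
    import oracledb  # assumption: the python-oracledb driver; any DB-API connection works

    # Hypothetical connection details.
    con = oracledb.connect(user="qa_user", password="********", dsn="dbhost/ICF74")

    # Source-side rule quoted from the requirement: cash pool accounts only.
    src_sql = """
        SELECT BANKACCOUNTID, BANKACCTTYPEID
        FROM BANKACCOUNTS
        WHERE BANKACCTTYPEID = 'INT'
    """
    # Hypothetical EDW target table loaded by the ETL.
    tgt_sql = "SELECT BANKACCOUNTID, BANKACCTTYPEID FROM EDW_CASH_POOL_ACCOUNTS"

    src = pd.read_sql(src_sql, con)
    tgt = pd.read_sql(tgt_sql, con)

    # Straight-copy check: the same keys on both sides, no gaps or extras.
    print("Row counts (source vs target):", len(src), len(tgt))
    print("Missing in target:", set(src["BANKACCOUNTID"]) - set(tgt["BANKACCOUNTID"]))
    print("Unexpected in target:", set(tgt["BANKACCOUNTID"]) - set(src["BANKACCOUNTID"]))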

Associate Projects

CTS
05.2018 - 10.2019
  • The client provides Video on Demand (VOD) services to its clients
  • The VOD services statistics are then collected and stored within various source systems
  • The purpose of the Project is to integrate all statistics into one holistic view so that the business may utilize the outputs to perform reporting and analyze the data
  • This Project will also include a reconciliation process with various validation rules within the ETL process so that corrupt data can be pushed out to unreconciled tables
  • This data will be reconciled further; Solaris is delivered to consumers via the Maestro/COMCAST platform
  • VST requires data for settlement purposes
  • TVOD (transactional VOD) data (Events, Credits, or Reversals) covers the library of videos for which subscribers pay to rent
  • This record will be created using Maestro, as this rental is eligible for rating and billing
  • Exadata receives Asset information from Hadoop
  • It is agreed between the Exadata and Hadoop teams that the latter will send separate rows for each asset type (Movie, Poster, Preview, Title)
  • Exadata is responsible for consolidating all column values for each Package Asset ID (a conceptual sketch follows this list)
  • The Asset data would go through the reconciliation process before updating the Legacy Tables
  • Responsibilities:
  • Responsible for gathering the suite of business requirements and preparing Source-to-Target Mapping specifications and transformation rules
  • Created Source to Target Mapping Specification Document
  • Involved in system study, analyzing the requirements by meeting the client and designing the system
  • Developed mappings, reusable objects, transformations, and mapplets using the Mapping Designer, Transformation Developer, and Mapplet Designer
  • Extracted data from different sources such as Oracle and flat files
  • Designed and developed complex aggregate, join, and look-up transformation rules (business rules) to generate consolidated (fact/summary) data identified by dimensions using the Informatica ETL tool
  • Used the Update Strategy Transformation to update the Target Dimension tables
  • Created connected and unconnected Lookup transformations to look up the data from the source and target tables
  • Involved in Performance tuning for sources, targets, mappings, sessions, and server
  • Developed batch file to automate the task of executing the different workflows and sessions associated with the mappings on the development server
  • The PeopleSoft Application Engine was used to load the data marts
  • Created test cases and completed unit, integration, and system tests for the Data warehouse
  • Joined tables originating from Oracle
  • Wrote test cases and test conditions for various derivatives and subject areas
  • Actively Participated in Team meetings and discussions to propose solutions to the problems
  • Environment: Informatica PowerCenter 9.6, SQL Developer, UNIX, JIRA, Shell scripting
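
A conceptual sketch of the asset consolidation described in this project, collapsing the one-row-per-asset-type feed into one row per Package Asset ID; the column names and sample values are illustrative, and the production logic ran in Informatica on Exadata.

    import pandas as pd

    # Illustrative feed: Hadoop sends separate rows per asset type per package.
    feed = pd.DataFrame({
        "package_asset_id": [1001, 1001, 1001, 1001],
        "asset_type": ["Movie", "Poster", "Preview", "Title"],
        "asset_value": ["movie.mpg", "poster.jpg", "preview.mpg", "Example Film"],
    })

    # Consolidate to one row per Package Asset ID, one column per asset type.
    consolidated = (
        feed.pivot_table(index="package_asset_id",
                         columns="asset_type",
                         values="asset_value",
                         aggfunc="first")
            .reset_index()
    )
    print(consolidated)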

Education

B.Tech - ECE Branch

J N T University
Hyderabad
12.2017

Master of Science - Data Science

Cardiff Metropolitan University
Cardiff, UK
05.2023

Master of Science - Information Technology Project Management

Indiana Wesleyan University
Marion, IN
08.2024

Skills

  • Programming Languages: SQL, PL/SQL
  • Operating Systems: Windows
  • Database Systems: Oracle 10g/11g
  • ETL Tool: Informatica 9.x/10.x
  • Metadata Management
  • Data Security
  • Data Migration
  • Business Intelligence
  • Change Data Capture
  • Professional Demeanor
  • Reliability
  • Real-time Processing
  • NoSQL Databases
  • Data Cleansing
  • Scripting Languages: UNIX/SHELL
  • RDBMS
  • Data Analytics
  • Production support
  • ETL Modeling
  • API Integration
  • Data Validation
  • Machine Learning
  • SQL Programming

Timeline

Data Analyst

Assurant
06.2022 - 09.2022

ETL Developer

Legato Healthcare Technologies
10.2020 - 01.2022

ETL Developer

Steric Infotech
10.2019 - 10.2020

Associate Projects

CTS
05.2018 - 10.2019

B.Tech - ECE Branch

J N T University

Master of Science - Data Science

Cardiff Metropolitan University

Master of Science - Information Technology Project Management

Indiana Wesleyan University