Sankara Reddi Nallagundla

Summary

Seasoned Talend Lead and AWS Data Engineer, adept at driving significant data integration and warehouse transformations, notably at Truist Bank. Leveraged expertise in Talend (versions 5.6 to 7.1), AWS Glue, and Snowflake to enhance data processing efficiency by over 30%. Renowned for exceptional problem-solving skills and a proactive approach to innovation and technical leadership.

Overview

11 years of professional experience
1 certification

Work History

Talend Lead (Developer & Admin), AWS Data Engineer

Truist Bank, Livemindz LLC
Irving, Texas
04.2024 - Current
  • Experience in working with large data warehouses, mapping and extracting data from legacy systems (Netezza/Oracle)
  • Designed and developed a metadata-driven ETL framework in Talend, enabling dynamic data integration for multiple business processes with minimal hardcoding, enhancing scalability and reusability
  • Provisioned and configured Talend Cloud environments, including the Management Console, Data Integration, and other Talend services.
  • Managed user accounts, roles, and permissions to ensure secure and compliant access to Talend Cloud resources.
  • Created an RBAC model for project developers, admin users, support admin users, and super users.
  • Involved in Azure AD group SSO login authentication for Talend AD groups
  • Installed Talend Cloud Remote Engine executables on on-premises and cloud servers and authenticated them with Remote Engine keys
  • Implemented Remote Engine auto-repair through TMC API calls for Talend Cloud servers, using Terraform user-data scripts to manage that code
  • Designed the login pattern via Azure AD groups: developers authenticate through RSA and admins through Microsoft Authenticator
  • Created Talend role mappings via the Talend API Tester
  • Used the same identifiers to sync from Azure AD groups
  • Created Talend mapper structures (maps) to retrieve the SF HashiCorp Vault details through the tSystem component in a Talend job, and added a tHashmap component to retrieve the created map (a rough Python analogue appears after this section)
  • Created Joblets to retrieve process and column-mapping details from Snowflake tables and assigned them to context parameters, using tSystem, tHashmap, tFlowToIterate, tSetGlobalVar, tDBInput, tDBRow (Snowflake), tJobLog, tSetKeystore, and tDBConnection
  • Created a Joblet for control tables to record details at each step of the main job
  • Created a framework wrapper in shell script and triggered the main data-framework job task in TMC through the wrapper script
  • Created and updated Snowflake tables for the data movement framework
  • Involved in POC wrapper scripts, CI/CD scripts, and Protegrity API calls for the data movement framework
  • Conducted performance tuning and optimizations of the Talend metadata-driven framework, leveraging parallel processing, optimized data partitioning, and resource management to achieve an improvement in job execution time compared to the previous PowerCenter/IICS system
  • Re-engineered complex IICS data mappings into Talend's graphical job designs, ensuring the efficient handling of large datasets while maintaining data integrity
  • Conducted rigorous unit and integration testing to validate data migration from Informatica PowerCenter to Talend, ensuring consistency and accuracy between the two platforms
  • Created Talend ETL jobs with Snowflake components such as tSnowflakeConfiguration, tSnowflakeInput, tSnowflakeOutput, tSnowflakeRow, tSnowflakeConnection, and tSnowflakeClose
  • Used Snowpipe for continuous data ingestion from S3 buckets
  • Expertise in loading data into Snowflake from external sources such as JSON files
  • Expertise in creating Snowflake tables and views in Python scripts and with Talend components
  • Expertise in creating metadata tables that store queries with column names, using the dynamic schema concept to reference those columns through Talend jobs
  • Designed, developed, and scheduled ETL jobs using Talend Cloud, ensuring efficient data extraction, transformation, and loading
  • Managed cloud resources effectively, scaling up as needed to handle peak loads
  • Set up and managed connections to various data sources, including databases, cloud storage, and APIs
  • Provided support for integration development, assisting with data mapping and transformation logic
  • Provided technical support to users and developers, resolving issues and answering queries
  • Conducted training sessions on Talend Cloud best practices and usage
  • Implemented complex business rules by creating reusable transformations and robust mappings using Talend transformations like tConvertType, tSortRow, tReplace, tAggregateRow, tUnite, etc.
  • Implemented an update strategy on tables and used tJava and tJavaRow components to read data from tables and pull only newly inserted data from source tables
  • Created a Python script for file splitting and invoked it from a Talend job (see the sketch after this section)
  • Created many complex ETL jobs for data exchange from and to Database Server and various other systems including RDBMS, XML, CSV, and Flat file structures
  • Integrated Java code inside Talend Studio using components like tJavaRow, tJava, and tJavaFlex; used the most common Talend components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput, tHashOutput, and many more) and Routines
  • Troubleshoot data integration issues and bugs, analyze reasons for failure, implement optimal solutions, and revise procedures and documentation as needed
  • Created a data framework to load data from different sources into the target Snowflake database
  • Implemented a reusable framework for each data source
  • Used common Joblets for error handling
  • Implemented jobs to retrieve functional-user passwords from the Truist CyberArk secrets manager
  • Worked with Truist business analysts to gather requirements from Truist business use cases such as reward and mortgage loan programs
  • Created AWS Talend Cloud servers (EC2 instances) through Terraform and installed the Remote Engine executable through Terraform user-data scripts
  • Created the Talend infrastructure design for Truist and contributed to the Talend data framework used by multiple users
  • Performed Talend admin activities such as creating workspaces, adding users manually to workspaces, and granting permissions based on project-allocation AD groups
  • Resolved Studio-level issues for developers
  • Created log structures on different on-premises servers
  • Created the Truist-specific CI/CD structure for Talend promotion activities, from GitLab YAML to TMC artifacts
  • Environment: Talend 8.0, Talend Management Console, Oracle 11g, MySQL, GitLab, Java, PySpark, Python, UNIX shell scripting, Snowflake, AWS EC2, VPC, Terraform, Netezza, DB2, Hive, Hadoop
  • Parent Company: LussoTech LLC (work authorization: H1B)
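
As referenced above, a rough Python analogue of the Vault lookup the Talend job performs through tSystem and a map lookup, using the hvac client. The Vault address, token handling, mount point, and secret path are hypothetical placeholders, not the actual configuration.

```python
import hvac  # HashiCorp Vault client for Python

# Connect to Vault; URL and token below are dummy placeholders.
client = hvac.Client(url="https://vault.example.com:8200", token="dummy-token")

# Read a KV v2 secret holding connection details (path is illustrative).
secret = client.secrets.kv.v2.read_secret_version(
    mount_point="secret",
    path="etl/snowflake",
)
conn_details = secret["data"]["data"]  # e.g. {"user": "...", "password": "..."}
print(sorted(conn_details))            # log only the key names, never the values
```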
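
And a minimal sketch of the kind of file-splitting utility described above; the paths, chunk size, and output naming are hypothetical, not the production script.

```python
#!/usr/bin/env python3
"""Split a large delimited file into fixed-size chunks.

Illustrative sketch of a file-splitting utility invoked from a Talend job
(for example via tSystem); paths and the default chunk size are hypothetical.
"""
import argparse
import os


def split_file(src_path: str, out_dir: str, lines_per_chunk: int = 100_000) -> list:
    """Write successive chunks of src_path into out_dir and return their paths."""
    os.makedirs(out_dir, exist_ok=True)
    base = os.path.basename(src_path)
    chunk_paths, buf, idx = [], [], 0
    with open(src_path, "r", encoding="utf-8") as src:
        for line in src:
            buf.append(line)
            if len(buf) >= lines_per_chunk:
                chunk_paths.append(_flush(buf, out_dir, base, idx))
                buf, idx = [], idx + 1
        if buf:  # trailing partial chunk
            chunk_paths.append(_flush(buf, out_dir, base, idx))
    return chunk_paths


def _flush(lines, out_dir, base, idx):
    out_path = os.path.join(out_dir, f"{base}.part{idx:04d}")
    with open(out_path, "w", encoding="utf-8") as out:
        out.writelines(lines)
    return out_path


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Split a file into fixed-size chunks")
    parser.add_argument("src")
    parser.add_argument("out_dir")
    parser.add_argument("--lines", type=int, default=100_000)
    args = parser.parse_args()
    for path in split_file(args.src, args.out_dir, args.lines):
        print(path)  # a Talend tSystem call can capture these paths from stdout
```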

ETL Lead (Talend & AWS Glue) and ETL Operational Lead

Bayer Pharmaceuticals
08.2021 - 02.2024
  • Worked in the Data Integration Team to perform data and application integration with the goal of moving high-volume data more effectively, efficiently, and with high performance to assist in business-critical projects.
  • Developed custom components and multi-threaded configurations with a flat file by writing Java code in Talend.
  • Worked as Data Operations Lead for three business units of Bayer (Market Access, Marketing Mix, and CRM team).
  • Provided the needed Talend infrastructure builds, setups, and day-to-day platform support for Talend application teams and development projects, from initial POC phases through production and go-lives.
  • Worked on production issues with the client data specialist team and provided on-call support to resolve issues
  • Interacted with solution architects and business analysts to gather requirements and update the solution architecture document; created mappings and sessions to implement technical enhancements
  • Deployed and scheduled Talend jobs in the Talend cloud and monitored the execution
  • Designed, built, and operationalized enterprise data solutions and applications using AWS data analytics in combination with third-party platforms: Redshift, Talend, and Snowflake
  • Worked on migration projects such as SAP to Denodo, Denodo to Snowflake, through Talend.
  • Created separate branches within the Talend repository for Development, Production, and Deployment.
  • Prepared ETL mapping documents for every mapping, and data migration document for a smooth transfer of projects from the development to the testing environment, and then to the production environment.
  • Excellent knowledge of the Talend Administration Console, Talend installation, and using context and global map variables in Talend.
  • Created cross-platform Talend DI jobs to read data from multiple sources
  • Worked on different data sources and target systems such as Oracle, Netezza, MySQL, Snowflake, Hive, Redshift, and flat files
  • Implemented Lambda to configure the DynamoDB auto-scaling feature and implemented a data access layer for AWS DynamoDB data
  • Implemented AWS Step Functions to automate and orchestrate Amazon SageMaker tasks such as publishing data to S3, training an ML model, and deploying it for prediction
  • Experience with implementation of Snowflake cloud data warehouse and operational deployment of Snowflake DW solution into production
  • Designed and developed ETL integration patterns using Python on Spark
  • Performed Talend administrative tasks such as upgrades, creating and managing user profiles and projects, managing access, monitoring, and setting up TAC notifications
  • Monitored Talend job statistics in AMC to improve performance and identify the scenarios causing errors
  • Performed data manipulations using various Talend components like tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSQLInput, and many more; created Generic and Repository schemas
  • Implemented complex business rules by creating reusable transformations and robust mappings using Talend transformations like tConvertType, tSortRow, tReplace, tAggregateRow, tUnite, etc.
  • Implemented an update strategy on tables and used tJava and tJavaRow components to read data from tables and pull only newly inserted data from source tables
  • Created complex mappings in Talend 7.3.1/8.0.1 using tMap, tDBInput, tDBOutput, tFileInputDelimited, tFileOutputDelimited, tUnique, tFlowToIterate, tLogCatcher, tFlowMeterCatcher, tFileList, tAggregate, tSort, tMDMInput, tMDMOutput, and tFilterRow
  • Used tStatsCatcher, tDie, and tLogRow to create a generic joblet to store processing stats into a Database table to record job history
  • Created Spark sessions in PySpark through the SparkSession builder() method (see the PySpark sketch after this section)
  • Used enableHiveSupport() to work with Hive from PySpark code
  • Created DataFrames in PySpark and created Hive tables using the spark.sql method
  • Implemented Spark SQL-based requirements
  • Created a data pipeline using a Python script for the Snowflake load in one of the migration projects
  • Provided production support on a rotation basis: coordinated with teams during failures, applied quick fixes to meet SLAs, and created ITSM tickets for deeper analysis
  • Troubleshoot data integration issues and bugs, analyze reasons for failure, implement optimal solutions, and revise procedures and documentation as needed
  • Responsible for tuning ETL mappings, Workflows, and underlying data models to optimize load and query performance
  • Configured Talend Administration Center (TAC) for scheduling and deployment
  • Created and scheduled execution plans to create job flows
  • Performed Data Analysis using SQL queries on source systems to identify data discrepancies and determine data quality
  • Performed extensive Data Validation, Data Verification against Data Warehouse and performed debugging of the SQL-Statements and stored procedures for business scenarios
  • Designed and developed Tableau dashboards using stack bars, bar graphs, scattered plots, and Gantt charts
  • Created metadata tables in AWS DynamoDB to hold file and file-group IDs, S3 paths, SFTP locations, and other file-mapping details used to push data from files to Snowflake stage tables
  • Created AWS Glue jobs and Step Functions for the CDP Global migration project at Bayer, moving workloads from Talend to AWS Glue
  • Created Snowflake stored procedures and associated them with AWS Glue jobs
  • Created Step Functions and monitored the batch job flow
  • Fixed AWS Glue job failures and diagnosed issues through AWS CloudWatch reports
  • Designed and built multiple ETL jobs for loading data from multiple sources (flat files, Oracle DB, MySQL) into Snowflake using Snowflake SQL and Talend
  • Designed and developed database objects and scripts in T-SQL/SQL for creating tables, sequences, indexes, views, constraints, stored procedures, functions, packages, and triggers required by the application
  • Worked with multiple teams to analyze and document configuration information for source and target database systems using ServiceNow
  • Participated in data modeling and in developing conceptual and logical data models
  • Designed and developed a metadata-table framework for specific interfaces, running Redshift queries and loading data into the stage and core layers of Redshift data warehouses
  • Experience in working with large data warehouses, mapping and extracting data from legacy systems (Redshift/SQL Server)
  • Developed Talend jobs with Redshift components such as tRedshiftConnection, tRedshiftInput, tRedshiftClose, tRedshiftOutput, tRedshiftOutputBulk, tRedshiftRow, tRedshiftRollback, and tRedshiftUnload
  • Created Talend ETL jobs with Snowflake components such as tSnowflakeConfiguration, tSnowflakeInput, tSnowflakeOutput, tSnowflakeRow, tSnowflakeConnection, and tSnowflakeClose
  • Environment: Talend Studio 7.3.1/8.0.1, Talend Administration Console, Oracle 11g, MySQL, Git, Java, PySpark, Python, UNIX shell scripting, Redshift, Snowflake, AWS (Glue, Lambda, Step Functions, DynamoDB)
  • Parent Company: LussoTech LLC (work authorization: H1B)
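
A minimal PySpark sketch of the session setup described above (SparkSession via the builder pattern with enableHiveSupport(), then a Hive table created through spark.sql). The app name, path, database, and table names are hypothetical placeholders, not the actual jobs.

```python
from pyspark.sql import SparkSession

# Build the session with Hive support so spark.sql() can use the Hive metastore.
spark = (
    SparkSession.builder
    .appName("hive-load-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

# Build a DataFrame from a delimited source file (path is illustrative).
df = spark.read.option("header", "true").csv("/data/stage/claims.csv")
df.createOrReplaceTempView("claims_stage")

# Create a Hive database/table from the staged view using the spark.sql method.
spark.sql("CREATE DATABASE IF NOT EXISTS analytics")
spark.sql("CREATE TABLE analytics.claims AS SELECT * FROM claims_stage")

spark.stop()
```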

Talend Developer

Optum State Govt solutions
02.2021 - 08.2021
  • Worked in the Data Integration Team to perform data and application integration, with the goal of moving high-volume data more effectively, efficiently, and with high performance to assist in business-critical projects
  • Developed custom components and multi-threaded configurations with flat files by writing Java code in Talend
  • Adopted the ITIL process in project life cycles
  • Worked on Tennessee, South Carolina, and MT Provider module development with Talend DI and Azure Data Factory
  • Worked on the data conversion framework for the Tennessee, South Carolina, and MT Provider modules with the Talend ETL tool
  • Provided daily Talend platform support, leveraging years of technical expertise to resolve Talend production issues: job failures, long-running jobs, job-design best practices, and efficient production support
  • Modified legacy code based on user requirements and deployed it to production
  • Interacted with solution architects and business analysts to gather requirements and update the solution architecture document; created mappings and sessions to implement technical enhancements
  • Deployed and scheduled Talend jobs in the Talend cloud and monitored the execution
  • Created separate branches within the Talend repository for Development, Production and Deployment
  • Excellent knowledge of the Talend Administration Console, Talend installation, and the use of context and global map variables in Talend; created cross-platform Talend DI jobs to read data from multiple sources
  • Worked on different data sources such as Oracle, Netezza, MySQL, flat files, etc.
  • Created Talend jobs for data comparison between tables across different databases, identifying and reporting discrepancies to the respective teams
  • Performed Talend administrative tasks such as upgrades, creating and managing user profiles and projects, managing access, monitoring, and setting up TAC notifications
  • Created Generic and Repository schemas
  • Designed and created SOAP-based web services using Talend ESB for various interfaces
  • Experienced in handling data from complex file formats like IDoc, XML, JSON, and ORC
  • Worked on the Provider Module API
  • Developed Talend ESB jobs using components like tREST, tRESTClient, tXMLMap, tFileInputJSON, tFileInputXML, tSOAP, tWebService, tWebServiceInput, and tHttpRequest for REST and SOAP calls (a rough Python analogue appears after this section)
  • Extracted data from the Medicare management information system on an hourly and daily basis through ESB jobs
  • Created a common job framework with Joblets for extracting API parameters and other required properties
  • Experienced in maintaining code versions, code migration, and scheduling across Dev, QA, UAT, and Prod environments
  • Deploy fixes after root cause analysis by following the change management procedures and policies
  • Performed data manipulations using various Talend components like tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSQLInput, and many more
  • Implemented complex business rules by creating reusable transformations and robust mappings using Talend transformations like tConvertType, tSortRow, tReplace, tAggregateRow, tUnite, etc.
  • Created complex mappings in Talend 5.0 using tMap, tMSSQLInput, tMSSQLOutput, tFileInputDelimited, tFileOutputDelimited, tMSSQLOutputBulkExec, tUnique, tFlowToIterate, tIntervalMatch, tLogCatcher, tFlowMeterCatcher, tFileList, tAggregate, tSort, tMDMInput, tMDMOutput, and tFilterRow
  • Used tStatsCatcher, tDie, and tLogRow to create a generic joblet to store processing stats into a Database table to record job history
  • Created standards and best practices for Talend ETL components and jobs
  • Extraction, transformation and loading of data from various file formats like .csv, .xls, .txt and various delimited formats using Talend Open Studio
  • Troubleshoot data integration issues and bugs, analyze reasons for failure, implement optimal solutions, and revise procedures and documentation as needed
  • Responsible for tuning ETL mappings, Workflows, and underlying data models to optimize load and query performance
  • Configured Talend Administration Center (TAC) for scheduling and deployment; created and scheduled execution plans to create job flows
  • Used the ITSM tool ServiceNow for troubleshooting user issues and service requests
  • Participated in data modeling and in developing conceptual and logical data models
  • Environment: Talend 7.1 and 7.3, Talend Open Studio Big Data/DI/ESB, Talend Administration Console, Talend Management Console, Oracle 11g, Jenkins, Git, Java, UNIX shell scripting, TWS Scheduler, MySQL, MongoDB, Power BI, REST, XML, SOAP, JSON, Swagger UI
  • Parent Company: LussoTech LLC (work authorization: H1B)
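
For illustration, a rough Python analogue of the REST extraction pattern the ESB jobs implement with tREST/tRESTClient. The endpoint, token, and field names below are hypothetical placeholders, not the actual state-system API contract.

```python
import requests

# Placeholder endpoint; not the real provider API.
BASE_URL = "https://api.example.gov/provider/v1"


def fetch_providers(since_iso: str, token: str) -> list:
    """Pull provider records changed since the given ISO timestamp."""
    resp = requests.get(
        f"{BASE_URL}/providers",
        params={"updatedSince": since_iso},
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()  # fail fast so the scheduler can retry the run
    return resp.json().get("providers", [])


if __name__ == "__main__":
    records = fetch_providers("2021-06-01T00:00:00Z", token="dummy-token")
    print(f"pulled {len(records)} provider records")
```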

Talend ETL Developer and Talend Admin

Southern California Edison
06.2019 - 12.2020
  • Projects: Energy Procurement Management (EPM) Data Analytics Platform: UMA, UMA_CSRP, UMA_IMEP, PDR, and CDS. As a senior technology developer, connected with client SMEs for requirement gathering and source-system information
  • Worked as a Talend senior developer on the UMA (Usage Measurement Aggregation) and CDS (Common Data Storage) projects, part of the Energy Procurement Management (EPM) Data Analytics Platform program
  • Led a five-member onsite/offshore team covering mainly two upstream source systems and multiple downstream applications in the project
  • Documented the high-level and low-level design documents with domain knowledge
  • Coordinated with multiple vendors on the project to get data from source teams as well as to provide the transformed data
  • Used Talend transformations like tConvertType, tSortRow, tReplace, tAggregateRow, tUnite, etc.
  • Developed Talend jobs to populate the claims data to the data warehouse (star schema)
  • Implemented Talend jobs using reusable Joblets and tOracleConnection, tFileInputDelimited, tHDFSConnection, tHiveConnection, tHiveInput, tHiveRow, tImpalaRow, tImpalaLoad, tSSH, and tLogCatcher components
  • Worked on Talend MDM 5.1.1 and created business rules and a workflow system
  • Used Talend MDM components such as tMDMInput, tMDMOutput, tMDMBulkLoad, tMDMConnection, tMDMReceive, and tMDMRollback
  • Performed data manipulations using various Talend components like tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSQLInput, tHiveInput, tHiveOutput, tHiveRow, and many more
  • Created many complex ETL jobs for data exchange from and to Database Server and various other systems including RDBMS, XML, CSV, and Flat file structures
  • Integrated Java code inside Talend Studio using components like tJavaRow, tJava, and tJavaFlex; used the most common Talend components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput, tHashOutput, and many more) and Routines
  • Loaded user exceptions, thresholds, and CAISO calendar-date data from NAS (Network Attached Storage) to the Hadoop file system as prerequisites for UMA jobs
  • Created Power BI queries for users and resolved issues connecting them to Hive, Impala, and Oracle disaster-recovery tables
  • Designed and implemented a monthly resource-allocation system using Excel Power Query and Power Pivot for inclusion in dashboard reporting
  • Assisted Power Supply department users with automating reporting functionality using Power BI tools, Power BI reporting, dashboards and scorecards (KPIs), with MySQL, Hive, Impala, and other data warehouse sources
  • Managed user requests on Power Supply departmental reports, troubleshooting daily issues and integrating existing Access databases with numerous external data sources (SQL, Hive, Impala)
  • Designed and implemented multiple dashboards using Power BI, Power Pivot, and Power Query tools for in-house metrics
  • Performed ad hoc scorecard development, from SQL to Excel dashboards, charts, graphs, and pivot tables
  • Extracted snapshot types from the CAISO calendar and calculated usage for each of the seven snapshot-type classifications
  • Worked on SOAP and REST services to get XML or JSON files, loaded them into Hadoop tables for the UMA project, and used them in the application
  • Developed Talend ESB jobs using components like tREST, tRESTClient, tFileInputJSON, tFileInputXML, tSOAP, tWebService, tWebServiceInput, and tHttpRequest for REST and SOAP calls
  • Worked on a few jobs in CDS projects using ESB components like tESBConsumer, tESBProviderFault, tESBProviderRequest, tESBProviderResponse, tRESTClient, tRESTRequest, and tRESTResponse
  • Responsible for monitoring the daily/weekly/monthly jobs scheduled in Redwood Scheduler during transitions
  • Developed Talend ETL jobs for aggregating usage measurements of service-account readings for each rate class and voltage group, based on receiving and delivery, including user and system exceptions, with some service accounts and ZIP codes excluded (see the PySpark sketch after this section)
  • SCE bids at CAISO based on the calculated aggregated usage
  • Developed data pipelines from source systems to landing, staging, core, and consumption layers in Redwood Scheduler using job chains and jobs
  • Developed scripts to trigger common Talend jobs reusable by the main data-load jobs
  • Created CRs (change requests) for moving code to production
  • Worked on production activities before and after releases
  • Worked on daily production-support activities until production cutover to the operational team
  • Created a transition plan for the production support team
  • Worked on different data sources such as Oracle, Hive, Impala, flat files, etc.
  • Created Talend jobs for data comparison between tables across different databases, identifying and reporting discrepancies to the respective teams
  • Performed Talend administrative tasks such as upgrades, creating and managing user profiles and projects, managing access, monitoring, and setting up TAC notifications
  • Participated in data modeling and in developing conceptual and logical data models
  • Designed and built multiple ETL jobs for loading data from multiple sources (flat files, Oracle DB, MySQL) into Snowflake using Snowflake SQL and Talend
  • Experience in using AWS cloud components and connectors to make API calls for accessing data from cloud storage (Amazon S3, Redshift) in Talend Enterprise Edition
  • Executed Hive queries on Parquet tables stored in Hive to perform data analysis to meet the business requirements
  • Multitasked on a couple of projects at all stages of development, including project planning, ETL design, coding, testing, implementation, and support
  • Performed security configuration for users, projects, and roles in TAC for SVN and Git
  • Extracted data from multiple operational sources for loading staging area, Data warehouse, Data Marts using SCDs (Type 1/Type 2/ Type 3) loads
  • Troubleshoot data integration issues and bugs, analyze reasons for failure, implement optimal solutions, and revise procedures and documentation as needed
  • Interacted with solution architects and business analysts to gather requirements and update the solution architecture document; created mappings and sessions to implement technical enhancements
  • Worked on the DataStage-to-Talend migration for the CDS and PDR projects
  • Worked on the Oracle-to-Snowflake migration for the PDR project
  • Developed PySpark scripts for multiple interfaces in the UMA project, loading data into core and semantic-layer tables
  • Designed and developed ETL integration patterns using Python on Spark
  • Optimized PySpark jobs to run on a Kubernetes cluster for faster data processing
  • Designed and implemented ETL processes to import data from and into Microsoft Azure
  • Used tSnowflakeClose, tSnowflakeConnection, tSnowflakeInput, tSnowflakeOutput, and tSnowflakeRow, along with other required Talend components, to create Talend jobs for the PDR project
  • Excellent debugging skills and communication with different teams (Oracle DBAs, Cyber team, Network team, IAM operations) to renew functional IDs for service accounts on time and to report security issues and vulnerabilities promptly, ensuring data security
  • Worked with vendors and other internal teams, providing development inputs to make Talend jobs more performant and durable
  • Worked with cross-functional teams such as the Talend infra team, Oracle Apps team, Hadoop application team, and Middleware team
  • Implemented an update strategy on tables and used tJava and tJavaRow components to read data from tables and pull only newly inserted data from source tables
  • Utilized Big Data components like tHDFSInput, tHDFSOutput, tHiveLoad, tHiveInput, tHBaseInput, and tHBaseOutput
  • Hands-on experience with many palette components for designing jobs; used context variables to parameterize Talend jobs
  • Worked with Parallel connectors for Parallel Processing to improve job performance while working with bulk data sources in Talend
  • Performed root-cause analysis on failed components and implemented corrective measures; assisted application development teams during design and development of highly complex, critical data projects, identifying root causes of slow-performing jobs/queries (HDFS)
  • Created Spark sessions in PySpark through the SparkSession builder() method
  • Used enableHiveSupport() to work with Hive from PySpark code
  • Created DataFrames in PySpark and created Hive tables using the spark.sql method
  • Implemented Spark SQL-based requirements
  • Used the most common Talend components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput, tHashOutput, and many more)
  • Worked on creating Redwood processes, tables, scripts, and jobs to trigger jobs in TAC (Talend Administration Console)
  • Worked with problem-management teams and other resolver groups to resolve specific project production issues
  • Used UNIX scripts to monitor, start, and resume jobs as needed
  • Developed and deployed Talend jobs using Talend Studio for enhancements and bug fixes in an Agile process
  • Involved in the design of metadata and context parameters for jobs per the standards; facilitated data extraction, interpretation, and analysis using SQL queries
  • Developed both inbound and outbound interfaces using Azure Data Factory to load data into the target database and extract data from the database to flat files
  • Environment: Talend 6.2.1/6.0, 7.1 and 7.3, Talend Open Studio Big Data/DI, Talend Administration Console, Talend Management Console, Talend MDM 5.1.1, Oracle 10g/11g, Git, Hive, Impala, Java, PySpark, UNIX shell scripting, Redwood Scheduler, Power BI, Azure, Blob Storage, IBM DataStage, SSIS, Hadoop 2.x (HDFS, MapReduce, YARN), Spark
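
A short PySpark sketch of the usage-aggregation step described above: readings rolled up per rate class and voltage group, split by receiving versus delivery, with excluded accounts filtered out. The table and column names are hypothetical placeholders, not the actual schema.

```python
from pyspark.sql import SparkSession, functions as F

# Hive-enabled session so spark.table()/saveAsTable() hit the metastore.
spark = (
    SparkSession.builder
    .appName("uma-usage-aggregation")
    .enableHiveSupport()
    .getOrCreate()
)

# Service-account readings landed in the core layer (placeholder table).
readings = spark.table("uma_core.service_account_readings")

usage = (
    readings
    .filter(~F.col("is_excluded"))  # drop excluded service accounts / ZIP codes
    .groupBy("rate_class", "voltage_group", "flow_direction")  # receiving vs. delivery
    .agg(F.sum("usage_kwh").alias("total_usage_kwh"))
)

# Persist the aggregate to the semantic layer consumed for CAISO bidding.
usage.write.mode("overwrite").saveAsTable("uma_semantic.usage_by_rate_class")
spark.stop()
```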

Talend Developer

LussoTech LLC
03.2018 - 06.2019
  • Worked in the Data Integration Team to perform data and application integration, with the goal of moving high-volume data more effectively, efficiently, and with high performance to assist in business-critical projects
  • Developed custom components and multi-threaded configurations with flat files by writing Java code in Talend
  • Interacted with solution architects and business analysts to gather requirements and update the solution architecture document; created mappings and sessions to implement technical enhancements
  • Deployed and scheduled Talend jobs in the Administration Console and monitored their execution
  • Created separate branches within the Talend repository for Development, Production and Deployment
  • Excellent knowledge of the Talend Administration Console, Talend installation, and the use of context and global map variables in Talend
  • Created cross-platform Talend DI jobs to read data from multiple sources like Hive, HANA, Teradata, DB2, Oracle, and ActiveMQ
  • Worked on different data sources such as Oracle, Netezza, MySQL, flat files, etc.
  • Created Talend jobs for data comparison between tables across different databases, identifying and reporting discrepancies to the respective teams
  • Performed Talend administrative tasks such as upgrades, creating and managing user profiles and projects, managing access, monitoring, and setting up TAC notifications; created Generic and Repository schemas
  • Called the Coupa portal API for supplier information through Talend ESB and Internet components
  • Developed Talend ESB jobs using components like tREST, tRESTClient, tXMLMap, tFileInputJSON, tFileInputXML, tSOAP, tWebService, tWebServiceInput, and tHttpRequest for REST and SOAP calls
  • Used the tLoop component for pagination while extracting data from the third-party Coupa API for supplier and user information (see the sketch after this section)
  • Extracted data from the ECG Gateway through Talend ESB components like tESBConsumer, tESBProviderFault, tESBProviderRequest, tESBProviderResponse, tRESTClient, tRESTRequest, and tRESTResponse
  • Created JSON and XML files using Talend ESB and created a common framework for the metadata details of Talend ESB jobs
  • Used SoapUI to provide the details required to extract the payload while developing Talend jobs, and tested different scenarios
  • Made manual REST API calls through Postman while finalizing the table columns in the stage table
  • Performed data manipulations using various Talend components like tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSQLInput, and many more
  • Implemented complex business rules by creating reusable transformations and robust mappings using Talend transformations like tConvertType, tSortRow, tReplace, tAggregateRow, tUnite, etc.
  • Created complex mappings in Talend 5.0 using tMap, tMSSQLInput, tMSSQLOutput, tFileInputDelimited, tFileOutputDelimited, tMSSQLOutputBulkExec, tUnique, tFlowToIterate, tIntervalMatch, tLogCatcher, tFlowMeterCatcher, tFileList, tAggregate, tSort, tMDMInput, tMDMOutput, and tFilterRow
  • Extraction, transformation and loading of data from various file formats like .csv, .xls, .txt and various delimited formats using Talend Open Studio
  • Troubleshoot data integration issues and bugs, analyze reasons for failure, implement optimal solutions, and revise procedures and documentation as needed
  • Responsible for tuning ETL mappings, workflows, and underlying data models to optimize load and query performance; configured Talend Administration Center (TAC) for scheduling and deployment; created and scheduled execution plans to create job flows
  • Designed and developed database objects and scripts in T-SQL/SQL for creating tables, sequences, indexes, views, constraints, stored procedures, functions, packages, and triggers required by the application
  • Created normalized tables from relational schema, obtained after translating requirements into ER diagrams
  • Modified legacy code based on user requirements and deployed it to production
  • Worked with production support in finalizing scheduling of workflows and database scripts using AutoSys
  • Environment: Talend 6.2.1/6.0.1, Talend Open Studio Big Data/ESB/DI/Cloud, Talend Administration Console, Oracle 11g, Teradata V14.0, Git, Hive, Netezza, PL/SQL, DB2, XML, JSON, REST, SoapUI, Postman, Java, PySpark, UNIX shell scripting
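
As referenced above, a rough Python analogue of the tLoop-driven pagination against the Coupa API. The endpoint, auth header, page size, and response shape are illustrative assumptions, not the actual Coupa contract.

```python
import requests

# Placeholder endpoint and page size; not the real Coupa configuration.
BASE_URL = "https://example.coupahost.com/api/suppliers"
PAGE_SIZE = 50


def fetch_all_suppliers(api_key: str) -> list:
    """Page through the supplier endpoint until an empty page comes back."""
    suppliers, offset = [], 0
    while True:
        resp = requests.get(
            BASE_URL,
            params={"limit": PAGE_SIZE, "offset": offset},
            headers={"X-COUPA-API-KEY": api_key, "Accept": "application/json"},
            timeout=30,
        )
        resp.raise_for_status()
        page = resp.json()
        if not page:  # an empty page ends the pagination loop
            break
        suppliers.extend(page)
        offset += PAGE_SIZE
    return suppliers
```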

Programmer Analyst

Vega soft inc
02.2017 - 08.2017
  • Designed technical documents and mapping solutions for business requirements
  • Worked on support activities after deploying code to production; troubleshot software systems and applications
  • Optimized performance, resolved problems, and provided follow-up on issues
  • Trained in and developed the presentation layer using HTML, CSS, JSPs, Ajax, and AngularJS
  • Used JavaScript, HTML, and CSS for manipulating, validating, and customizing error messages in the user interface; deep understanding of JavaScript and the jQuery framework
  • Wrote jQuery-based Ajax requests using jQuery.get, jQuery.post, and jQuery.ajax
  • Implemented drag-and-drop functionality using the jQuery framework
  • Trained in and implemented object-relational mapping in the persistence layer using the Hibernate framework in conjunction with Spring
  • Performed data cleansing, integration, and transformation using Talend; involved in exporting and importing data between local file systems, RDBMS, and HDFS
  • Read and translated data models, performed data querying, identified data anomalies, and provided root-cause analysis
  • Performed proofs of concept on new Talend Cloud technologies available in the market to determine the best fit for the organization's needs.

Programmer Analyst

Tekforce Corp
10.2016 - 02.2017
  • Involved in developing the UI using Ext JS, JSP, HTML, XSL, XML, and JavaScript
  • Created an application portal with Java and JavaScript
  • Designed and developed technical documents and mapping solutions for business requirements
  • Trained in and developed web applications using Java/J2EE technologies
  • Designed and developed data-load integration patterns using Python on Spark
  • Involved in analyzing, designing, and implementing business requirements
  • Developed a project-specific 'Deployment' job responsible for deploying jar files to the Windows environment as a zip file; this zip file is later unzipped and the files are deployed again to the UNIX box
  • Exported jobs to GitHub and SVN repository
  • Developed POCs to compare performance of Oracle & DB2 databases
  • Responsible for maintaining versioning of the Java jobs that are deployed in the UNIX environment
  • Created triggers/scripts for data migration and reporting purposes
  • Developed scripts for importing and exporting data into HDFS and assisted in exporting analyzed data to RDBMS
  • Involved in creating unit test cases for the testing team and users.

Software Engineer

Softwood Software Solutions Pvt Ltd
08.2013 - 08.2015
  • Projects: Portfolio Construction and Modeling Solution, Trade Order Management (TOM) and Health care connect
  • Responsibilities:
  • Involved in all the phases of Software development lifecycle (SDLC) in an Agile/Scrum methodology which includes requirement analysis, design, development, documentation, testing, implementation, and maintenance of application software using Java/J2EE in real-time enterprise applications, Distributed n-tier
  • Performed C++/MFC coding for enhancements
  • Coded and debugged MFC regular/extension DLLs
  • Designed GUI modal/modeless dialog boxes and developed GUIs using MFC classes
  • Trained in Service-Oriented Architecture (SOA) design and development using Java, J2EE, Spring, Spring Boot, Gradle, REST/JSON, SOAP/XML, and microservices
  • Ensured all bug fixes were tested and released to the client in a timely manner
  • Kept abreast of all changes and enhancements in the ITS-Invest application
  • For enterprise clients, the client raises an issue through GSS, and GSS raises a case in the Case Tracker application
  • Based on research, the client support team assigns the case to one of the various teams supporting ITS-Invest
  • Understood the design requirements and specification algorithms
  • Generated AutoSys reports of all trades; designed modal/modeless dialog screens with standard templates; compiled/registered the COM DLLs and tested the application code
  • Implemented unit-test scenarios for code related to the Trade Order Management and Portfolio Construction and Modeling Solution projects.

Education

Master of Science - Information Technology Management

Campbellsville University
USA
05.2019

Master of Science - Computer Science

Silicon Valley University
San Jose, CA, USA
08.2016

PG Diploma - International Business and Management

University of Bedfordshire
01.2013

B. Tech - Computer Science & Engineering

JNTU Hyderabad
01.2009

Skills

  • Talend 6.4.1
  • Talend 5.6
  • Talend 7.1
  • Talend MDM 5.1.1
  • Informatica 7.1
  • Informatica 6.1
  • Hadoop
  • Big Data
  • Spark
  • HDFS
  • Map Reduce
  • HIVE
  • Redwood Scheduler
  • Power BI
  • AWS Glue
  • AWS Cloud Watch
  • Lambda
  • Step Functions
  • Visual Studio 2008
  • Visual Studio 2010
  • Visual Studio 2013
  • C#
  • Java
  • C
  • C++
  • VC++
  • HTML
  • SQL
  • PySpark
  • T-SQL
  • PL/SQL
  • Python
  • Oracle 9i
  • Oracle 10G
  • Oracle 11G
  • SQL Server 2008
  • SQL Server 2012
  • Amazon Redshift
  • MySQL
  • Snowflake
  • MongoDB
  • AWS Dynamo DB
  • CSS
  • JavaScript
  • XML
  • JSON
  • jQuery
  • Node.js
  • SOAP
  • REST
  • Windows
  • Linux

Certification

  • AWS Certified Data Analytics
  • AWS Certified Solutions Architect - Associate
  • Azure Data Engineer

Professional Overview

Analysis, Design, Development, Testing, Implementation, Enhancement, and Support of ETL applications; OLTP & OLAP environments; Data Warehouse/Business Intelligence; Talend Open Studio, Big Data, and Talend Data Fabric tools; Data Warehousing concepts; ETL/Big Data code and mappings; Hadoop Distributed File System (HDFS); MapReduce; Data Modeling; J2EE platform; SQL Server; Agile methodologies
