Gabriel Llanes

Software Engineer
St. Petersburg, FL

Summary

Experienced in tutoring programming, data structures, and discrete mathematics during undergraduate studies in Florida International University's Computer Science department. Hired as a PySpark developer after graduating and trained under an Advanced Data Engineering practice, gaining skills in ETL operations and data analysis techniques. Contributed to the development of various frameworks for Big Data solutions, collaborating with data scientists, solution architects, DB admins, and other engineers. Took on responsibilities in data management, data engineering operations, and test automation, including data integration, data quality, and testing. Gained hands-on cloud and DevOps experience developing reusable CDK files that automate AWS infrastructure configuration and deployment through AWS CodePipeline, and owned the disaster recovery planning, implementation logic, and templates for the remaining resources. Worked on Lambda APIs using Python runtimes, successfully deploying resources to Dev and QA environments across multiple regions and verifying their functionality through automation and thorough testing.

Accomplished engineer with extensive cloud monitoring, deployment, and troubleshooting skills. Track record of defining, building, and maintaining infrastructure using vendor-neutral and platform-specific tools. Exceptional leadership acumen and strong communication skills. Resourceful in evaluating client requirements and implementing user-centric systems using code and cloud-native technologies. Proficient in AWS and Python with foundational knowledge in cloud computing, virtualization, and network management. Committed to leveraging expertise to drive operational efficiency and foster innovation.

Overview

6 years of professional experience
7 years of post-secondary education
5 certifications

Work History

Software and Cloud Engineer / AWS Infrastructure

Accenture
09.2024 - 01.2025
  • Client: finance/tech company ranked 106th on the Fortune 500
  • Responsible for developing, configuring, and testing new AWS components
  • Created a brand-new repository from scratch that deploys a Lambda and an SQS queue
  • The SQS queue received data from a Lambda API; a second Lambda consumed the queued messages, transformed each one to build an API body, authenticated, and made an API request submitting the payload to a data stream that published the data to a data lake (a sketch of this flow follows the list)
  • Wrote the logic and the unit/component tests for these components and successfully deployed them to the Dev and QA environments, including both regions of the QA environment
  • Validated that data reached the data lake by creating Snowflake tables mapped to the data lake datasets and querying the data in Snowflake
  • Provided leadership and support when the team lead was OOO or whenever the team was blocked on an issue
  • Whenever the team lead was OOO, ran the daily standups, noting team members' blockers and any issues they faced
  • Discussed a game plan for the rest of the sprint, assigned Jira tickets across the team, and provided any support teammates needed while working my own tickets
  • Worked closely with the team lead and other team members to make progress on blockers, discuss next steps, and raise any concerns to the client to confirm details or gather input
  • Gave the client a demo and presentation after implementing updated instrumentation on components, with insights into how the new metrics for the listed APIs were being handled
  • Raised concerns to the client, along with workarounds for trade-offs noticed after the new monitoring instrumentation was implemented on the APIs
  • Received positive client feedback on the presentation style and on how discrepancies in the way monitoring metrics were captured and displayed were addressed
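
A minimal Python sketch of the SQS-to-Lambda flow described above. The endpoint URLs, environment variables, and message fields are assumptions for illustration; the actual services and contracts are not named in this resume.

    import json
    import os
    import urllib.request

    # Hypothetical endpoints, configured via environment variables.
    TOKEN_URL = os.environ.get("TOKEN_URL", "https://auth.example.com/oauth2/token")
    STREAM_URL = os.environ.get("STREAM_URL", "https://ingest.example.com/v1/records")

    def get_token() -> str:
        """Authenticate and return a bearer token (assumed OAuth-style endpoint)."""
        req = urllib.request.Request(
            TOKEN_URL,
            data=json.dumps({"client_id": os.environ["CLIENT_ID"],
                             "client_secret": os.environ["CLIENT_SECRET"]}).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)["access_token"]

    def handler(event, context):
        """SQS-triggered Lambda: build an API body from each queued message and
        submit it to the data stream that feeds the data lake."""
        token = get_token()
        for record in event["Records"]:  # SQS delivers messages in batches
            message = json.loads(record["body"])
            api_body = {  # illustrative transformation only
                "id": message.get("id"),
                "payload": message.get("data"),
            }
            req = urllib.request.Request(
                STREAM_URL,
                data=json.dumps(api_body).encode(),
                headers={"Content-Type": "application/json",
                         "Authorization": f"Bearer {token}"},
            )
            urllib.request.urlopen(req).read()  # HTTPError raised on non-2xx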

DevOps Engineer / AWS CDK Infrastructure (Data Migration)

Accenture
04.2023 - 06.2024
  • Client: large insurance group ranked on the Fortune 500
  • Responsible for delivering CDK stacks for pipeline stack deployments in AWS
  • Created stacks of AWS resources including Lambdas, API Gateways, state machines, EventBridge rules, IAM roles, Secrets Manager secrets, Batch job definitions, and compute environments (a CDK sketch follows the list)
  • Built and deployed the pipeline stack in AWS CodePipeline, then monitored and troubleshot failures in the stages of the pipeline deployment
  • Troubleshooting involved reviewing logs of the AWS resources being deployed: CloudFormation logs, CloudWatch logs, and CDK logs
  • Coordinated with leadership to acquire use cases for realistic mobile test cases from actual users of the application
  • Provided support in debugging failures with the App Dev team
  • Took on defect tickets and worked with developers to fix bugs that came up in different repos and environments
  • Worked with developers to troubleshoot failures they encountered in deployments, and ran and monitored deployments in different environments when developers were ready to deploy
  • Provided support internally to the DevOps team
  • Offered help and guidance to peers whenever they got stuck or faced issues, helping them get unstuck and deliver their assigned tickets before the deadline
  • Jumped into calls and tickets assigned to teammates to help guide their approach to a solution
  • Shared templates of CDK code I had created for new resource implementations so other team members could reuse them with the necessary modifications
  • Improved code deployment efficiency by automating processes with CI/CD pipelines.
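
A minimal sketch of one such stack using the AWS CDK for Python (aws-cdk-lib v2). Construct names, the handler path, and the asset directory are assumptions; the real stacks bundled many more resources (state machines, EventBridge rules, Batch definitions, IAM roles, secrets).

    from aws_cdk import Stack, Duration, aws_lambda as _lambda, aws_sqs as sqs
    from aws_cdk.aws_lambda_event_sources import SqsEventSource
    from constructs import Construct

    class IngestStack(Stack):
        """Illustrative stack: an SQS queue wired as an event source for a Lambda."""

        def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
            super().__init__(scope, construct_id, **kwargs)

            queue = sqs.Queue(self, "IngestQueue",
                              visibility_timeout=Duration.seconds(120))

            fn = _lambda.Function(
                self, "IngestHandler",
                runtime=_lambda.Runtime.PYTHON_3_11,
                handler="app.handler",                       # hypothetical module.function
                code=_lambda.Code.from_asset("lambda_src"),  # hypothetical asset dir
            )

            # Deliver queued messages to the Lambda in batches of up to 10.
            fn.add_event_source(SqsEventSource(queue, batch_size=10))

Deployed through AWS CodePipeline, failures in a stack like this surface in exactly the CloudFormation, CloudWatch, and CDK logs the bullets above describe troubleshooting.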

DevOps Engineer / Production Support

Accenture
04.2022 - 02.2023
  • Client: finance/tech company ranked 106th on the Fortune 500
  • Owned several vulnerability remediation tasks driven by Checkmarx scan results
  • Made code changes, mostly to Java files and some Python, to remediate vulnerabilities and pass the Checkmarx scanner (a representative example follows the list)
  • Identified false-positive results and worked with the Checkmarx team to close those findings and push components to Prod
  • Documented the deployment process for the hosting server for new updates and releases
  • Migrated repositories to a new deployment process
  • Identified components and services to be deployed in AWS
  • Updated code-base files by cloning the repo and pushing changes, triggering automatic deployment in Jenkins
  • Monitored and troubleshot deployments in Jenkins until successful deployments were made in Dev, QA, then Prod
  • Coordinated with the development team to fix deployment failures
  • Served on-call shifts as primary support for Prod
  • Monitored daily, nightly, and overnight jobs run in the Arow job scheduler
  • Received PagerDuty alerts on my phone and attended to the job failures they reported
  • Viewed logs in the job scheduler and on the production server to analyze errors, coordinating with the development team to troubleshoot the failures
  • Failures mainly involved database issues; analyzed the SQL queries stored in XML files on the production server and executed and analyzed them against Microsoft SQL Server databases
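
The specific findings are not named above, so this is a hedged, representative example only: one of the most common Checkmarx findings is SQL built by string concatenation, remediated by switching to driver-level parameter binding. A Python sketch (sqlite3 stands in for the actual Microsoft SQL Server driver):

    import sqlite3  # stand-in driver; the production code targeted Microsoft SQL Server

    def find_user_unsafe(conn: sqlite3.Connection, name: str):
        # Flagged pattern: user input concatenated into SQL (injection risk).
        return conn.execute(f"SELECT id, name FROM users WHERE name = '{name}'").fetchall()

    def find_user_safe(conn: sqlite3.Connection, name: str):
        # Remediated pattern: driver-level parameter binding.
        return conn.execute("SELECT id, name FROM users WHERE name = ?", (name,)).fetchall()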

Data Engineer / ETL Developer

Capgemini
10.2019 - 12.2019
  • Client: retail / restaurant
  • Project description: migrate customer data from an Oracle legacy system to a Postgres GCP instance seamlessly
  • The data model created covered the different API objects that would be called in the next phase of development
  • Responsibilities: validated the data mapping by ensuring the target data model fit the source data model being extracted into the pipeline, and flagged any fields that could cause a mismatch, missing target, missing values, or duplicates
  • Once the mapping was validated, used Talend to develop the jobs that ran the pipelines for the different APIs in parallel
  • These jobs pulled from the Oracle source and loaded into the Postgres staging tables on GCP
  • Once the staging tables were loaded, the data was transformed with PostgreSQL scripts and loaded into the target tables in the same database using the same scripts (a sketch follows the list)
  • These jobs were then replicated in Talend
  • Special Tools: Talend, Excel, Oracle, Google Cloud, Postgres
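
A minimal Python sketch of the staging-to-target transformation step using psycopg2. Table names, column names, and connection details are assumptions for illustration; the actual transformations were plain PostgreSQL scripts orchestrated by the Talend jobs.

    import psycopg2

    # Hypothetical staging -> target transform; assumes a unique index on customer_id.
    TRANSFORM_SQL = """
    INSERT INTO customer_target (customer_id, full_name, email)
    SELECT cast(cust_id AS bigint),
           initcap(trim(first_nm)) || ' ' || initcap(trim(last_nm)),
           lower(email_addr)
    FROM customer_staging
    WHERE email_addr IS NOT NULL
    ON CONFLICT (customer_id) DO NOTHING;  -- skip rows already migrated
    """

    with psycopg2.connect(host="localhost", dbname="migration",
                          user="etl", password="etl") as conn:
        with conn.cursor() as cur:
            cur.execute(TRANSFORM_SQL)
        # psycopg2's connection context commits on success, rolls back on error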

Data Engineer / Developer

Capgemini
01.2019 - 04.2019
  • Client: cruise line (travel / entertainment)
  • Project description: deliver a data management platform integrating a cloud-native data lake architecture on vendor-managed services: Microsoft's ADLS and ADF, MongoDB, and Databricks
  • Container-based services included Talend, Dremio, and Neo4j
  • The platform would then support data-driven decisions, increasing efficiency for business users, consumers, and company applications on their designated platforms
  • Responsibilities: Phase 2 of the data pipeline: developed jobs in Talend that ingested raw data, transformed it, and persisted it to different stores according to their transformation jobs
  • Generated mapping entities of the consumer objects to load into the Neo4j graph database
  • Extracted consumer objects from ADLS and loaded them into MongoDB as a document store
  • These document objects were then extracted as CSV files and converted to Parquet format using PySpark in Databricks, leveraging faster in-memory computing with Spark and Dremio (a sketch follows the list)
  • Tested and recorded metrics for the batches in each of the jobs
  • Then troubleshot failures and documented the next steps and their results
  • Software / Languages: Neo4j, Cypher, Python, Talend, Databricks, Spark, Azure, MongoDB, Dremio
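
A minimal PySpark sketch of the CSV-to-Parquet conversion. The paths are hypothetical, and on Databricks the SparkSession is provided by the runtime; building one here keeps the sketch self-contained.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("csv-to-parquet").getOrCreate()

    # Hypothetical paths; the real jobs read consumer-object exports.
    df = (spark.read
          .option("header", "true")       # first row holds column names
          .option("inferSchema", "true")  # fine for a sketch; declare a schema in production
          .csv("/mnt/exports/consumers/*.csv"))

    # Columnar Parquet lets Spark and Dremio scan only the columns a query needs.
    df.write.mode("overwrite").parquet("/mnt/curated/consumers.parquet")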

Data Analyst / Developer

Capgemini
09.2018 - 01.2019
  • Client: cruise line (travel / entertainment)
  • Project description: a virtual concierge application leveraging AI capabilities such as Recommendation-as-a-Service, using customer data to offer a better shopping experience
  • Customer demographics, historical purchases, and location data (captured through computer vision) were used to analyze and predict customer behavior with a chosen ML model on Azure's native framework
  • Responsibilities: built interactive data visualization POCs on the client's historical dataset, using the R Shiny framework for the front-end and controller layers of the application
  • The back end was supported by the Neo4j graph database, which created and stored customer relationship data
  • These relationships provide deep insights depending on the data model established in the database
  • Cleansed client data by flattening files with the Pandas library, removing nulls, and standardizing values and character casing (a sketch follows the list)
  • Then loaded the data into the graph database, tested the k-means clustering model, and presented the dashboards to the client
  • Software / Languages: Neo4j, Cypher, R, Shiny, Python, HTML/CSS
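
A minimal Pandas sketch of that cleansing pass. The record layout, column names, and value mappings are assumptions for illustration; the client's actual schema is not described above.

    import pandas as pd

    # Hypothetical nested export: one record per customer, purchases nested inside.
    records = [
        {"customer_id": 1, "home_port": " miami ",
         "purchases": [{"item": "Soda", "category": "bev"}]},
    ]

    # Flatten the nested purchase records into one row per purchase.
    flat = pd.json_normalize(records, record_path="purchases",
                             meta=["customer_id", "home_port"])

    flat = flat.dropna(subset=["customer_id"])                     # drop rows missing the key
    flat["home_port"] = flat["home_port"].str.strip().str.title()  # standardize casing
    flat["category"] = flat["category"].replace(                   # map variants to a standard value
        {"bev": "Beverage", "BEVERAGE": "Beverage"})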

Education

B.S. - Software & Information Technology

Florida International University
Miami, FL
01.2011 - 06.2018

Skills

Data Analysis and Development

Certification

Pyramid Academy Junior DevOps Engineer, 01/01/21
