A founding platform engineer at Codewired, Inc. (established in July 2019 in New York) with an 18-year track record (since 2005) of using cutting-edge proprietary and open-source software to build solutions consumable by humans and machines for organizations and businesses of all kinds in the United States and Sub-Saharan Africa. In the U.S., I have joined over a dozen companies and teams to do impactful work, which has given me the opportunity to acquire a wide range of skillsets across the IT industry.
Apart from the big companies I have joined forces with in the United States, I have also worked on multiple mission-critical one-off projects involving modern technology tools and platforms, details of which I am willing to share on demand. Below are some of the headlines:
During the pandemic years, I went deeper into Web3 and AI, earning nanodegrees while mastering WASM- and EVM-based blockchains (Substrate on Polkadot, Solana, Avalanche, Cosmos & Polygon), Deep Learning, Natural Language Processing, and core Generative AI tools, frameworks and Large Language Model platforms such as Keras, PyTorch, TensorFlow, Copilot, Jupyter AI, OpenAI, LLaMA, Gemini, Vertex AI, Hugging Face, GPT, LangChain, DALL-E, Whisper and Pinecone.
I have built multiple mission-critical solutions from 2020 to date and will share project links on demand.
Lead, hands-on architect and solid team member on a global, security-focused project involving a small team of full-stack Kubernetes engineers, administering, configuring and building out innovative security tools for a larger team of support engineers to manage and secure more than 600 Kubernetes clusters. Actively contributing to the following:
Rancher Architecture & Administration: => Hands-on, advanced-level cluster administration and configuration involving the following activities: installing autoscalers; setting up k8s namespaces and persistent storage; certificate rotation; encryption key rotation; configuring Nodes and Node Pools; creating security policies for pods and objects; setting up highly available K3s and RKE Kubernetes clusters; creating and maintaining Helm charts and apps; setting up monitoring for workloads and customizing Grafana dashboards; enabling Istio; adding deployments and services with the Istio sidecar; setting up the Istio Gateway and components for traffic management; enabling Rancher experimental features in DEV, STG and PROD clusters; and much more.
React & TypeScript (Dashboard Portal): => Advanced usage of core React and JavaScript libraries and concepts with TypeScript to extend the Next.js framework and build a fast, responsive single-page-application dashboard from scratch. The first SPA version of the web console used Next.js 13, bootstrapped with the Pages Router, with Microsoft Identity and MSAL React for token access management and RBAC; it was distributed as a client-side-rendered web application for CDNs and packaged with an Nginx server for container workloads. The second version of the app used Next.js 14 with the App Router, this time implemented as a fully server-side-rendered app for advanced security. The second version's identity and access management approach was entirely different: it used a custom-built Microsoft Identity OAuth implementation combining Iron-Session with the MSAL Node auth libraries to deliver an excellent user experience.
FastAPI & Python (Backend REST API): => Advanced usage of the FastAPI, Starlette, Pydantic and SQLAlchemy modules with Python for building durable, fast REST API endpoints. In-depth usage of a modern SDLC to craft scalable, secure and highly responsive API endpoints used for infusing Artificial Intelligence, database integrations, and efficient communication with various external endpoints and with the Microsoft Azure and AWS Python SDKs for compute and data storage resources, enabling clients on different form factors (web, mobile, desktop) to connect. The core functionality of the endpoints includes advanced usage of the SQLModel and Pydantic libraries with Python objects to retrieve chunks of data from the source-of-truth Postgres database and serialize those chunks to JSON for network transport to implementing clients. Other endpoints received data and relayed it through secure, custom-built Natural Language Processing modules integrated with Large Language Models, all implemented in Python.
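The retrieve-and-serialize flow described above can be sketched as follows. This is a minimal, hypothetical stand-in (sqlite3 plays the role of the Postgres source of truth, and the table and column names are invented for illustration), showing only the chunked query-to-JSON step, not the production FastAPI/SQLModel code:

```python
import json
import sqlite3

# Hypothetical stand-in for the Postgres source-of-truth database;
# sqlite3 keeps the sketch self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO items (name) VALUES (?)", [("alpha",), ("beta",)])

def fetch_items_as_json(connection, limit=100):
    """Retrieve a chunk of rows and serialize it to JSON for network transport."""
    rows = connection.execute(
        "SELECT id, name FROM items LIMIT ?", (limit,)
    ).fetchall()
    payload = [{"id": r[0], "name": r[1]} for r in rows]
    return json.dumps(payload)

print(fetch_items_as_json(conn))  # → '[{"id": 1, "name": "alpha"}, {"id": 2, "name": "beta"}]'
```

In the real endpoints, the dict-building step is what SQLModel/Pydantic models automate, with validation on top.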
Azure Databricks, Postgres & Redis (Database and Distributed Caching): => Design and implementation of schemas and scripts for creating database objects, functions and procedures for secure extraction, transformation and loading of external data into Azure Postgres. Advanced usage of Redis Cache on Azure to store and retrieve bulk data securely, using different data formats and structures. Advanced usage of SQL Warehouses in Azure Databricks to provide transactional operations in a relational-database-like environment for REST APIs. Creation and administration of Delta Lake tables in Databricks, plus architecture and implementation of PySpark libraries that use Spark's Structured Streaming to stream data with a Delta Lake table as a streaming source or sink. Created, ran and maintained various Apache Spark jobs on Azure Databricks to transform, analyze and visualize data at scale.
Azure Data Lake Storage Gen2 (Azure Blob Storage Lake House): => Carried out hands-on projects targeting syncing and connecting data stored in Azure Storage accounts with ADLS Gen2 enabled to a variety of Azure Big Data resources, including but not limited to:
1. Azure Synapse serverless SQL pool.
2. Azure Databricks and Spark.
Performed ETL of large volumes of data using Azure HDInsight Clusters. Handled data transformations using Apache Hive, transported and loaded data efficiently into Azure SQL Databases using Sqoop. Implemented data lake capture patterns to update Databricks Delta tables by creating Event Grid subscriptions and Azure Functions that received notifications from events to run complex jobs in Azure Databricks.
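The data lake capture pattern above can be illustrated with a small sketch. This is a hedged, hypothetical stand-in: the event shape follows the Event Grid BlobCreated schema, but the container name, blob path and the returned job parameters are invented, and the actual Databricks Jobs API call is stubbed out:

```python
import json

# Hypothetical sketch of the capture pattern: an Azure Function receives an
# Event Grid BlobCreated notification and derives parameters for a Databricks
# job run that updates a Delta table. The job submission itself is stubbed.
SAMPLE_EVENT = json.dumps({
    "eventType": "Microsoft.Storage.BlobCreated",
    "subject": "/blobServices/default/containers/raw/blobs/2024/orders.parquet",
    "data": {"url": "https://acct.blob.core.windows.net/raw/2024/orders.parquet"},
})

def handle_event(raw_event):
    event = json.loads(raw_event)
    if event["eventType"] != "Microsoft.Storage.BlobCreated":
        return None  # ignore anything that is not a new-blob notification
    # The blob path relative to the container follows "/blobs/" in the subject.
    blob_path = event["subject"].split("/blobs/", 1)[1]
    # In the real system this would call the Databricks Jobs API; here we
    # just return the parameters the job would receive.
    return {"notebook_param_path": blob_path, "source_url": event["data"]["url"]}

print(handle_event(SAMPLE_EVENT)["notebook_param_path"])  # → 2024/orders.parquet
```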
CI/CD Pipelines & Tooling Automation: => A fully automated DevSecOps CI/CD pipeline for creating secure cloud resources and Kubernetes clusters, while engaging directly in creating custom tools for building and maintaining secure images that undergo real-time, advanced security scans, then packaging these application builds and deploying them as workloads into Kubernetes clusters using approved open-source software and Cloud Native Computing Foundation tools. Full participation in building cloud-native tools with programming languages such as Bash, Python and Golang. Tooling automation included but was not limited to:
A highly dependable team member and lead engineer, supporting, implementing, extending and deploying Kotlin-based multiplatform REST APIs, libraries and applications that power core functionality of the Grindr platform.
Day-to-day activities involved creating structured multiplatform concurrency programs and shared business-logic layers used across multiple teams and products.
KEY PROJECT:
Joined a small team of talented engineers in the Trust & Safety department to deliver a mission-critical, multipurpose internal and external API service that was integrated into the core Grindr framework of smart services used daily by millions of users.
Daily activities involved working with cross-functional teams to design, implement, secure and test a mission-critical messaging service layered on top of other core microservices, relaying sensitive data across different channels, devices and form factors.
I built a service layer that integrated cleanly with the key platforms and tools used within the core platform framework: AWS EKS, ECS, Fargate and DynamoDB services, WebSockets, Kafka, Postgres, PostGIS, Redis and Datadog.
Tools & Skillsets:
My primary role was to establish, monitor and maintain automated testing (unit/component, integration, UI-based end-to-end, and performance/load testing) as well as integrate all testing features and components into an existing Continuous Integration and Delivery pipeline for a suite of web, mobile and server-side applications that powered the core of the client-facing applications used by hundreds of thousands of daily users.
The secondary role involved advanced usage and implementation of different open-source and proprietary OAuth providers such as Okta, Auth0 and Microsoft Entra's MSAL to orchestrate Identity and Access Management test automation. This included token-based authentication for API integration tests with libraries like Jest and Pytest, automated multi-factor-authentication-enabled browser-based end-to-end UI tests with tools like Cypress and GetMyMFA, and robust REST API load testing with tools like Locust and Python.
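The load-testing approach above can be sketched with the standard library alone. This is a hedged stand-in for the Locust-style tests: `fake_endpoint` is a placeholder for a real authenticated HTTP call, and the request count and worker pool size are arbitrary illustration values:

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Placeholder for a real authenticated API call; it simulates a round trip
# and echoes the payload back with a status code.
def fake_endpoint(payload):
    time.sleep(0.01)  # simulated network latency
    return {"status": 200, "echo": payload}

def run_load_test(n_requests=20, workers=5):
    """Fire n_requests concurrent calls and collect per-request latency."""
    latencies = []
    def one_call(i):
        start = time.perf_counter()
        resp = fake_endpoint({"req": i})
        latencies.append(time.perf_counter() - start)
        return resp["status"]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        statuses = list(pool.map(one_call, range(n_requests)))
    return statuses, latencies

statuses, latencies = run_load_test()
assert all(s == 200 for s in statuses)
```

Locust wraps this fan-out/collect pattern in user classes, wait times and a live statistics UI; the sketch shows only the core idea.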
We also introduced and took responsibility for the following:
Advanced planning, design and usage of Auth0 APIs and SDKs, with in-depth configuration of Auth0 Teams, tenant settings, applications, dashboard access, multi-factor authentication and security from the web portal console. The application-protection concepts we utilized included but were not limited to:
Platforms, Tools & Skillsets:
I joined an amazing and exceptionally talented team of software and quality assurance engineers, in a fun and conducive work environment to perform the following activities: =>
Some of the core .Net Framework and ASP.Net tools and concepts used during legacy application code conversion: =>
Platform, Tools & Skillsets:
I was engaged to help with the design and development of two major greenfield projects involving full automation of critical business processes on the Azure cloud, using various Microsoft Azure core SDKs together with Artificial Intelligence and Machine Learning tools in Python. Both projects were completed successfully. We also helped create a custom Microsoft Azure and 365 solution for a problem Microsoft did not have an immediate answer to.
CORE RESPONSIBILITIES:
A.I Orchestration Engine: Played a key role in building out advanced Node.js Durable Azure Functions that served as the backbone of an AI orchestration engine layered on top of Azure Cognitive Services. We used advanced Node.js concepts with TypeScript to extract data from large PDFs in a timely and efficient manner. Other specialized Azure Functions worked in aggregate to load the extracted data into structured and unstructured data stores.
A Detection Engine: Two months down the line, we had issues with anomalies in our data sets. We found holes in the data stored in the semi-structured database and were handed the task of using Machine Learning to detect anomalies in bad data sets extracted from the database with Power BI, fix the holes, and migrate clean data to spreadsheets for Data Scientists and Analysts to consume and visualize.
A.I Powered Clustering Engine: This led to the analysis, design, implementation and deployment of an unsupervised machine-learning solution used for clustering large data sets to detect anomalies in document classifications within the larger orchestration engine. We developed and deployed a set of event-based and HTTP-triggered Azure Python Function Apps, using the Pandas, NumPy and Scikit-learn machine-learning modules, to fully automate the entire process: reading large data files uploaded to Azure Storage, followed by exploratory data analysis, data clustering, predictions and visualization.
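The anomaly-detection idea above can be shown in a greatly simplified, stdlib-only sketch. This is a hypothetical stand-in, not the production code: the real engine used Scikit-learn on multi-dimensional document features, while this version flags outliers on a single invented feature (a per-document page count) using a z-score rule:

```python
import statistics

def flag_anomalies(values, z_threshold=2.0):
    """Return indices of values that sit far from the population mean."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # identical values: nothing can be an outlier
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > z_threshold]

# Invented example feature: page counts per classified document, with one
# obviously corrupt record.
page_counts = [10, 12, 11, 9, 10, 11, 250]
print(flag_anomalies(page_counts))  # → [6]
```

The unsupervised clustering step generalizes this: instead of distance from a single mean, records are grouped by a model such as k-means and distance from the nearest cluster center is used as the anomaly signal.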
Custom Microsoft Graph SDK Solution with Microsoft 365: Deeply involved in building a distributed modern business solution spanning multiple Azure Logic Apps and Function Apps, integrating in real time with SharePoint through custom aggregated function apps built with .NET (C#) and Python. After the initial versions of the applications were deployed, the business ran into a roadblock: Azure Logic Apps could not process Excel files larger than 100 MB. We had to re-design and re-architect the entire system, then built a custom .NET (C#) Function App that did the heavy lifting of processing files up to 1 GB. The resulting solution, which was not available in the wider Azure community, used advanced, little-known capabilities of the Graph SDK to manipulate Microsoft 365 core processes.
Python Automation: Complex spreadsheet (CSV & Excel) automation with Pandas and NumPy, file-system operations automation, email scheduling and broadcast automation, plus automation of other mundane, repetitive daily tasks.
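A tiny stdlib sketch of the spreadsheet cleanup above (the production automation used Pandas and NumPy; the column names and cleanup rule here are invented for illustration):

```python
import csv
import io

raw = "name,amount\nalpha, 10 \nbeta,20\n"  # messy input: stray whitespace

def clean_amounts(csv_text):
    """Read a CSV, normalize the amount column, and re-emit clean CSV."""
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = [{"name": r["name"], "amount": int(r["amount"].strip())} for r in reader]
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["name", "amount"])
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

print(clean_amounts(raw))
```

With Pandas the same pass collapses to a `read_csv` / vectorized clean / `to_csv` pipeline, which is what made it practical at scale.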
Platform, Tools & Skillsets:
A dependable team member who also worked independently to research and create new techniques and technologies to solve critical business needs within the larger ecosystem where no solutions were readily available. I embarked on two greenfield projects that were huge successes, while mentoring and leading other senior engineers by example.
Design and Implementation of AWS Lambda backend services:
Individual and team contributor to serverless REST APIs built with TypeScript and Nest.js (a Node.js TypeScript framework) that powered distributed AWS Lambda serverless applications, built with the Serverless Framework and Serverless Stack (SST), interacting with a good number of other AWS resources in an asynchronous, event-driven environment. Directly involved with the design and implementation of cloud-native automation tools for data backup and node synchronization using multiple AWS SDKs, programming languages and techniques.
Created distributable, multi-layered, innovative Linux-based Docker templates used for orchestrating custom AWS Lambda services that ran inside custom-built containers, using newer processes that supported bringing your own container with packaged application suites (BYOC). These custom Lambda layers were configured and deployed to fit business needs that could not be met with the standard AWS serverless offerings.
Directly involved in creating PowerShell Core and Bash scripts that ran inside AWS Lambda child processes within the custom containers we built. These environments powered the extraction, transformation and loading of data from structured (Microsoft SQL Server containers) and unstructured (MongoDB) databases into AWS DynamoDB and DocumentDB within a larger, more robust Step Functions workflow.
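The child-process pattern above can be sketched as follows. This is a hedged, hypothetical stand-in: the handler shape mimics a Lambda entry point, the shell child just emits a fake record count, and the real database extraction, DynamoDB write and Step Functions hand-off are only noted in comments:

```python
import json
import subprocess

def handler(event, context=None):
    """Lambda-style handler that shells out to a script in the container."""
    # Stand-in for invoking one of the PowerShell/Bash ETL scripts; the child
    # process prints a JSON summary of what it extracted.
    result = subprocess.run(
        ["sh", "-c", "echo '{\"rows_extracted\": 42}'"],
        capture_output=True, text=True, check=True,
    )
    payload = json.loads(result.stdout)
    # In the real workflow this payload would be written to DynamoDB and the
    # Step Functions state machine would advance to the next task.
    return {"statusCode": 200, "body": payload}

print(handler({})["body"]["rows_extracted"])  # → 42
```

Keeping the scripts as child processes means the same Bash/PowerShell ETL logic runs unchanged inside the container and on operator workstations.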
Platform, Tools and Skillsets
I was an active member of the core Backend Production Engineering team responsible for building out and maintaining new features and other complex server-side logic and algorithms, built from the ground up for Belong Gaming's web portal, desktop and mobile platforms. All of the application suites were hosted on Amazon Web Services using cloud-native technologies.
I worked directly with the CTO and senior architects to implement event-driven architecture and server-side frameworks, designed and built core infrastructure and GitOps tools for automating development operations, and extended CI/CD processes and pipelines with GitLab and Linux containers.
Core Responsibilities:
Platform, Tools & Skillsets
A dependable team member that worked independently to specify, plan, design, develop, test and support software components as assigned. Worked directly with the team to establish necessary requirements specifications and test plans for software product validation and was responsible for translating requirements into design and implementation of well-structured and documented software components.
Collaborated with the team to implement new software component designs or enhancements to existing software products; participated in the implementation of more complex subsystems and systems. Troubleshot and debugged issues within existing automation systems and implemented modifications to resolve production issues.
Participated with the team in design reviews and code inspections in a constructive manner. Also ensured adherence to development policies and procedures while taking part in technical design reviews and providing clear, actionable feedback for project team members.
Daily .NET Engineering Activities:
Platform, Tools & Skillsets
My fourth engagement under the umbrella of Codewired came in the form of solution architecture and guidance for a core internal team at AT&T. I was a hands-on DevOps developer and architect working with a small team of smart engineers whose sole aim was to migrate two on-premises Kubernetes clusters and multiple VMs running application suites used by millions of users across the United States.
During my short stint, I created a POC infrastructure and application workload deployment pipeline for the migration project using Azure DevOps and Terraform.
Below are some of the Terraform concepts I used to succeed:
I also took part in the following activities: =>
Architected and designed Azure network services, including Azure Firewall, Hybrid Connectivity, NSGs, Routing, Networking for AKS (Azure Kubernetes Service), APIM (Azure API Management), Azure Front Door, and Application Gateway.
Established and enforced Azure governance best practices, including Identity and Access Management (IAM), Access Control, Azure Policy, and Management Groups.
Collaborated with cross-functional teams to gather requirements and developed technical documentation to showcase the implementation of Azure infrastructure architectures for cross-team reference.
My third engagement with Codewired, Inc. involved DevOps development and orchestrating infrastructure and application CI/CD pipelines on private and public cloud platforms using various programming, scripting and DevOps orchestration tools and technologies (PowerShell & Bash, Python, GoLang, ASP.Net Core 3.1(C#), NodeJS, Kubernetes, Docker, Keel, Helm, Knative, Prometheus, Ambassador, Istio, Jaeger, Grafana and many others).
Some Key Projects:
1. Enhanced, automated infrastructure provisioning for multiple Azure data resources: We designed and built Azure DevOps Infrastructure-as-Code pipelines using Azure RM, PowerShell Core and Bash, fully automating the provisioning of several Azure data resources, including Databricks, Data Factory, Cosmos DB, SQL Server and Redis. We implemented post-deployment strategies with automated scripts to create, retrieve and apply configuration, and to copy access tokens to Key Vaults for resources like Databricks and Data Factory.
2. Enhanced custom pipeline triggers and state management for Azure DevOps pipelines: We created a massive parallel build-and-deployment pipeline on the modern build-only pipeline model, which supported YAML and scripting. We went further and exposed our pipeline as a REST API with custom payloads. Because the client could not use Terraform at the time, I built intelligent state management with PowerShell Core and a Cosmos DB data store. This enabled our pipelines to save the current deployment state and restart intelligently after failures.
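The restart-from-saved-state idea can be sketched as follows. This is a hypothetical Python stand-in for the PowerShell Core + Cosmos DB implementation described above: a dict plays the role of the Cosmos document store, and the stage names are invented for illustration:

```python
# In-memory stand-in for the Cosmos DB state store, keyed by run id.
STATE_STORE = {}

# Invented pipeline stages for illustration.
STAGES = ["provision", "configure", "deploy", "smoke-test"]

def run_pipeline(run_id, fail_at=None):
    """Run stages in order, persisting progress so a restart can resume."""
    completed = STATE_STORE.get(run_id, [])
    for stage in STAGES:
        if stage in completed:
            continue  # intelligent restart: skip stages that already succeeded
        if stage == fail_at:
            raise RuntimeError(f"stage {stage} failed")
        completed = completed + [stage]
        STATE_STORE[run_id] = completed  # persist state after every stage
    return completed

# First attempt fails at "deploy"; the saved state records the progress.
try:
    run_pipeline("run-1", fail_at="deploy")
except RuntimeError:
    pass
print(STATE_STORE["run-1"])   # → ['provision', 'configure']
# The restarted run resumes at "deploy" instead of starting over.
print(run_pipeline("run-1"))  # → ['provision', 'configure', 'deploy', 'smoke-test']
```

Swapping the dict for a Cosmos DB container (one document per run id, upserted after every stage) gives the durable version the pipeline actually used.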
3. Extending Kubernetes with Golang: I helped the team build custom tools and resource definitions designed to operate in concert with Kubernetes facilities in the control plane, as well as enterprise-grade REST APIs we deployed as microservices to worker nodes, using tools and concepts like:
I was later promoted to Associate hands-on Cloud Architect, in which role I contributed significantly to the modernization of EY's enterprise AI-powered platform running on multi-tenant, multi-regional cloud platforms.
Platforms, Tools & Skillsets
This was my second engagement as a consultant from Codewired, Inc., geared towards helping the big-data merchants migrate robust legacy APIs to GraphQL. In my first two weeks, I took on the uphill task of building a POC application to upgrade one major application's Angular frontend from the native HTTP client to the sleek, GraphQL-powered Apollo Client.
The first project was a huge success and led to a second successful project: converting the internals of some ASP.NET Core-powered AWS Lambda serverless applications from their native OpenAPI-powered REST services to an ASP.NET GraphQL server. The Apollo clients from the first project successfully integrated with and consumed the new GraphQL server endpoint. I also worked on Node.js-powered AWS Lambda endpoints implemented in JavaScript and TypeScript. This was a lot to process in a short time, but ultimately it was obvious I hadn't brought a knife to a gunfight.
Platform, Tools & Skillsets:
This was my very first engagement as a consultant for Codewired, Inc., and the mission was to help represent and lead a team of eight offshore engineers to restructure, manage and evolve American Water's homeowners' web portal. In my first month I built boilerplates for the Angular frontend and Node.js backend, and helped normalize the MongoDB database while working with the Kubernetes and MuleSoft API teams to greatly reduce application suite downtimes.
Two months in, I started working with the Data team to improve and increase the usage of Kafka across the organization, introduced Kafka SQL (KSQL), and contributed significantly to the architecture and re-design of core Microsoft Dynamics modules on the CRM platform.
Platform, Tools & Skillsets:
I was a full-time employee of Tata Consultancy Services, deployed to lead a team of onshore and offshore engineers who had built and released the MVP of an Artificial Intelligence-powered analytics portal solution two weeks before I came onboard.
I steered the ship forward by solving critical production bugs and providing better direction and technical solutions for the Kendo UI frontend and the ASP.NET MVC5 backend.
Four months into the project, I started leading discussions, designs and the actual implementation of some of the Azure infrastructure and Big Data solutions the application suites depended on. This led to the re-design and upgrade of the entire system and solution.
Platform and Tools:
Skill Sets
I was a full-time WIPRO employee deployed to the San Francisco East Bay area to join a four-person team of contractors onboarded to Change Healthcare's core development team to completely re-brand five core applications built ten years earlier by the premier technology team of McKesson's Relay Health subsidiary.
In June 2018, after a successful first phase of the re-branding project, I was recommended by my manager to help the DevOps team build infrastructure tools that would completely automate their cloud migration efforts to Azure and AWS.
Platform, Tools
Skill Sets
Convolutional Neural Networks
Deep Learning & Large Language Models
Generative AI Integration with existing systems
Cloud Migration of Monolith and Legacy Applications
Mentoring junior, intermediate and senior developers
DevSecOps and Site Reliability Engineering
Extending Blockchain protocols (WASM & EVM)
Advanced Smart Contracts development
Building robust REST APIs on multiple platforms
Kubernetes administration and development
Big Data Streaming, Extraction, Transformation and Loading