
Experienced, results-oriented, and resourceful Data Engineer with 8 years of diverse experience in the Information Technology field, including development and implementation of applications in Big Data and Cloud environments across storage, querying, and processing.
Working on the migration of patient data for the American Heart Association and the American College of Surgeons. The main aim of the project is to migrate legacy data from traditional databases to Google Cloud Platform (GCP) and distributed environments. As part of this effort, many applications were consolidated onto modern tools, including migrating the Teradata system to the Snowflake database and adopting an advanced schema registry called Nebula to capture the nature of the patient data. Oracle serves as the data warehouse on top of GCP, where ad hoc queries can be run with in-memory techniques at very low latency, scanning even petabyte-scale data in seconds. In addition, implemented lambda jobs to push files from one source to another without manual effort, and have strong expertise in creating IAM roles and managing configuration changes between VPCs. Provisioned a Dev Hadoop cluster on Google Compute Engine instances, installed all the clients through Ambari, and attached EBS volumes to provide enough resources for the development team.
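For illustration only, a minimal sketch of the kind of automated file-transfer lambda job described above, assuming an S3 PUT event trigger; the bucket name, handler name, and event wiring are hypothetical examples rather than the project's actual code:

    import boto3

    s3 = boto3.client("s3")

    DEST_BUCKET = "patient-data-landing"  # hypothetical destination bucket

    def handler(event, context):
        """Triggered by an S3 object-created event; copies each new object
        to the destination bucket so downstream jobs pick it up without
        any manual effort."""
        records = event.get("Records", [])
        for record in records:
            src_bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            s3.copy_object(
                Bucket=DEST_BUCKET,
                Key=key,
                CopySource={"Bucket": src_bucket, "Key": key},
            )
        return {"copied": len(records)}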
Responsibilities: