
Abdul Khayuum

abdulkhayuum@gmail.com
+91-9865139663

Objective
Seeking a challenging assignment that allows me to apply my broad skills and expertise to make a positive
difference to the organization.

Technical Expertise

• Over 8 years of experience in the IT industry, covering various aspects of software development, Hadoop
administration, and DevOps.
• Strong experience with Hadoop distributions such as Hortonworks Data Platform (HDP).
• Certified Hadoop Administrator.
• In-depth understanding of Hadoop architecture and components such as HDFS, YARN, Hive, Kerberos,
ZooKeeper, Kafka, Ranger, and HBase.
• Automated incident creation in the ServiceNow/AskNow ticketing tool for Hadoop service failures.
• Hands-on experience with version control tools such as Git, GitHub, and Bitbucket.
• Expertise in DevOps tools such as Git, Maven, SonarQube, Nexus/JFrog Artifactory, Jenkins, Docker, and
Kubernetes, along with shell scripting.
• Worked extensively with Jenkins for end-to-end continuous integration, automating all builds and
deployments; experienced with monitoring and notification functionality to report build success or failure.
• Good understanding of Amazon Web Services (AWS): creating EC2 instances, S3, Auto Scaling, ELB, IAM, VPC,
and configuring all necessary services.
• Experience with release and deployment in Java/J2EE and web application environments.
• Hands-on experience installing Ansible and configuring multiple hosts for automated deployment.
• Experience with the Docker Swarm and Kubernetes container orchestration tools to deploy and maintain
containers/pods.

Professional Experience

• Hadoop Platform Engineer at Visa Inc., March 2019 to Present
• Lead Administrator at Dell International Services, September 2015 to March 2019
• Business Analyst at Oracle India Private Limited, Bangalore, June 2013 to September 2015
• Junior Software Engineer at Ideation R&D Labs, July 2012 to May 2013

Technical Skills

Big Data Technologies: HDP services (HDFS, YARN, Hive, ZooKeeper, Kafka, Kerberos, Sqoop)
Version Control Systems: Git, GitHub
Operating Systems: Windows, Linux
Configuration Management: Ansible
NoSQL Database: HBase
Scripting: Shell scripting, Python
Application Servers: Apache Tomcat, JBoss
Continuous Integration: Jenkins
Containerization: Docker, Docker Swarm, Kubernetes
Bug Tracking: JIRA, AskNow
Artifact Repositories: Nexus, JFrog Artifactory
Projects

Project Name: Visa Kubernetes Platform (Pharos)

Visa’s Machine Learning Platform (MLP) is a fully managed, multi-tenant, scalable platform to build and support
offline Machine Learning models. MLP offers the latest data science tools, seamless integration, and a secure environment
delivered as Platform-as-a-Service (PaaS).

Responsibilities:
• Built and maintained Docker container clusters managed by Kubernetes on GCP (Google Cloud Platform),
using Linux, Git, and Docker; utilized Kubernetes and Docker as the runtime environment of the CI/CD
system to build, test, and deploy.
• Designed and implemented a continuous build-test-deployment (CI/CD) system with multiple component
pipelines using Jenkins to support weekly releases and out-of-cycle releases based on business needs.
• Involved in developing test environments on Docker containers and configuring the containers using
Kubernetes.
• Monitored microservices using Grafana and Prometheus to perform root-cause analysis of failed services.
• Enhanced and supported existing microservices.
• Responsible for onboarding new users to the Kubernetes cluster.
• Provided end-to-end support to ensure applications run well in the Kubernetes cluster.
• Tracked CPU/RAM utilization of all pods.
• Monitored Kubernetes system metrics in Grafana.
• Responded to incidents and wrote blameless RCAs/postmortems.
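The pod CPU/RAM tracking above can be scripted in a few lines. A rough sketch, assuming metrics-server is installed so `kubectl top pods` works; the thresholds and the sample output below are illustrative, not production values.

```python
# Thresholds are illustrative; tune for the cluster.
CPU_LIMIT_MILLI = 500   # 500m of CPU
MEM_LIMIT_MI = 1024     # 1 GiB of memory

def hot_pods(top_output, cpu_limit=CPU_LIMIT_MILLI, mem_limit=MEM_LIMIT_MI):
    """Parse `kubectl top pods --no-headers` output (columns: NAME CPU MEMORY)
    and return names of pods exceeding either threshold."""
    hot = []
    for line in top_output.strip().splitlines():
        name, cpu, mem = line.split()
        cpu_m = int(cpu.rstrip("m"))    # e.g. "900m" -> 900
        mem_mi = int(mem.rstrip("Mi"))  # e.g. "512Mi" -> 512
        if cpu_m > cpu_limit or mem_mi > mem_limit:
            hot.append(name)
    return hot

if __name__ == "__main__":
    # On the cluster this text would come from:
    #   subprocess.check_output(["kubectl", "top", "pods", "--no-headers"], text=True)
    sample = "web-1 250m 300Mi\nbatch-2 900m 512Mi\ncache-3 100m 2048Mi\n"
    print(hot_pods(sample))  # ['batch-2', 'cache-3']
```

A script like this can feed Grafana/Prometheus alerting or a cron-driven report.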

Project Name: India Localization

Deliver the technology solution for a domestic processing system in India that is compliant with Reserve Bank of India
(RBI) regulations.

Responsibilities:

• Designed, installed, configured, and maintained HDP 2.6/HDP 3 clusters for application development and
production.
• Built two Hadoop clusters, with the HDP 2.6.5 and 3.1 stacks respectively.
• Responsible for cluster maintenance: adding and removing cluster nodes, cluster monitoring,
troubleshooting, and managing and reviewing data backups and Hadoop log files.
• Engaged with all stakeholders for Hadoop upgrade/patching activities in production.
• Day-to-day responsibilities included resolving developer issues.
• Onboarded new users/service IDs by granting appropriate access.
• Set up Jenkins jobs for code deployments from one environment to another.
• Provided immediate solutions to reduce impact, documented them, and prevented recurrence.
• Secured the cluster by integrating Kerberos and Ranger with Active Directory.
• Onboarded new projects, allocating space and file quotas per project norms.
• Responsible for cluster availability; provided 24x7 on-call support.
• Monitored running applications and guided developers on improving DB performance.
• Integrated Ambari with the ServiceNow incident management tool using Python and its REST API for
automatic incident creation.
• Scheduled jobs using D-Series and Linux crontab.
• Integrated Prometheus with the Hadoop clusters.
• Wrote shell scripts to deploy and automate small services across all nodes, monitor the health of Hadoop
daemon services, and respond to any warning or failure conditions.
• Configured queues and node labels with the Capacity Scheduler.
• Troubleshot Hadoop cluster issues and fixed them.
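One common shape for the queue and node-label configuration mentioned above is a capacity-scheduler.xml fragment like the following; the queue names, capacities, and the "gpu" label are illustrative placeholders, not the actual production values.

```xml
<!-- Sketch of capacity-scheduler.xml: two hypothetical queues under root. -->
<configuration>
  <property>
    <name>yarn.scheduler.capacity.root.queues</name>
    <value>etl,adhoc</value>
  </property>
  <property>
    <!-- Percentage of cluster capacity guaranteed to each queue. -->
    <name>yarn.scheduler.capacity.root.etl.capacity</name>
    <value>70</value>
  </property>
  <property>
    <name>yarn.scheduler.capacity.root.adhoc.capacity</name>
    <value>30</value>
  </property>
  <property>
    <!-- Restrict the etl queue to nodes carrying an illustrative "gpu" label. -->
    <name>yarn.scheduler.capacity.root.etl.accessible-node-labels</name>
    <value>gpu</value>
  </property>
</configuration>
```

Changes to this file take effect after refreshing the queues (e.g. `yarn rmadmin -refreshQueues`).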
Proof of Concept: AskNow - Ambari Integration

Languages: Python, shell script

Description
This project integrated ServiceNow, the ticketing portal at Dell, with Ambari, which provides the UI for the
Hortonworks distribution.

Roles & Responsibilities


• Used curl commands to get service statuses and filter out services that are not running.
• Wrote a Python script to cleanse the curl output.
• Wrote a bash script combining the curl command and the Python script for execution on Linux.
• Automated this job using D-Series.
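The filtering step above can be sketched as a small Python function; the cluster name, host, and credentials in the comment are placeholders, and the code assumes the standard JSON shape returned by Ambari's `/api/v1/clusters/<cluster>/services?fields=ServiceInfo/state` endpoint.

```python
import json

def stopped_services(ambari_json):
    """Given the JSON body from Ambari's services endpoint, return the
    names of services that are not in the STARTED state."""
    body = json.loads(ambari_json)
    return [item["ServiceInfo"]["service_name"]
            for item in body.get("items", [])
            if item["ServiceInfo"]["state"] != "STARTED"]

if __name__ == "__main__":
    # In the POC the JSON came from curl, e.g. (placeholder host/credentials):
    #   curl -u admin:*** "http://ambari-host:8080/api/v1/clusters/prod/services?fields=ServiceInfo/state"
    sample = json.dumps({"items": [
        {"ServiceInfo": {"service_name": "HDFS", "state": "STARTED"}},
        {"ServiceInfo": {"service_name": "KAFKA", "state": "INSTALLED"}},
    ]})
    print(stopped_services(sample))  # ['KAFKA']
```

Each name returned here would then be passed to the ServiceNow/AskNow API to raise an incident.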

Passport Details

Passport Number: K5802865
Date of Issue: 18/09/12
Expiry Date: 17/09/22
Place of Issue: Chennai, INDIA
