
Shampa MAKINENI

Sr. Cloud/DevOps Engineer


mrisshie@gmail.com
+1 (434) 767-8112
LinkedIn: www.linkedin.com/in/sai-m-7192b8207

Professional summary:

6+ years of extensive experience in automating, configuring, developing and deploying instances in cloud environments and data centers, using cloud technologies such as AWS and Azure along with software configuration and build/release management tools such as Maven, Chef, Ansible, Puppet, Terraform, Docker, Kubernetes, and monitoring tools.
 Experience with Microsoft Azure Cloud Services (PaaS & IaaS) – Virtual Networks, Virtual Machines, Cloud Services,
Resource Groups, Express Route, Traffic Manager, VPN, Load Balancing, Application Gateway, Auto Scaling.
 Experience deploying Infrastructure as Code (IaC) applications using ARM templates (JSON).
 Worked with Google Cloud (GCP) Services like Compute Engine, Cloud Functions, Cloud DNS, Cloud Storage and Cloud
Deployment Manager and SaaS, PaaS and IaaS concepts of Cloud Computing and Implementation using GCP.
 Experience in Designing, Architecting, and implementing scalable cloud-based web applications using AWS and GCP.
 Experience in writing Infrastructure as Code (IaC) with Terraform, Azure Resource Manager and AWS CloudFormation.
 Created reusable Terraform modules in both Azure and AWS cloud environments.
 Certified AWS Developer with experience in architecting applications, running them in the cloud, and designing network and security for cloud applications. Performed extensive automation using cloud APIs and SDKs to create infrastructure programmatically in the cloud.
 Well versed in using Ansible and Ansible Tower to automate repetitive tasks, to deploy critical applications quickly,
and proactively manage the changes and wrote several playbooks to manage Web applications.
 Highly experienced in writing Ansible playbooks with a Python SSH wrapper to manage configurations of AWS nodes, and in testing playbooks and running Ansible scripts on AWS instances using Python.
 Hands-on experience in configuring Chef Enterprise Server on-premises and workstations, bootstrapping nodes using Knife, and testing Chef recipes and cookbooks with Test Kitchen and ChefSpec.
 Created and deployed a tool to automate branch and project creation in Git using Groovy in Jenkinsfiles, with further automation through Chef and Ansible.
 Experience in installing and configuring Puppet, including installation and configuration of the Puppet master, agent nodes and the admin control workstation.
 Hands-on experience using OpenShift for container orchestration with Kubernetes, container storage and automation to enhance container-platform multi-tenancy.
 Extensive knowledge of using Terraform and Ansible to migrate legacy and monolithic systems to AWS, and of managing Ubuntu, Amazon Linux and RHEL virtual servers on AWS EC2 instances by creating Ansible nodes.
 Worked on optimizing volumes and EC2 instances and created multiple VPC instances. Deployed applications on AWS
using Elastic Beanstalk and Implemented and set up Route53 for AWS Web Instances.
 Experienced in deploying infrastructure as code with Terraform ensuring scalability and fault tolerance using AWS
services like Elastic Load Balancer (ELB) and availability zones.
 Expertise in deployment automation, release management, and provisioning full stacks using AWS CloudFormation and Elastic Beanstalk.
 Created Docker images using Dockerfiles; worked on Docker container snapshots, removing images and managing Docker volumes.
 Virtualized servers using Docker for test- and dev-environment needs, and configured automation using Docker containers.
 Experienced in installing and configuring Splunk to monitor applications deployed on application servers, by analyzing the application and server log files.
 Developed and Implemented Kubernetes manifests for deployment of microservices and installation of Prometheus,
Grafana monitoring pods into Kubernetes.
 Experience in setting up application-metrics dashboards using tools such as Elasticsearch, Kibana and Grafana.
 Relied upon as SME for Splunk, Datadog and Grafana monitoring.
 Skilled with Python, Bash/Shell, PowerShell, Ruby, Perl, YAML and Groovy. Developed shell and Python scripts to automate day-to-day administrative tasks and the build and release process. Skilled in working as a Linux/Unix system administrator on RHEL, Ubuntu and CentOS.
 Good understanding of the principles and best practices of Software Configuration Management (SCM) in Agile,
SCRUM, Waterfall methodologies.
 Wrote PowerShell scripts to automate application deployment and data collection.
 Automated hardware workflows using batch files and shell scripting.
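As an illustration of the day-to-day administrative automation described above, here is a minimal Python sketch that archives stale log files; the directory layout and 30-day retention window are hypothetical, not taken from any specific environment:

```python
import os
import shutil
import time

def archive_old_logs(log_dir, archive_dir, max_age_days=30):
    """Move *.log files older than max_age_days from log_dir to archive_dir.

    Returns the names of the files moved. Paths and the retention period
    are illustrative assumptions, not values from a real deployment.
    """
    os.makedirs(archive_dir, exist_ok=True)
    cutoff = time.time() - max_age_days * 86400
    moved = []
    for name in os.listdir(log_dir):
        path = os.path.join(log_dir, name)
        if name.endswith(".log") and os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            shutil.move(path, os.path.join(archive_dir, name))
            moved.append(name)
    return moved
```

A script like this would typically run from cron; the same pattern extends to cleaning build artifacts or rotating backups.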

Technical Skills:

Cloud Platforms: AWS, Microsoft Azure, Google Cloud Platform (GCP), OpenStack and PCF.

Continuous Integration Tools: Jenkins, Bamboo, TeamCity.

Continuous Deployment Tools: Docker, Kubernetes clusters.

Configuration Management Tools: Ansible, Puppet and Chef.

Source Control Management Tools: Git, Bitbucket and SVN.

Build Tools: Maven, ANT and Gradle.

Tracking Tools: JIRA and Orange Scrum.

Artifact Repositories: Nexus and Artifactory.

Logging & Monitoring Tools: Nagios, Splunk, ELK Stack (Elasticsearch, Logstash & Kibana) and CloudWatch.

Web and Application Servers: Apache, Nginx, JBoss, Apache Tomcat and WebLogic.

Databases: Amazon Aurora, DynamoDB, MongoDB, Oracle, SQL Server, MySQL.

Operating Systems: Windows, Linux/Unix and macOS.

Network Services and Topologies: LDAP, DNS, Web, FTP, subnetting, LAN, VPC, WAN and firewalls.

Scripting Languages: Shell, Python, SQL, XML, HTML, CSS3, Ruby, JSON and YAML.

Cloud Migration & Infrastructure Provisioning Tools: Terraform, CloudFormation and Azure Resource Manager Templates.

PROFESSIONAL EXPERIENCE:
Client: Alaska Airlines, Seattle, WA Jan 20 - Present
Role: SRE/Cloud Engineer

Roles and Responsibilities:

• Configured Azure ExpressRoute to establish connectivity from Azure to the on-premises datacenter. Working knowledge of Azure Service Fabric, microservices, IoT and Docker containers in Azure.
• Expertise in Microsoft Azure Cloud Services (PaaS & IaaS), Application Insights, Document DB, Internet of Things
(IoT), Azure Monitoring, Key Vault, Visual Studio Online (VSO) and SQL Azure.
• Created scripts using Azure PowerShell for automation and the build process; managed hosted plans for Azure (IaaS) infrastructure, implementing and deploying workloads on Azure virtual machines (VMs).
• Designed Azure Resource Manager templates, with extensive experience in designing custom build steps using PowerShell.
• Worked on Azure services (IaaS and PaaS) and storage such as Blob (page and block) and SQL Azure. Well experienced in deployment, configuration management and virtualization.
• Recreated existing application logic and functionality in the Azure Data Lake, Data Factory, SQL Database and SQL Data Warehouse environment. Experience in DWH/BI project implementation using Azure Data Factory.
• Migration of on-premises data (Oracle/ SQL Server/ DB2/ MongoDB) to Azure Data Lake Store (ADLS) using Azure
Data Factory (ADF V1/V2).
• Deployed Azure Resource Manager JSON templates from PowerShell; worked on the Azure suite: Azure SQL Database, Azure Data Lake, Azure Data Factory, Azure SQL Data Warehouse and Azure Analysis Services.
• Built interactive and dynamic action trigger platform using Python, Azure ML to notify trial site for potential
abnormal behavior.
• Implemented predictive analytics/machine learning using Azure Machine Learning (Azure ML) Studio, AWS ML and Python.
• Developed a stream-filtering system using Spark Streaming on top of Apache Kafka, and designed a system using Kafka to auto-scale the backend servers based on event throughput.
• Implemented a CI/CD pipeline using Azure DevOps (VSTS, TFS) in both cloud and on-premises with GIT, MS Build,
Docker, Maven along with Jenkins plugins.
• Implemented Terraform scripts for CloudWatch alerts. Created and maintained highly scalable, fault-tolerant multi-tier AWS and Azure environments spanning multiple availability zones using Terraform and CloudFormation.
• Implemented Azure DevOps Pipelines for CI/CD setup, deployed applications automatically by enabling the triggers to
deploy.
• Set up GCP firewall rules to allow or deny traffic to and from VM instances based on specified configuration, and used GCP Cloud CDN (content delivery network) to deliver content from GCP cache locations, drastically improving user experience and latency.
• Used Apache Airflow in the GCP Cloud Composer environment to build data pipelines, using operators such as the Bash operator, Hadoop operators, and Python callable and branching operators. Built reports for monitoring data loads into GCP and driving reliability at the site level.
• Used a Git repository for storing Terraform files and maintaining versioning. Converted existing Terraform modules that had version conflicts to utilize CloudFormation during Terraform deployments, enabling more control and missing capabilities.
• Used Docker to run different programs on a single VM; built Docker images, setting entry points and volumes; ran Docker containers; and installed Docker and created, tagged and pushed Docker container images. Worked with Agile methodology in XL Deploy and XL Release, building CI/CD automation from scratch with Docker and OpenShift.
• Evaluated Kubernetes for Docker container orchestration. Managed Kubernetes charts using Helm: created reproducible builds of Kubernetes applications, templatized Kubernetes manifests, provided sets of configuration parameters to customize deployments, and managed releases of Helm packages.
• Built and maintained Docker container clusters managed by Kubernetes on Azure, using Linux, Bash, Git and Docker. Utilized Kubernetes and Docker for the runtime environment of the CI/CD system to build, test and deploy.
• Designed and implemented scalable enterprise monitoring systems by applying continuous integration/delivery concepts; maintained and troubleshot the enterprise Red Hat OpenShift systems to continuously improve their speed, efficiency and scalability.
• Implemented OpenShift Container Platform for Docker and Kubernetes; used Kubernetes to manage containerized applications via nodes, ConfigMaps, node selectors and Services, and deployed application containers as Pods.
• Implemented microservices on Red Hat OpenShift based on Kubernetes, etcd and Docker to achieve continuous delivery.
• Installed, configured and managed a centralized Ansible server, created playbooks to support various middleware application servers, and configured Ansible Tower as a configuration-management tool to automate repetitive tasks.
• Worked with Ansible Tower to manage web applications, config files, databases, commands, user mount points and packages, streaming playbook runs in real time to show the status of every running job without further reloads.
• Maintained Jenkins across multiple environments by installing packages on the Jenkins master and slaves and performing regular security updates.
• Migrated Jenkins freestyle jobs to pipeline jobs by writing Jenkinsfiles in Groovy, and wrote helper-function libraries that were reused across Jenkinsfiles.
• Implemented collaborative development environment using GIT, GitHub and integrated it with Jenkins, and
Maintained branches/forks in GitHub version control for the changes made in cookbooks as per release.
• Developed a fully automated continuous integration system using GIT, Jenkins and custom tools developed in Python
and Bash.
• Used the build tool Maven to produce deployable artifacts such as jar, war and ear files from source code, and artifact repositories such as Sonatype Nexus to upload artifacts from Maven and ANT builds using Jenkins.
• Implementing a Continuous Delivery framework using Jenkins, Maven & Nexus in Linux environment and
Implemented Continuous Delivery and Deployment with Ansible and Docker to deploy the applications as a container.
• Set up and maintained logging and monitoring subsystems using tools like Elasticsearch, Fluentd, Kibana, Prometheus, Grafana and Alertmanager.
• Introduced the Node Exporter tool to the project to export node metrics from Kubernetes deployments to Prometheus, thereafter visualizing the metrics data in Grafana.
• Established infrastructure and service monitoring using Prometheus and Grafana. Built dashboards and visualizations on top of MapR-DB and Hive using Oracle Data Visualizer Desktop, and built real-time visualizations on top of OpenTSDB using Grafana.
• Wrote Groovy scripts for multibranch pipeline projects in Jenkins, configuring them per client requirements; configured continuous integration from source control by setting up build definitions within Visual Studio Team Services (VSTS), and configured continuous delivery to automate deployment of ASP.NET MVC applications to Azure Web Apps.
• Experience in query optimization and SQL tuning using SQL Profiler, Database Tuning Advisor (DTA), Index Tuning Wizard and Performance Monitor for performance tuning.
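The Helm-style templating of Kubernetes manifests mentioned above can be sketched in plain Python; the application name, image and replica count below are hypothetical placeholders rather than values from the actual pipelines:

```python
def deployment_manifest(name, image, replicas=2, port=8080):
    """Build a minimal Kubernetes Deployment manifest as a Python dict,
    mimicking how a chart substitutes values into a template.
    All parameter values here are illustrative assumptions.
    """
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            # the selector must match the pod template's labels
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {
                    "containers": [{
                        "name": name,
                        "image": image,
                        "ports": [{"containerPort": port}],
                    }]
                },
            },
        },
    }
```

Serializing this dict to YAML or JSON yields a manifest that `kubectl apply` accepts; parameterizing it per environment is the essence of the templating the bullet describes.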

Client: SunNet Solution, Houston, Texas July 18 - Jan 20

Role: Cloud/DevOps Engineer.

Roles and Responsibilities:

• Designed, configured and managed public cloud infrastructure on Amazon Web Services (AWS), including launching EC2 instances with auto scaling, high availability and fault tolerance, Elastic Load Balancer, CodeBuild, Elastic Beanstalk, S3, Lambda, Glacier, CloudFront, RDS, VPC, Direct Connect, Route 53, CloudWatch, CloudFormation, IAM and SNS.
• Created S3 buckets, managed bucket policies, and used S3 for storage, backup and archival in AWS. Worked with AWS Lambda, which runs code in response to events, and implemented API Gateway and authentication.
• Assisted application teams in creating complex IAM policies for administration within AWS, and maintained DNS records using Route 53, managing DNS zones and assigning public DNS names to elastic load balancers.
• Migrated On-Premises application servers and databases to AWS cloud and performed continuous data replication
using Cloud Endure to migrate large scale databases to cloud during DR setup.
• Worked on AWS Elastic Beanstalk for fast deployment of various applications developed with Java, PHP, Node.js, Python, Ruby and Docker on familiar servers such as Apache and IIS.
• Designed and implemented scalable, secure cloud architecture based on Amazon Web Services, leveraging AWS services such as Fargate, EC2, S3, Docker containers, CloudFormation (JSON templates) and Virtual Private Cloud.
• Wrote Ansible playbooks with a Python SSH wrapper to manage configuration of AWS nodes, tested playbooks on AWS instances using Python, and ran Ansible scripts to provision Dev servers.
• Created and managed IT infrastructure and application stacks using AWS CloudFormation, writing the templates in JSON.
• Wrote Terraform scripts from scratch for building Dev, Staging, Prod and DR environments.
• Handled infrastructure buildout, maintenance and automation: collaborated with the infrastructure team to maintain servers using Terraform for provisioning and Ansible for automating software configuration, with servers spread across multiple AWS regions and availability zones.
• Created Terraform modules to create instances in AWS & automated process of creation of resources in AWS using
Terraform.
• Developed container-based deployments using Docker, working with Docker images, Docker Hub and Docker registries, providing data to registries as needed; created Docker containers and Docker consoles for managing the application life cycle.
• Building and maintaining Docker container clusters managed by Kubernetes, Linux, Bash, GIT, Docker. Utilized
Kubernetes and Docker for the runtime environment of the CI/CD system to build, test, deploy.
• Implemented Kubernetes deployments, volumes and network policies, and used the kubectl command-line utility. Used Kubernetes to orchestrate the containerized applications deployed on node machines, managing objects such as Replication Controllers, Deployments, Services, and liveness and readiness probes.
• Configured monitoring tools such as Goldpinger and Prometheus to gather metrics from every Kubernetes cluster, and handled many Kubernetes cluster and node failure-management problems by troubleshooting the errors.
• Automated the deployment of a set of security and monitoring tools dedicated to each Kubernetes cluster, separating them by cluster type based on the kinds of applications deployed into that cluster.
• Developed Ansible Playbooks using YAML scripts for launching different EC2 virtual servers in the cloud using Auto-
Scaling and Amazon Machine Images (AMI).
• Leveraged Ansible and Ansible Tower, creating multiple playbooks to manage web apps, OS files, databases, commands, mount points and packages.
• Configured Ansible Tower, which provides an easy-to-use dashboard and role-based access control, so that it's easier
to allow individual teams access to use Ansible for their deployments.
• Created a CI/CD workflow incorporating Git, Maven and other deployment tools using Jenkins. Configured Jenkins to run periodic builds and set triggers to automate important builds on the fly.
• Implemented CI/CD pipeline to pull the code from git repository, build and deploy products as artifacts using tools like
git, maven, and JFrog Artifactory on Jenkins.
• Designed and implemented a CI (continuous integration) system, configuring Jenkins servers and nodes by writing the required scripts (Bash and Python) and creating and configuring VMs.
• Used Git version control to manage source code, integrated it with Jenkins to support build automation, and integrated with JIRA to monitor commits. Worked with Ansible Tower for scheduling playbooks, stored the playbooks in a Git repository, and implemented a continuous deployment pipeline with Jenkins.
• Performed SVN to Bitbucket migration and managed branching strategies using GIT workflow. Managed User access
control, Triggers, workflows, hooks, security, repository control in Bitbucket.
• Created snippets that allow developers to share code segments, and generated pull requests for code review and comments using Bitbucket.
• Utilized Splunk and New Relic for monitoring of logging, software, operating system and hardware resources, and used these monitoring tools to verify the health of instances on the AWS platform.
• Installed and configured the mod_jk and mod_cluster plugins for JBoss, and implemented JBoss clusters for load balancing and failover.
• Configured the data sources, message queues and security domains for applications on the JBoss application server, and created JBoss-specific configurations.
• Developed Java programs on the application side, and developed Ruby/Python scripts to monitor the health of Mongo databases and perform ad-hoc backups using mongodump and mongorestore.
• Created scripts for data modeling and data import/export. Extensive experience in deploying, managing and developing MongoDB clusters, and in writing JavaScript for DML operations with MongoDB.
• Created, configured and monitored shard sets: analyzed the data to be sharded and chose shard keys to distribute data evenly. Performed architecture and capacity planning for MongoDB clusters, and implemented scripts for MongoDB import, export, dump and restore.
• Implemented MongoDB database concepts such as locking, transactions, indexes, sharding, replication and schema design. Created multiple databases with sharded collections, choosing shard keys based on requirements. Experienced in managing the MongoDB environment from availability, performance and scalability perspectives, and created various types of indexes on different collections for good query performance.
• Created Datadog dashboards for various applications and monitored real-time and historical metrics. Set up Datadog monitoring across different servers and AWS services, automated Datadog dashboards within the stack through Terraform scripts, and created system alerts using various Datadog tools, alerting application teams based on the escalation matrix.
• Installed Sumo Logic collectors, configured SSH/proxy, and deployed applications in both production and non-production environments. Monitored the health of microservices using Sumo Logic by creating dashboards.

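The Python SSH wrapper around Ansible described above boils down to assembling and running `ansible-playbook` invocations. A minimal sketch, with hypothetical playbook and inventory names, of the command-builder half of such a wrapper:

```python
def ansible_playbook_cmd(playbook, inventory, limit=None, extra_vars=None, check=False):
    """Assemble an `ansible-playbook` command line as an argument list,
    the way a thin Python wrapper would before passing it to subprocess.
    The playbook/inventory paths used by callers are illustrative only.
    """
    cmd = ["ansible-playbook", playbook, "-i", inventory]
    if limit:
        cmd += ["--limit", limit]                 # restrict to a host group
    for key, value in (extra_vars or {}).items():
        cmd += ["-e", f"{key}={value}"]           # pass extra variables
    if check:
        cmd.append("--check")                     # dry run, no changes applied
    return cmd
```

In the real wrapper this list would be handed to `subprocess.run(cmd, check=True)`; building it as a list (rather than a shell string) avoids quoting bugs.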
Client: Lincoln Financial Group, Radnor, PA Nov 17 - July 18

Role: DevOps engineer

Roles and Responsibilities:

• Involved in designing and deploying multiple applications utilizing the AWS stack, and implemented AWS solutions including EC2, S3, IAM, EBS, Elastic Load Balancer (ELB), Security Groups and Auto Scaling.
• Automated and implemented CloudFormation stacks for creating AWS resources such as VPCs, subnets, gateways, Auto Scaling groups, Elastic Load Balancers (ELB) and DB instances across different availability zones.
• Performed user management at the IAM (AWS console) level, creating roles to allow multiple users to switch roles and editing trust relationships to allow switching from the main account to other accounts, as well as at the AWS instance level.
• Automated cloud deployment using Chef, Python and AWS CloudFormation templates, and used Chef for unattended bootstrapping in AWS. Replaced existing manual deployment and management processes with Chef and AWS OpsWorks stacks.
• Set up Docker to automate container deployment through Jenkins, and worked on Docker containers to create Docker images for different environments.
• Developed Docker images using Dockerfiles, worked with Docker container snapshots and managed Docker volumes; also deployed Docker Swarm using Ansible.
• Worked with Kubernetes for container management, running Docker containerized applications in a cluster of EC2 instances in a Linux environment.
• Worked with Chef data bags, attributes, cookbooks, recipes and templates, and created jobs for the Chef client to interact with the Chef server on a periodic basis.
• Implemented Chef recipes for deployments on internal data-center servers, and reused and modified the same recipes to deploy directly onto Amazon EC2 instances.
• Used TeamCity Enterprise CI with distributed builds supporting all environments to run builds, promotions and deployments; used shell scripts to automate the deployment process, uploading code to GitHub and generating build numbers with TeamCity.
• Built and deployed Java/J2EE applications to Tomcat application servers in an Agile continuous-integration process, automating the whole process with Maven, and automated weekly releases with Maven scripting for compiling Java code, debugging, and placing builds into the Maven repository.
• Used Jenkins for nightly builds and tests. Installed multiple plugins for smooth build and release pipelines, and created a master/slave configuration to run multiple parallel builds.
• Carried out deployments and builds on various environments using Jenkins and developed Jenkins build pipeline jobs
using groovy for Node.js and Java-based applications.
• Integrated Jenkins with private GitHub repositories and the Nexus artifact repository for pushing successful build code, using Maven as the build automation tool.
• Used Nagios as a monitoring tool to identify and resolve infrastructure problems before they affect critical processes, and worked on Nagios event handlers for automatic restart of failed applications and services.
• Configured Splunk Searching and Reporting modules, knowledge objects, administration, add-ons, dashboards, clustering and forwarder management.
• Created Splunk Search Processing Language (SPL) queries, Reports, Alerts, and Dashboards.
• Created KV stores optimized for Splunk real-time performance, and was responsible for managing and troubleshooting MongoDB to model the data for correlation searches.
• Wrote Python scripts for pushing data from DynamoDB to a MySQL database, and created and maintained Python deployment scripts for the WebSphere web application server.
• Administration and support of homogeneous production and development server infrastructure of multiple flavors of
Linux.
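The DynamoDB-to-MySQL scripts mentioned above hinge on unmarshalling DynamoDB's attribute-value JSON into flat rows. A minimal sketch of that conversion step, covering only the common type descriptors (the column names in the usage example are hypothetical):

```python
def dynamodb_item_to_row(item):
    """Convert a DynamoDB item in attribute-value format,
    e.g. {"id": {"N": "7"}, "name": {"S": "alice"}},
    into a flat dict suitable for a parameterized MySQL INSERT.
    Handles S/N/BOOL/NULL descriptors; a real migration script
    would also cover lists, maps and binary types.
    """
    row = {}
    for column, descriptor in item.items():
        (dtype, value), = descriptor.items()  # each attribute has one type key
        if dtype == "S":
            row[column] = value
        elif dtype == "N":
            # DynamoDB stores numbers as strings; cast to int or float
            row[column] = float(value) if "." in value else int(value)
        elif dtype == "BOOL":
            row[column] = bool(value)
        elif dtype == "NULL":
            row[column] = None
        else:
            raise ValueError(f"unsupported DynamoDB type: {dtype}")
    return row
```

The resulting dict maps directly onto a `INSERT INTO table (cols) VALUES (%s, ...)` statement executed through a MySQL driver.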

Client: Pointsbet, Denver, Colorado Sept 16 - Nov 17

Role: Java/DevOps Engineer

Roles and Responsibilities:


• Built, tested, and deployed scalable, highly available and modular software products and strengthened developmental
methodologies by introducing a code quality document.
• Supported developers in front-end development using JSP, AngularJS, Html, CSS and back-end using RESTful web
services.
• Built application platform foundation to support migration from client-server product lines to enterprise architectures
and services.
• Installed, configured and deployed VMware products such as VMware Workstation, VMware Converter, VMware vSphere, vCenter Server and Appliance, vMotion, HA, DRS, vCenter Operations Manager (vCOPs), vCenter Service Manager, vCenter Configuration Manager (VCM), vRealize Operations (vROps), Site Recovery Manager, Horizon View, Cloud Director, Cloud Connector Server, vCenter Connector Node and vShield Manager.
• Experience in installing, configuring and troubleshooting VMware View Composer, View Connection Server, View Transfer Server, View Replica Server, ThinApp, View Agent and View Client to make the Virtual Desktop Infrastructure efficient and simplify desktop administrative management tasks.
• Implemented VDI Horizon View Virtual desktop infrastructure technology using VMware View.
• Performed server consolidation with VMware Converter (P2V and V2V conversions).
• Virtualized servers using Docker for test- and dev-environment needs, and configured automation using Docker containers.
• Experienced in CI/CD with Jenkins; implemented build automation for new projects using Jenkins and Maven as the build framework tools.
• Installation, configuration and administration of Virtual Box and VMware virtual machine for RHEL, Ubuntu Linux
servers.
• Wrote Puppet manifests for various DB configurations to modularize and optimize end-product configuration.
• Worked with the Puppet administrator: added new Puppet Enterprise nodes to the master, deactivated nodes, troubleshot connection issues, investigated event application and reporting issues, and started or restarted the Puppet Enterprise services.
• Used webhooks to push commits from Git to Jenkins, wrote Groovy scripts to automate Jenkins pipelines, set up automated periodic builds, and set alerts to notify after each build.
• Built end-to-end CI/CD pipelines in Jenkins and wrote a code-deployment guide for developers, testers and production management.
• Expertise in all areas of Jenkins, including plugin management, securing and scaling Jenkins, and integrating code analysis, performance, analytics and test phases to complete the CI/CD pipelines within Jenkins.
• Experienced in several areas of Jenkins like master/slave administrations, access controls, report generations.
• Implemented microservices and service-oriented architecture using Spring Boot with XML-based web services (SOAP/WSDL), using top-down and bottom-up approaches.
• Designed and developed the REST based Micro Services using the Spring Boot and Spring Cloud.
• Developed highly interactive and customized UIs using JavaScript, HTML, JSP, and CSS to improve functionality of web
applications.
• Performed unit testing of applications by developing and applying test cases in JUnit.
• Created interactive UIs that surpassed client objectives and improved user experience.

Client: Persistent Systems, India March 15 – July 16

Role: Java Developer


Roles and Responsibilities:

• Used Java/J2EE technologies to develop web applications for client-server environments and added functionality to existing applications.
• Revamped various Java applications developed with Spring, Hibernate and older J2EE technologies.
• Conducted user-requirements analysis to design and program applications, and delivered support for system enhancements.
• Involved in design, development and testing of the application.
• Implemented the object-oriented programming concepts for validating the columns of the import file.
• Played an important role in writing JUnit test-case scenarios for all the validations.
• Involved in every phase of the SDLC.
• Responsible for changing the GET and CHANGE request according to the requirement.
• Responsible for creating RESTful web services.
• Experience using SoapUI to test existing services; responsible for consuming web services from WSDL.
• Rendered solid technical expertise in Software Development Life Cycle and core Java technologies to develop
applications based on specific client requirements.
• Development of different Application modules using J2EE, Struts, Oracle and Hibernate.
• Developed persistence mapping files (persistence.xml) and domain objects with Hibernate, and worked on optimization of Hibernate domain mappings, including read-only entities and lazy loading.

Certifications:
AWS CERTIFIED DEVELOPER: https://www.credly.com/badges/8c37f897-cd85-49e1-b28b-1cce5ea1beea/public_url

AZURE DEVOPS ENGINEER: https://www.credly.com/badges/b1b6875f-661e-4096-ab26-a5546eaa5294/public_url
