
Piyush Patel (US Citizen)
Lead Python Developer
piyushpatellead@gmail.com | +1-314-590-2018

Summary:
 10+ years of experience in Analysis, Design, Development, Testing, Customization, Bug fixes,
Enhancement, Support and Implementation of various web, stand-alone, client-server enterprise
applications using Python, Django in various domains.
 Experienced in working with various stages of Software Development Life Cycle (SDLC),
Software Testing Life Cycle (STLC) and QA methodologies from project definition to post-
deployment documentation.
 Experience with design, coding and debugging of operations, reporting, data analysis and web applications utilizing Python.
 Developed GUI using JavaScript, HTML/HTML5, DOM, XHTML, AJAX, CSS3, jQuery and
Angular8 in ongoing projects.
 Strong development experience in Java, Servlet, JSP, Spring, Struts, Hibernate, Angular8,
JavaScript, JQuery, HTML5, CSS3, SOAP & REST Web Services, AJAX.
 Converted the mock-ups into hand-written HTML5, CSS (2/3), XHTML, JavaScript, jQuery,
AJAX, XML and JSON.
 Experienced in implementing object-oriented Python, hash tables (dictionaries), multithreading, exception handling and collections in Python, along with Django and MySQL.
 Worked with MVW frameworks like Django, Angular JS, HTML, CSS, XML, JavaScript, jQuery, Bootstrap.
 Hands-on experience fetching live stream data from DB2 to HDFS using Spark Streaming and Apache Kafka.
 Deeply involved in writing complex Spark/Scala scripts with Spark context and Cassandra SQL context, using multiple APIs and methods that support data frames, RDDs and Cassandra table joins, and finally writing/saving the data frames/RDDs to the Cassandra database.
 Knowledge on integrating different eco-systems like Kafka - Spark - HDFS.
 Good knowledge in Apache Spark and Spark SQL.
 Experience in running Spark streaming applications in cluster mode and Spark log debugging.
 Experienced in developing Web Services with Python programming language and Good working
experience in processing large datasets with Spark using Scala and Pyspark.
 Knowledge on Spark framework for batch and real-time data processing.
 Good knowledge in Amazon AWS concepts like EMR and EC2 web services, which provide fast and efficient processing of Big Data.
 Experience in developing applications using Amazon Web Services like EC2, Cloud Search, Elastic Load Balancer (ELB), S3 and CloudFront.
 Expertise in JSON, IBM FileNet P8, Event Stream Processing (ESP), Scala, Linux, GoLang, Adobe Flex, AngularJS, Python, JIRA and AWS (Amazon Web Services), and proficient in cloud application development tools and deployment methods.
 Experience in data manipulation using Big Data Hadoop ecosystem components Map-Reduce, HDFS, Yarn/MRv2, Pig, Hive, HBase, Spark, Kafka, Flume, Sqoop, Oozie, Avro, AWS, and Spark integration with Cassandra, Solr and Zookeeper.
 Extensive experience in working with Cloudera (CDH4 & 5) and Hortonworks Hadoop distros and AWS Amazon EMR, to fully leverage and implement new Hadoop features.
 Hands on experience with data ingestion tools Kafka, Flume and workflow management tools
Oozie.
 Having hands-on experience in migrating SAS code to Python as well as EASYTRIEVE code to SAS.
 Proficient in creating, joining, merging and maintaining datasets using Python, Pandas, SAS and SQL, and well exposed to working with SAS/MACRO on Windows and mainframes.
 Experience in accessing data with Python and SAS from databases such as Oracle and DB2.
 Extensive experience with merging, concatenating and interleaving SAS datasets.
 Experience in developing SAS procedures, macros, and applications for data cleaning,
programming, reporting and documentation
 Strong experience and knowledge of real-time data analytics using Spark Streaming, Kafka and Flume. Good experience in writing Spark applications using Python and Scala.
 Experience in developing Spark programs for batch and real-time processing, including Spark Streaming applications for real-time processing.
 Experience in developing code; created tables, constraints and keys and loaded tables using PCO programs, DCL, SQL scripts and SQL*Loader.
 Experienced in developing Web Services with the Python programming language - implementing JSON-based RESTful and XML-based SOAP web services (a minimal sketch appears at the end of this summary).
 Experience in using design patterns such as MVC and Singleton and frameworks such as Django; able to work with the Django ORM (Object-Relational Mapper) and SQLAlchemy.
 Proficient in Python OpenStack APIs and the Pyjamas GUI framework (for web).
 Proficient in performing data analysis and data visualization using Python libraries in PyCharm.
 Scaled up projects using Python tools like multithreading and Celery.
 Experience in using Version Control Systems like GIT, SVN and CVS to keep the versions and
configurations of the code organized.
 Experience as a Python Developer and proficient coder in multiple languages and environments including Python, Java/J2EE, REST APIs, AWS, C, C++ and SQL; familiar with shell scripting (Bash) and Perl.
 Experience in UNIX/Linux shell scripting for job scheduling, batch-job scheduling, automating
batch programs, forking and cloning jobs.
 In-depth knowledge of computer applications and scripting like Shell, Bash and Python.
 Experience in Amazon Web Services (AWS) cloud platform like EC2, Virtual private clouds
(VPCs), Storage models (EBS, S3, instance storage), Elastic Load Balancers (ELBs).
 Experienced in developing API services in Python/Tornado, while leveraging AMQP and
RabbitMQ for distributed architectures.
 OpenVMS installation, management, scripting, boot process and startup; VMS clusters; CIFS; DCL scripting; DECnet.
 Developed tools using DCL scripts and lexical functions to enhance productivity.
 Experience working within a team and using team collaboration tools such as TFS to build web application projects under Agile and Waterfall environments.
 Experience in developing ColdFusion Components, custom tags and modified CF Objects.
 Experience with Unit testing/ Test driven Development (TDD), Load Testing.
 Experienced with build tools such as Ant, Apache Maven and Jenkins.
 Experience in deploying applications in heterogeneous application servers such as Tomcat, WebLogic and Oracle Application Server.
 Good Experience on testing tools like JIRA and Bugzilla for bug tracking.
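
As a minimal illustration of the JSON-based RESTful web services noted above (a sketch only; the endpoint and payload names are hypothetical, not taken from any specific project):

from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/api/items", methods=["GET", "POST"])
def items():
    # POST: echo the parsed JSON body back to the caller
    if request.method == "POST":
        payload = request.get_json()
        return jsonify({"created": payload}), 201
    # GET: placeholder listing response
    return jsonify({"items": []})

if __name__ == "__main__":
    app.run()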

PROFESSIONAL EXPERIENCE

Panacea IT, FL (Remote)
Dec' 17 to Present
Python Developer/Lead

Responsibilities:
 Designed front end and backend of the application using Python on Django Web Framework
 Used HTML, CSS, Bootstrap, AJAX and JSON to design and develop the user interface of the website.
 Used Python libraries like Beautiful Soup, NumPy and SQLAlchemy.
 Worked with the Spark for improving the performance and optimization of the existing
algorithms in Hadoop using Spark Context, Spark-SQL, Data Frame, Pair RDD's, Spark YARN.
 Serializing JSON data and storing the data into tables using Spark SQL.
 Involved in converting Hive/SQL queries into Spark transformations using Spark RDD's
and Scala.
 Designing automation test architecture and then automating the test scripts using automation
tools like QuickTest Pro (QTP), Load Runner and performing synthetic monitoring using HP
Sitescope for web applications & database servers.
 Create Single Page Application (SPA) using HTML5, CSS3, SCSS, Bootstrap, JavaScript,
JSON, Angular 8, and Typescript 3.3.
 Analyze client processes, identify automation opportunities, define the RPA value proposition, reengineer processes to improve automation potential and recommend the RPA approach/strategy.
 Drive the full design and development of high-quality technical solutions.
 Consulting with business users to identify current operating procedures and clarify program
objectives.
 Build automation scripts using JavaScript to integrate with existing code.
 Develop Automation Meta bots and configurable .NET scripts to be integrated with Automation
Anywhere.
 Make recommendations about new and/or existing code and make necessary changes.
 Worked with Bot Central, a client-specific RPA and chatbot portal, to onboard, manage and track bots.
 Set up the Angular project, defining its layout, folder hierarchy, modules, routers and Angular components, and published the project to the server for the team to start working on it.
 Consumed the data from Kafka using Apache Spark.
 Extensively worked with Avro and Parquet files and converted the data between the two formats; parsed semi-structured JSON data and converted it to Parquet using data frames in PySpark.
 Performed Kafka analysis, feature selection and feature extraction using Apache Spark machine learning.
 Used Different Spark Modules like Spark core, Spark SQL, Spark Streaming, Spark Data sets
and Data frames.
 Wrote Lambda functions in Python for AWS Lambda and invoked Python scripts for data transformations and analytics on large data sets in EMR clusters and AWS Kinesis data streams (see the Lambda sketch after this list).
 Worked on integrating AWS DynamoDB using AWS Lambda to store the items' values and back up the DynamoDB streams.
 Extensively worked in migrating traditional applications to the AWS cloud using S3, EC2, EMR, Redshift and Lambda.
 Possesses solution implementation experience leveraging AWS cloud stack components (S3, RDS, Redshift, EMR, Lambda), NoSQL, Python, PySpark, AWS Glue, Talend, Tableau, Alteryx, the Qlik toolset, Hadoop, Cloudera, Hive, Sqoop, Impala and Spark/Kafka.
 Develop and execute scripts on AWS Lambda to generate AWS CloudFormation templates.
 Microservice architecture development using Python and Docker on an Ubuntu Linux platform
using HTTP/REST interfaces with deployment into a multi-node Kubernetes environment.
 Loaded streaming data using Kafka and Flume and processed it in real time using Spark and Storm.
 Implemented Spark SQL to access Hive tables into Spark for faster processing of data.
 Used Spark SQL with Python for creating data frames and performed transformations on data frames like adding schema manually, casting and joining data frames before storing them (see the PySpark sketch after this list).
 Worked on automating the shock stress treatment for the input variables by utilizing SAS and Python in Jupyter Notebook; converted SAS queries to Python scripts using the Pandas and NumPy packages and created classes and functions (methods) for generating loan profiles.
 The shocked inputs are fed into the MCFW application, and the derived SAS datasets are used to create dashboards in Tableau and standalone SAS Factor reports that show the forecasted gross loss and net loss.
 Reverse-engineered complex synthetic monitoring systems written in C#.
 Developed Ruby on Rails 3 web applications using MongoDB and background processes using Resque and Redis; worked on performance tuning of the cluster using the Cassandra configuration file and JVM parameters.
 Developed Ruby/Python scripts to monitor the health of Mongo databases and perform ad-hoc backups using mongodump and mongorestore; developed and implemented core API services using Python with Spark.
 Involved in designing APIs for networking and cloud services; designed and developed the application using Agile methodology, following TDD and Scrum.
 Worked on creating SAS macros for capturing several shocked stress files and automated the process so that the monthly reports are generated in Tableau using SAS analysis datasets.
 Built/maintained Docker container clusters managed by Kubernetes on GCP (Google Cloud Platform) using Linux, Bash, Git and Docker; utilized Kubernetes and Docker for the runtime environment of the CI/CD system to build, test and deploy.
 Knowledge of cloud infrastructure technologies in Azure.
 Experience with Microsoft Azure Cloud services, Storage Accounts, Azure data storage, Azure Data Factory, Data Lake and Virtual Networks.
 Automated different workflows, which were initiated manually, with Python scripts and Linux bash scripting.
 Used Scala to convert Hive/SQL queries into RDD transformations in Apache Spark.
 Worked with Azure Monitoring and Data Factory.
 Supported migrations from on premise to Azure.
 Built/maintained Docker container clusters on AWS using Linux, Bash, Git and Docker; utilized Docker for the runtime environment of the CI/CD system to build, test and deploy.
 Providing support services to enterprise customers related to Microsoft Azure Cloud networking, with experience in handling critical situation cases.
 Implemented Spark Scripts using Scala, Spark SQL to access hive tables into spark for faster
processing of data.
 Developed views and templates with Python OOD to create a user-friendly Website interface.
 Participated in the complete SDLC process and used PHP to develop website functionality.
 Created backend database T-SQL stored procedures and Jasper Reports.
 Specified, prototyped, developed and tested an object-oriented, multiplatform C++ framework
containing support to: data structures, common algorithms sockets, threading.
 Refactor Restful APIs and Django modules to deliver certain format of data.
 Responsible for debugging the project monitored on JIRA (Agile)
 Private VPN using Ubuntu, Python, Django, CherryPy, Postgres, Redis, Bootstrap, jQuery,
Mongo, Fabric, Git, Tenjin, Selenium, Sphinx, Nose.
 Validated the data by using PySpark programs
 Created Hive tables equivalent to the Oracle tables
 Loaded the Oracle table data into HDFS, Hive and HBase tables
 Created PySpark programs to load the data into Hive and MongoDB databases from PySpark data frames
 Developed REST APIs using Python with the Flask and Django frameworks and integrated various data sources including Java, JDBC, RDBMS, shell scripting, spreadsheets and text files.
 Knowledge of the DCL language and experience in converting DCL scripts to Python 3.5.
 Implemented client-side validation to avoid round trips between server and client and provide a better user experience using AngularJS.
 Worked on fixing bugs in existing JIRA issues and validated the fixes in the integration environment.
 Technical experience with LAMP (Linux, Apache, MySQL, Python).
 Utilized Python 3.4 libraries such as NumPy, Pandas and Matplotlib to read data from CSV files and aggregate and update the data.
 Involved in writing optimization techniques for better accuracy of macros in C/C++, C++ routines and Oracle SQL, PL/SQL.
 Worked under Agile/Scrum environment and handled production rollouts and issues.
 Integrated Redis cache with Django REST Framework for faster data reads.
 Automated the existing scripts for performance calculations using NumPy and SQLAlchemy.
 Worked on NoSQL and MySQL database queries and wrote stored procedures for normalization and denormalization.
 Implemented a Continuous Integration and Continuous Delivery (CI/CD) pipeline with Docker
and GIT and AWS AMI's. Developed Bash and Python scripts to automate various stages of
build and release process in Jenkins.
 Worked on object-oriented programming (OOP) concepts using Python, Django and Linux.
 Used the PHP language on a LAMP server to develop pages.
 Developed multi-threaded standalone applications using Python and PHP.
 Involved in Agile Methodologies and SCRUM Process.
 Experience in multiplatform Unix environments, with hands-on expertise in scripting and systems integration.
 Created business logic using Python to create planning and tracking functions.
 Reviewed code in shell script, Perl, Python, AWK, C++, PL/SQL & T-SQL; created subprograms, procedures and functions, DB triggers, cursors and optimization techniques for T-SQL.
 Worked on Jira for managing the tasks and improving the individual performance.
 Involved in Continuous Integration (CI) and Continuous Delivery (CD) process implementation
using Jenkins along with Shell script.
 Worked on virtual and physical Linux hosts, handling day-to-day administrative activities such as maintaining user accounts and providing advanced file permissions to specific users.
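
As an illustration of the Spark SQL data-frame work described above (a hedged sketch; the schema, column names and S3 paths are assumptions, not project specifics):

from pyspark.sql import SparkSession
from pyspark.sql.functions import col
from pyspark.sql.types import StructType, StructField, StringType, IntegerType

spark = SparkSession.builder.appName("transform-sketch").getOrCreate()

# Add a schema manually instead of relying on inference
schema = StructType([StructField("id", StringType()),
                     StructField("amount", StringType())])
loans = spark.read.schema(schema).json("s3://example-bucket/loans/")    # placeholder paths
profiles = spark.read.parquet("s3://example-bucket/profiles/")

# Cast, join, then store
result = (loans.withColumn("amount", col("amount").cast(IntegerType()))
               .join(profiles, on="id", how="inner"))
result.write.mode("overwrite").parquet("s3://example-bucket/output/")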
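
And a hedged sketch of the AWS Lambda work: a Python handler that decodes Kinesis records and lands them in S3 (the event wiring, bucket and key are placeholders for illustration, not the actual project configuration):

import base64
import json

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # Kinesis delivers record data base64-encoded
    rows = [json.loads(base64.b64decode(r["kinesis"]["data"]))
            for r in event.get("Records", [])]
    s3.put_object(Bucket="example-analytics-bucket",   # placeholder bucket
                  Key="batches/latest.json",
                  Body=json.dumps(rows).encode("utf-8"))
    return {"processed": len(rows)}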

Environment: Python 3.4/2.7, Django 1.8/2.2, C++, MySQL, MongoDB, AWS, QPID, LAMP, Redis, JIRA, JSON, Angular JS, Docker, Agile, Jenkins, Linux, T-SQL, Shell scripts, Spring MVC 4.0, PHP, HTML5, JavaScript, RESTful Web Services, Bash, TFS, GIT, Hibernate, jQuery, CSS, log4j, Tomcat.

Businessneeds Inc. (CA)
Jun' 15 to Nov' 17
Lead Python Developer

Responsibilities:
 Designed front end and backend of the application using Python on Django Web Framework
 Used HTML, CSS, AJAX and JSON to design and develop the user interface of the website.
 Developed views and templates with Python and Django's view controller and templating
language to create a user-friendly website interface.
 Developed an application which accessed the Cloud Foundry API to monitor trends in development environments using other CF tools: Jenkins, Chef, Puppet.
 Added support for Amazon AWS S3 and RDS to host static/media files and the database into
Amazon Cloud.
 Provided advisory services to key stakeholders on the potential benefits of automation and
organization readiness required to accept the automation transformation.
 Provided guidance on the right automation platform for process automation based on the process, considering process type, complexity, platform capabilities, etc.
 Co-ordinated with teams during User acceptance testing and Automation of BW data flows using
UI Path.
 Took care of ad hoc business requests to update the production data.
 Prepared a road map for delivering task bots into the production pipeline and analyzed the feasibility of existing processes for automation.
 Used AWS CloudWatch to perform monitoring, customized metrics and file logging.
 Analyze the business drivers that determine key architecture requirements for various cloud
service delivery models (IaaS, PaaS, SaaS etc.) along with cloud specific services such as
Amazon SQS, Lambda, RDS etc.
 Wrote Lambda functions in Python for AWS Lambda and invoked Python scripts for data transformations and analytics on large data sets in EMR clusters and AWS Kinesis data streams.
 Used PySpark to expose Spark API to Python.
 Experience in cloud-based services (AWS) to retrieve data.
 Configured and deployed project using the Amazon EC2 on AWS.
 Develop and execute scripts on AWS Lambda to generate AWS CloudFormation templates.
 Developed Spark code using Python for faster processing of data on Hive (Hadoop); developed MapReduce jobs in Python for data cleaning and data processing.
 Used different types of transformations and actions in Apache Spark.
 Experience in writing custom User Defined Functions (UDFs) in Python for Hadoop (Hive and Pig) (see the streaming UDF sketch after this list).
 Used Spark cluster to manipulate RDDs (resilient distributed datasets) and also used concepts of RDD partitioning.
 Connected the MySQL database through the Spark driver.
 Design and develop solutions using C, C++, Multi-Threaded, Shell Scripting.
 Used JavaScript and JSON to update a portion of a webpage.
 Built the database structure and wrote PostgreSQL queries; built Django and Redis servers for data storage and built a web journal with the framework and Jinja templates.
 Develop consumer-based features and applications using Python, Django, HTML and Test
Driven Development (TDD)
 Worked in a team of developers to build and deploy Python Flask and Peewee on a Linux server hosted on AWS.
 Worked on invoking the SAS auth domain password, which runs the script for connecting to the Oracle database that holds the Autosys job information.
 Extensively used SAS macros to automate the statistical model scores, authorization datamarts
that run hourly, daily and monthly as part of BAU process.
 Used the PROC FREQ, PROC MEANS and PROC COMPARE SAS procedures for getting frequencies and means of the data; produced monthly and daily reports to compare with the previous month.
 Worked on Autosys to schedule jobs monthly, daily and hourly and run the production jobs automatically; these jobs trigger shell scripts passing system parameters which run the SAS programs to generate all daily reports and authorization datasets for downstream processes.
 The SAS datasets are modified into different data types like CSV, text, HTML and Excel files, and .hz files are also converted to datasets; used PROC CIMPORT to read transport files and XPORT for datasets to create transport files.
 Responsible for Working on Celery.
 Responsible for upgrading their operating system from OpenVMS to Linux and converting all DCL programs to Python.
 Worked with Spark for improving performance and optimization of the existing algorithms in Hadoop using Spark Context, Spark-SQL, DataFrames and Pair RDDs.
 Performed advanced procedures like text analytics and processing, using the in-memory
computing capabilities of Spark using Scala.
 Extracted real-time feeds using Kafka and Spark Streaming, converted them to RDDs, processed the data in the form of data frames and saved the data in Parquet format in HDFS (see the streaming sketch after this list).
 Experienced in writing real-time processing and core jobs using Spark Streaming with Kafka as a
data pipe-line system.
 Configured Spark Streaming to get ongoing information from Kafka and store the stream information in HDFS.
 Developed Bash and Python scripts for manual deployment of the code to the different environments and to e-mail the team when the build completed.
 Ensured high-quality data collection and maintained the integrity of the data using integration of Python with C and C++ libraries.
 Developed Python classes and used decorated methods to create the dependency graphs for the
business logic and core applications that are pre-built using C++.
 Used Spark and Spark-SQL to read the parquet data and create the tables in hive using
the Scala API.
 Experienced in using the Spark application master to monitor Spark jobs and capture their logs.
 Job scheduling, batch-job scheduling, process control, forking and cloning of jobs and checking
the status of the jobs using shell scripting.
 Worked on Spark using Python and Spark SQL for faster testing and processing of data.
 Implemented sample Spark programs in Python using PySpark.
 Analyzed the SQL scripts and designed the solution to implement using PySpark.
 Developed PySpark code to mimic the transformations performed in the on-premise environment.
 Developed multiple Kafka Producers and Consumers as per the software requirement
specifications.
 Used Spark Streaming APIs to perform transformations and actions on the fly for building
common learner data model which gets the data from Kafka in near real time and persist it to
Cassandra.
 Experience in building Real-time Data Pipelines with Kafka Connect and Spark Streaming.
 Used Kafka and Kafka brokers; initiated the Spark context and processed live streaming information with RDDs; used Kafka to load data into HDFS and NoSQL databases.
 Used Zookeeper to store offsets of messages consumed for a specific topic and partition by a
specific Consumer Group in Kafka.
 Used Kafka functionalities like distribution, partition, replicated commit log service for
messaging systems by maintaining feeds and created applications, which monitors consumer lag
within Apache Kafka clusters.
 Involved in Cassandra cluster planning and had a good understanding of the Cassandra cluster mechanism.
 Responsible for development of the Spark Cassandra connector to load data from flat files to Cassandra for analysis; modified the cassandra.yaml and cassandra-env.sh files to set various configuration properties.
 Used Sqoop to import data into Cassandra tables from different relational databases like Oracle and MySQL, and designed column families.
 Developed efficient MapReduce programs for filtering out the unstructured data and developed
multiple MapReduce jobs to perform data cleaning and preprocessing on Hortonworks.
 Used Hortonworks Apache Falcon for data management and pipeline process in the Hadoop
cluster.
 Implemented Data Interface to get information of customers using Rest API and Pre-Process data
using MapReduce 2.0 and store into HDFS (Hortonworks).
 Maintained ELK (Elastic Search, Logstash, and Kibana) and Wrote Spark scripts
using Scala shell.
 Worked in AWS environment for development and deployment of custom Hadoop applications
 Rewrite existing Python/Django modules to deliver certain format of data.
 Responsible for tracking and monitoring project status in JIRA (Agile)
 Used IDE tool to develop the application and JIRA for bug and issue tracking.
 Participated in the complete SDLC process and used PHP to develop website functionality.
 Utilized STL and C++ algorithms to achieve optimal design efficiency.
 Developed Python web services for processing JSON and interfacing with the Data layer.
 Coding in LAMP (Linux, Apache, MySQL, and PHP) environment.
 Responsible for using Tableau efficiently.
 Collaborated with other developers throughout the project life cycle and used TFS for source
control.
 Developed an application that would allow transfer of log files from Linux computer to Linux
server using C++ multithreading environment.
 Collaborated on architecture, design including Python and Java automation framework.
 Experienced in NoSQL technologies like MongoDB, Cassandra, QPID, Messaging, Redis and
relational databases like Oracle, SQLite, PostgreSQL and MySQL databases.
 Extensive experience in designing and implementing various web applications in WAMP
(Windows, Apache, MySQL, PHP)
 Worked on deployment of project on to Amazon S3.
 Worked on virtual and physical Linux hosts and involved in day to day administrative activities
such as maintaining user accounts and providing advanced file permissions to specific users.
 Used GIT version control and deployed project to Heroku.
 Worked on development of SQL and stored procedures for normalization and denormalization in MySQL.
 Build SQL queries for performing various CRUD operations like create, update, read and delete.
 Used JIRA to assign, track, report and audit the issues.
 Shared responsibility for administration of Hive and Pig.
 Improved the coding standards, code reuse. Increased performance of the extended applications
by making effective use of various design patterns (Front Controller, DAO)
 Programmed and changed UI screens using C++/Qt; used multi-threading and thread synchronization extensively.
 Worked on a large scale distributed computing environment, monitoring data nodes to prioritize
jobs for processing functions.
 Worked extensively with Bootstrap, JavaScript, and JQuery to optimize the user experience.
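
To illustrate the Kafka-to-HDFS streaming flow above (a sketch against Spark's classic DStream API of that era; the broker, topic and paths are placeholders):

from pyspark import SparkContext
from pyspark.sql import SparkSession
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

sc = SparkContext(appName="kafka-to-parquet")
spark = SparkSession(sc)
ssc = StreamingContext(sc, 10)            # 10-second micro-batches

stream = KafkaUtils.createDirectStream(
    ssc, ["events"], {"metadata.broker.list": "broker:9092"})

def save_batch(time, rdd):
    # Each Kafka record arrives as (key, value); the value holds JSON text
    if not rdd.isEmpty():
        df = spark.read.json(rdd.map(lambda kv: kv[1]))
        df.write.mode("append").parquet("hdfs:///data/events/")

stream.foreachRDD(save_batch)
ssc.start()
ssc.awaitTermination()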
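
And a sketch of a custom Python function for Hive, written as a streaming script for Hive's TRANSFORM clause (the column layout is an assumption for illustration):

import sys

# Hive streams rows to stdin as tab-separated text
for line in sys.stdin:
    user_id, amount = line.rstrip("\n").split("\t")
    flag = "HIGH" if float(amount) > 10000 else "NORMAL"
    print("\t".join([user_id, amount, flag]))

On the Hive side this would be wired up along the lines of: ADD FILE flag_amount.py; SELECT TRANSFORM(user_id, amount) USING 'python flag_amount.py' AS (user_id, amount, flag) FROM txns;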

Environment: Python 3.0, Django 1.6, C++, HTML5, QPID, CSS, PHP, HTML, TFS, Redis, Java, MySQL, JavaScript, Angular JS, Backbone JS, jQuery, LAMP, Mongo DB, MS SQL Server, T-SQL, AWS, Linux, Shell Scripting.

Doozer.com (AL)
Sep' 13 to May' 15
Lead Python Developer

Responsibilities:
 Involved in writing SQL and stored procedures in MySQL.
 Develop programs to automate the testing of controller in CI/CD environment using Python,
Bash script, Git and Linux command line.
 Design and develop solutions using C, C++, Multi-Threaded, Shell Scripting and Python
 Developed Business Logic using python on Django Web Framework.
 Developed views and templates with Python and Django's view controller and templating language to create a user-friendly website interface.
 Used Angular.js to build an efficient client-side web application.
 Used Python to extract information from XML files (see the sketch after this list).
 Expertise in Service-Oriented Architecture (SOA) and its related technologies like Web Services, BPEL, WSDLs, SOAP, XML, XSD, XSLT, etc.
 Able to work with different operating systems like MacOS, Windows, and Linux.
 Designed and developed horizontally scalable APIs using Python Flask.
 Involved in debugging the applications monitored on JIRA using agile methodology.
 Used Angular JS in the view layer for some of the view components.
 Debugging and troubleshooting production issues, enforced, documented and implemented C++
standards guidelines.
 Involved in developing code for obtaining bean references in the Spring framework using Dependency Injection (DI) / Inversion of Control (IoC) with annotations.
 Worked extensively with Core Java to develop code in Service Layer.
 Provided design recommendations and developed data interaction programs per high-level specifications.
 Rewrote existing Java and C++ applications in Python.
 Followed Object-Oriented Analysis and Design (OOAD).
 Applied the Don't Repeat Yourself (DRY) principle.
 Created Business logic using Python under Linux OS.
 Used Service Oriented Architecture (SOA), to achieve loose coupling.
 Used Jira for Ticketing.
 Used TeamCity for continuous builds. Worked with UML diagrams. Participated in application fine tuning.
 Used jQuery to provide better features on the front end.
 Provided extensive pre-delivery support using Bug Fixing and Code Reviews.
 Used LOG4J & JUnit for debugging, testing and maintaining the system state.
 Responsible for gathering requirements, system analysis, design, development, testing and
deployment.
 Developed tools using Python, shell scripting and XML to automate some of the menial tasks, interfacing with supervisors, artists, systems administrators and production to ensure production deadlines are met.
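
As a small illustration of the XML extraction noted above (a standard-library sketch; the tag and file names are hypothetical):

import xml.etree.ElementTree as ET

tree = ET.parse("orders.xml")             # placeholder file name
for order in tree.getroot().findall("order"):
    order_id = order.get("id")            # attribute on the <order> element
    total = order.findtext("total")       # text of a child element
    print(order_id, total)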

Environment: C++, Python 3.x, Django, Shell Scripting, Pandas, PyQt, PyQuery, Wireshark, Flash, JSON, PHP, CSS3, AJAX, Angular.js, Bootstrap, TFS, Apache Web Server.

SEARS INC, Hoffman Estates, IL
Jan' 11 to Aug' 13
Sr. Python Developer

Responsibilities:
 Wrote Python scripts to parse XML documents and load the data in database.
 Utilized PyQt to provide GUI for the user to create, modify and view reports based on client
data.
 Developed web-based applications using Python 2.7/2.6, Django 1.4/1.3, PHP, Flask, Webapp2,
Angular.js, VB, C++, XML, CSS, HTML, DHTML, JavaScript and jQuery.
 Used Python based GUI components for the front end functionality such as selection criteria.
 Developed monitoring and notification tools using Python.
 Deployed the entire code using Linux parameters of the virtual machines for UAT phase.
 Worked on Technologies: QT, QML, C++, QNX, UML, JavaScript and Json.
 Implemented user interface guidelines and standards throughout the development and
maintenance of the website using the HTML, CSS, JavaScript and jQuery.
 Managed, developed, and designed a dashboard control panel for customers and Administrators
using Django, HTML, Bootstrap, and REST API calls using the JSON.
 Used GitHub for version control.
 Integrating the application with Django REST framework for building the API's.
 Worked with the AJAX framework to transform datasets and data tables into HTTP-serializable JSON strings.
 Used Chef to manage VM configuration within AWS and primarily used Bash to write Git
applications and chef Recipes.
 Extensively used XLSX reader and writer modules to read, write and analyze data and project
the results as per the client request.
 Good Experience in Linux Bash scripting and following PEP-8 Guidelines in Python.
 Using Django Evolution and manual SQL modifications, was able to modify Django models while retaining all data, while the site was in production mode.
 Improved the coding standards, code reuse. Increased performance of the extended applications
by making effective use of various design patterns (Front Controller, DAO)
 Utilized STL and C++ algorithms to achieve optimal design efficiency.
 Creating Restful web services for Catalog and Pricing with Django MVT, MySQL and Oracle.
 Used SQLAlchemy with Flask and PostgreSQL as the database for developing the web application (see the sketch after this list).
 Used REST and SOAP API for testing web service for server-side changes.
 Developed scripts for build, deployment, maintenance and related tasks using Jenkins, Maven, Python and Bash.
 Successfully migrated the Django database from SQLite to MySQL to PostgreSQL with complete data integrity.
 Automated tasks with tools like Puppet and Ansible.
 Setup Docker on Linux and configured Jenkins to run under Docker host.
 Managed code versioning with GitHub, Bitbucket and deployment to staging and production
servers.
 Took part in the entire lifecycle of the projects including design, development, deployment, testing, implementation and support.
 Continuous improvement in integration workflow, project testing, and implementation of
continuous integration pipeline with Jenkins
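
As an illustration of the SQLAlchemy/PostgreSQL setup noted above (a hedged sketch in SQLAlchemy 1.4+ style; the model and connection string are assumptions, not project specifics):

from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Report(Base):
    __tablename__ = "reports"
    id = Column(Integer, primary_key=True)
    name = Column(String(120), nullable=False)

engine = create_engine("postgresql://user:pass@localhost/appdb")  # placeholder DSN
Base.metadata.create_all(engine)          # create tables if they do not exist

Session = sessionmaker(bind=engine)
with Session() as session:
    session.add(Report(name="monthly"))   # hypothetical row
    session.commit()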

Environment: Python 2.7, Django 1.10, MySQL, TFS, Python Scripting, MongoDB, AJAX,
SOAP, REST, jQuery, JavaScript, Bootstrap, PyCharm, AWS (EC2, S3, RDS)

CERNER, Kansas City, Missouri
Feb' 10 to Dec' 10
Sr. Python Developer

Responsibilities:
 Responsible for using AJAX framework with JQuery, Dojo implementation for Widgets and
Events handling.
 Customizing Log4J for maintaining information and debugging.
 Preparing builds, deploy and co-ordinate with the release management team to ensure that the
proper process is followed during the release.
 Customizing third party vendor information using Web services (SOAP and WSDL)
 Designed DTD's for XML representation of data.
 Worked on development of data access beans using Hibernate and middleware web service components.
 Developed the GUI using JSP and Spring Web Flow following the Spring Web MVC pattern.
 Implemented the persistence layer using Hibernate, which uses POJOs to represent the persistent database tables.
 Generated Ant, Bash scripts for build activities in QA, Staging and Production environments.
 Used SVN for version control across common source code used by developers.
 Wrote JUnit test cases for the functionalities.
 Used Log4j for application logging and debugging.

Environment: Python, RAD 7.0, C++, Ajax, HTML, Restful API, MySQL, Django, JSON, Pandas, Java, Shell Scripting, PL/SQL, SVN, Jenkins, Jira, UNIX, Linux.

SEI Investments, Oaks PA
Oct' 07 to Jan' 10
Python Developer

Responsibilities:
 Designed and developed components using Python. Implemented code in Python to retrieve and manipulate data.
 Re-engineered various modules for implementing changes and creating an efficient system.
 Managed large datasets using Pandas data frames and MySQL (see the sketch after this list).
 Designed and developed UI using HTML, XHTML, AJAX, JavaScript and jQuery.
 Used JavaScript libraries like jQuery UI, DataGrid, jscolor and Highcharts.
 Developed the presentation layer HTML, JSP, CSS and DHTML.
 Developed widgets for the GUI using PyGTK modules of Python.
 Used Django to develop web-based application and deploy it using Jenkins.
 Used MySQL as the backend database and Python's MySQLdb as the database connector to interact with the MySQL server.
 Developed Page layouts, Navigation and presented designs and concepts to the clients and the
management to review.
 Using Restful APIs to access data from different suppliers.
 Used Python and Django for creating graphics, XML processing of documents, data exchange
and business logic implementation between servers.
 Used Python data structures like sqlite3, dictionaries and tuples.
 Used several Python libraries like NumPy, Pandas and Matplotlib.
 Helped with the migration from the old server to the Jira database with Python scripts for transferring and verifying the information.
 Supported Apache Tomcat Web server on Linux Platform.
 Used RAD 7.0 for implementing Static and Dynamic web services for consuming and providing
services related to the business.
 Developed and executed User Acceptance Testing portion of test plan.
 Involved in writing application level code to interact with APIs, Web Serving using AJAX,
JSON, and XML.
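
As a small illustration of the Pandas/MySQL data handling noted above (the table and connection details are placeholders for the sketch):

import pandas as pd
import MySQLdb   # the MySQLdb connector mentioned above

conn = MySQLdb.connect(host="localhost", user="app",
                       passwd="secret", db="sales")   # placeholder credentials
df = pd.read_sql("SELECT region, amount FROM orders", conn)

# Aggregate in Pandas rather than in SQL
summary = df.groupby("region")["amount"].sum()
print(summary.head())
conn.close()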

Environment: Java, JDK, J2EE, JSP, Spring, Servlets, JavaScript, XML, HTML, CSS, Web
Services (RESTful), XML, XSLT, Ajax, Log4j, Tortoise SVN, Rational Application Developer,
WebSphere application server, Red hat Linux, JBOSS.

Education:
Bachelor of Commerce, Gujarat University, Jun 1994 – May 1997
