SUMMARY
PROFESSIONAL EXPERIENCE
Currently working as a Big Data Senior Specialist in the Data Security department at Standard
Chartered, from November 2019 to date.
TECHNICAL SKILLS
CERTIFICATIONS
PROJECT EXPERIENCE
Project #1:
Organization: Standard Chartered GBS. Role: Big Data Senior Specialist, Data Security
Brief Profile: Standard Chartered GBS is a bank-owned global business service, which provides
solutions to the bank's own retail, insurance, and other applications.
PROJECT DESCRIPTION
We are building a centralized security framework called “crypto as a service”, which applications in
the retail, insurance, and banking domains integrate with to secure PII/PHI data and achieve
security-by-design principles.
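To illustrate the integration pattern, the following is a minimal sketch of how an application might call such a central crypto service to protect a PII field before persisting it. The endpoint, payload shape, and class name here are hypothetical, not the bank's actual API:

    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;
    import java.util.Scanner;

    // Hypothetical client for a central "crypto as a service" endpoint.
    public class PiiProtectionExample {

        // Hypothetical REST endpoint exposed by the central crypto service.
        private static final String ENDPOINT =
                "https://crypto-service.example.internal/v1/protect";

        public static void main(String[] args) throws Exception {
            HttpURLConnection conn =
                    (HttpURLConnection) new URL(ENDPOINT).openConnection();
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "application/json");
            conn.setDoOutput(true);

            // The application never touches keys: it hands the PII value to
            // the central service and stores only the protected result.
            String body = "{\"field\":\"national_id\",\"value\":\"AB1234567\"}";
            try (OutputStream out = conn.getOutputStream()) {
                out.write(body.getBytes(StandardCharsets.UTF_8));
            }

            try (InputStream in = conn.getInputStream();
                 Scanner scanner = new Scanner(in, "UTF-8").useDelimiter("\\A")) {
                System.out.println("Protected value: " + scanner.next());
            }
        }
    }

Centralizing protection this way keeps key handling out of each application, which is what allows security by design to be enforced uniformly across domains.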
RESPONSIBILITIES
1. Development –
▪ Work with application teams to understand their security needs and application
functionality.
▪ Propose the solution that best fits the existing architecture, along with best
practices and guidelines.
▪ Do POCs with the team, demonstrate the solution to management, and explain its
merits.
▪ On-board applications integrated with “crypto as a service” to production.
▪ So far, on-boarded microservices, cloud, Kubernetes, and Big Data applications.
Environment:
▪ Framework: - Hadoop, Ranger, HDFS, Hive, Java (JDK 1.7/1.8), KMS, Pig,
Zookeeper, Oozie, Spark.
▪ Technology: - Java, Shell Scripting, Python.
Project #2:
Organization: Protegrity India Pvt Ltd Role: Technical Lead in Big Data R&D
Brief Profile: Protegrity is the leading enterprise data security software company worldwide,
providing high performance, infinitely scalable, end-to-end data security solutions.
Protegrity delivers centrally managed and controlled data security that protects sensitive information
across the enterprise in Big Data, Cloud, Databases, Applications and file systems from the point of
acquisition to deletion.
PROJECT DESCRIPTION
Big Data Protector is the Protegrity software product for securing data end to end, using encryption
and tokenization, in a Hadoop environment irrespective of the ingestion tools.
The Big Data Protector uses patent-pending vaultless tokenization and central policy control for
access management and secures sensitive data at rest in the following areas:
• Data in HDFS.
• Data used during MapReduce, Hive and Pig processing, and HBase.
• Data traversing enterprise data systems.
The data is protected from internal and external threats, and users and business processes can continue
to utilize the secured data.
It secures files with volume encryption and protects data inside files using tokenization and strong
encryption protection methods. Depending on the user's access rights and the policies set by the
Security Manager, this data is unprotected only for authorized users.
Big Data Protector provides fine-grained field-level protection within the MapReduce, Hive, Pig, and
HBase frameworks, as well as directory- and file-level protection (encryption) on HDFS.
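To make the field-level approach concrete, here is a minimal sketch of a Hive tokenization UDF of the kind described above. The masking logic is a placeholder standing in for Protegrity's vaultless tokenization engine, which this sketch does not reproduce:

    import org.apache.hadoop.hive.ql.exec.Description;
    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    @Description(name = "tokenize",
                 value = "_FUNC_(col) - returns a protected token for a sensitive column")
    public final class TokenizeUDF extends UDF {

        public Text evaluate(Text input) {
            if (input == null) {
                return null;
            }
            String value = input.toString();
            // Placeholder logic: mask all but the last four characters.
            // A vaultless tokenizer would instead derive a reversible token
            // under central policy control.
            char[] chars = value.toCharArray();
            int keep = Math.min(4, chars.length);
            for (int i = 0; i < chars.length - keep; i++) {
                chars[i] = '*';
            }
            return new Text(new String(chars));
        }
    }

Once packaged in a JAR, such a UDF would be registered and used in Hive with something like CREATE TEMPORARY FUNCTION tokenize AS 'TokenizeUDF'; followed by SELECT tokenize(ssn) FROM customers;.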
RESPONSIBILITIES
1. Development –
▪ Wrote a codec and Java code to implement the encryption feature for data-at-rest
and data-in-motion on HDFS storage (a simplified sketch follows the
responsibilities list).
▪ Created custom UDFs to implement the tokenization feature for fine-grained
field-level protection in Pig and Hive.
▪ Created an HBase coprocessor to implement tokenization support for HBase
tables.
▪ Researched tools such as Talend and NiFi to provide product integration and
support with the Protegrity solution.
▪ Responsible for ensuring the compatibility of the product with frameworks such
as Ranger (KMS), Sentry, Kerberos, Active Directory, and HSM.
▪ Managed the product installer and builds; took care of product enhancements
and bug fixes.
▪ Migrated the Big Data product installer from a Linux base to a native installer by
integrating the product with the Ambari/Cloudera Manager framework.
▪ Added compression support with encryption on HDFS storage.
▪ Responsible for floating requirements, creating design documents, and doing
code reviews of product features.
▪ Worked on a few features based on customer requirements.
▪ Contributed to the CI implementation for the product.
2. Testing – Created and executed unit test cases, performance tests, sanity checks, and full
regression tests on each environment for all Hadoop distribution versions [CDH x.x and
HDP x.x on RHEL/CentOS/SLES].
3. Agile Methodology – Currently handling a team of six members; run the daily stand-
up (Scrum) meeting and prepare daily status reports for all activities related
to the tasks assigned to the team.
4. Research – So far, researched BI tools, AWS Kinesis, Snowflake, Qubole,
BlueData, and Robin Systems.
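As referenced in the development list above, here is a simplified sketch of encrypting data at rest on HDFS. It wraps the HDFS output stream in a CipherOutputStream rather than plugging into Hadoop's codec API, and it generates a key locally purely for illustration; the actual product integrates with KMS/HSM for key management:

    import java.nio.charset.StandardCharsets;
    import java.security.SecureRandom;
    import javax.crypto.Cipher;
    import javax.crypto.CipherOutputStream;
    import javax.crypto.KeyGenerator;
    import javax.crypto.SecretKey;
    import javax.crypto.spec.IvParameterSpec;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsEncryptedWriteExample {

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);

            // Illustration only: production keys come from a KMS/HSM and are
            // never generated or held in application code.
            KeyGenerator keyGen = KeyGenerator.getInstance("AES");
            keyGen.init(256);
            SecretKey key = keyGen.generateKey();

            byte[] iv = new byte[16];
            new SecureRandom().nextBytes(iv);
            Cipher cipher = Cipher.getInstance("AES/CTR/NoPadding");
            cipher.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));

            Path path = new Path("/secure/customers.enc");
            try (FSDataOutputStream raw = fs.create(path);
                 CipherOutputStream out = new CipherOutputStream(raw, cipher)) {
                raw.write(iv); // store the IV as a plaintext header before the ciphertext
                out.write("sensitive PII payload".getBytes(StandardCharsets.UTF_8));
            }
        }
    }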
Environment:
▪ Framework: - Hadoop, HBase, HDFS, Hive, Java (JDK 1.7/1.8), KMS, Pig,
Zookeeper, Oozie, Spark.
▪ Technology: - Java, Shell Scripting, Python.
▪ Tools: - Eclipse, Jenkins, Tortoise SVN, Maven, WinSCP, Putty, GitLab.
Project #3:
Organization: Capgemini India Pvt Ltd Role: Consultant
Brief Profile: Capgemini is a global leader in consulting, technology and outsourcing services.
Client: Thomson Reuters
PROJECT DESCRIPTION
FindLaw is a division of Thomson Reuters that offers its customers services in the legal
domain, such as finding a lawyer.
The Scheduling System enables customers to track the end-to-end execution of a
project.
RESPONSIBILITIES
1. Development – Developed JSP pages and Java Beans; performed unit testing and peer code review.
2. Testing – Created and executed unit test cases, sanity checks, and full regression tests on each
environment.
3. Attended the daily stand-up (Scrum) meeting.
Environment
▪ Framework: - Spring v3.2.2, Hibernate 4.0, H2 DB
▪ Technology: - Java, JavaScript, JSP
▪ Tools: - Eclipse, Jenkins, Perforce visual client.
Project #4:
Organization: Capgemini India Pvt Ltd Role: Software Engineer
RESPONSIBILITIES
1. Development – Developed JSP pages and Java Beans; performed coding, unit testing, and
peer code review.
2. Testing – Created and executed unit test cases and full regression tests on each environment.
3. Attended the daily stand-up (Scrum) meeting.
Environment
▪ Framework: - Grails, Spring MVC, Hibernate 4.0.
▪ Technology: - Java, Groovy (Grails), JavaScript, JSP
▪ Tools: - Eclipse, TortoiseSVN 1.7.9.
EDUCATION