
expertjava09@gmail.com
3212915759

Summary:
9.2 years of overall IT experience, including 8 years as a Java developer with AWS, implementing projects and microservice applications using Java technologies. Used Spring, Hibernate, Web services (REST and SOAP), and AWS cloud extensively, and have 1.2 years of hands-on experience with Big Data technologies such as Apache Hadoop, MapReduce, Spark, Hive, HDFS, Sqoop, and MapR. Highly proficient in Java and server-side programming.

Professional Summary:

 Experience in web development using Java & J2EE technologies.
 Strong experience in development of n-tier applications using J2EE technologies such as Struts, Spring, Hibernate, and Web services (SOAP and REST).
 Created Java-based web apps using the AWS SDK to read, write, and delete data files in Amazon Simple Storage Service (S3); also worked with Amazon SQS, AWS CloudWatch, and AWS Lambda.
 In-depth and extensive knowledge of Hadoop architecture and its various components.
 Familiar with components of the Hadoop ecosystem: MapReduce, HDFS, Hive, Sqoop.
 Proficient in developing Apache Spark and MapReduce jobs for distributed environments.
 Loaded and transformed large sets of structured and semi-structured data using Apache Spark and Sqoop, and analysed them by running Hive queries.
 Worked on installation and configuration of the MapR distribution.
 Good knowledge of all phases of the Software Development Life Cycle; extensively followed Agile methodology, especially the Scrum process.
 Possess strong problem-analysis skills with the ability to follow project standards and procedures.
 Ability to work effectively both as a team member and individually.
 Excellent communication and interpersonal skills, with proven ability to resolve complex software issues.
Education:

 Post Graduate (MSc) in Telecommunications Engineering from Liverpool John Moores University, Liverpool, UK.
 Graduate (B.Tech) in Electronics and Communications from Jawaharlal Nehru Technological University.

Technical skills:

Programming Technologies: Java, J2EE, and SQL.
Web Technologies: Servlets, Struts, Spring, Hibernate, Amazon S3, Amazon SQS, AWS SDK, Swagger, Apache CXF, JPA, Web services (REST & SOAP).
Big Data Technologies: Apache Hadoop, Apache Spark, MapReduce, HDFS, Sqoop, and Hive.
Database Technologies: Oracle, Cassandra, MySQL.
IDEs: Eclipse, IntelliJ, Toad, IBM RAD, MySQL Workbench, S3 Browser, SonarLint.
Servers: WebLogic, Tomcat.
Development Tools: Kafka Tool, Confluence, Jira, Bitbucket, SVN, HPSM, HP ALM, Git, Sourcetree.
Operating Systems: Windows, Ubuntu, Linux (basics).

Awards and Recognition:

1. Recognized with the 'BRAVO' award for delivering the DHL shipment web service upgrade for the TMS application in Dec 2019.

2. Recognized with the 'Pat on the Back' award for taking complete end-to-end ownership of the task and automating fiber and coax file import in the line inquiry service in Sep 2021.

PROJECT DETAILS

Currently Working:

Client: Wells Fargo Center
Company: Busitants Inc
Location: Westborough, MA
Role: Senior Software Engineer
Technologies: REST Web services, Spring Boot, PostgreSQL, Swagger API, AWS CloudWatch, AWS S3, AWS Lambda, AWS CodePipeline.
Duration: Jan 2020 to present.
Description:

Information Hub (iHub) is a common place to land, conform, and distribute large-scale datasets to serve Wells Fargo. iHub is a platform for delivering current, conformed, and consolidated data to downstream systems.
Sourcing Layer: iHub integrates with multiple SORs to deliver the information as-is to downstream systems (AFS, LoanIQ, LUCAS, and InfoLease).
Integration Layer: maps source data into a consistent, stable structure and conforms data values to establish consistent meaning.
Consuming Layer: iHub conforms data from its sources to provide consistent and accurate information based on subject area. These areas include Credit, Deposit, Treasury Management, and Securities.
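For illustration, the integration-layer idea of conforming source values to a consistent meaning can be sketched in plain Java; the source codes and canonical values below are hypothetical placeholders, not actual iHub mappings.

```java
import java.util.HashMap;
import java.util.Map;

public class ValueConformer {

    // Hypothetical mapping from source-system codes to one canonical value.
    private static final Map<String, String> CANONICAL = new HashMap<>();
    static {
        CANONICAL.put("CHK", "Checking");   // code used by one SOR
        CANONICAL.put("DDA", "Checking");   // same concept, different SOR
        CANONICAL.put("SAV", "Savings");
    }

    // Conforms a raw source value; unknown codes pass through unchanged,
    // so the sourcing layer can still deliver data as-is downstream.
    public static String conform(String sourceCode) {
        return CANONICAL.getOrDefault(sourceCode, sourceCode);
    }
}
```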

Responsibility:

• Wrote multiple Apache Spark jobs for data cleaning and pre-processing.
• Processed and analysed data from the Hadoop file system using the Apache Spark API.
• Loaded data from multiple file formats into HDFS.
• Managed data coming from different sources and applications.
• Wrote Hive queries (HQL) to load data into Hive tables.
• Imported and exported data between Oracle and Hive tables and HDFS files.
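The data-cleaning step above could apply per-record logic like the following; in the actual jobs this kind of function would run inside a Spark transformation, and the delimiter handling shown here is a hypothetical example.

```java
import java.util.Arrays;
import java.util.stream.Collectors;

public class RecordCleaner {

    // Cleans one delimited record: trims whitespace around each field
    // and normalizes empty fields to Hive's default null marker (\N).
    public static String clean(String line, String delimiter) {
        return Arrays.stream(line.split(delimiter, -1))
                .map(String::trim)
                .map(f -> f.isEmpty() ? "\\N" : f)
                .collect(Collectors.joining(delimiter));
    }

    // Drops records that are blank or have the wrong number of fields.
    public static boolean isValid(String line, String delimiter, int expectedFields) {
        return line != null
                && !line.trim().isEmpty()
                && line.split(delimiter, -1).length == expectedFields;
    }
}
```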

Project #2
Project Name: TMS (Transportation Management System)
Client: Mycon Solutions Pvt. Ltd.
Project Type: Logistics Project.
Role: Sr. Software Engineer
Duration: Jan 2019 to Dec 2019.

Description:

TMS is MercuryGate International's product for transportation management. It handles all the transportation-related modules and is similar to Oracle OTM.
In TMS we integrated loads and shipments with other vendors such as UPS, FedEx, DHL, and DAT; these are completely SOAP and RESTful integrations.
CARMA (Carrier Management) handles all the carriers, drivers, fleet equipment, and vendors. In this module I contributed to complex tasks such as Driver HOS (Hours of Service) and satellite tracking; as this is a product, everything must be handled through configurations and roles.

Responsibility:

 Analysed and discussed requirements with the product owner.
 Developed code as per technical specifications.
 Worked with architects to develop optimal solutions.
 Prioritized, assigned, and executed tasks using Agile methodology.
 Wrote well-designed, efficient code.
 Scheduled and coordinated product releases with internal teams.
Project #3

Project Name: Information Hub (iHub)


Client: Wells Fargo Center
Company: Persistent Systems Inc.
Project Type: Banking Project.
Role: Module Lead
Technologies: MapR, Apache Spark, HDFS, Hive, Jira, Git.
Duration: Jun 2017 to Dec 2018.

Description:

Information Hub (iHub) is a common place to land, conform, and distribute large-scale datasets to serve Wells Fargo. iHub is a platform for delivering current, conformed, and consolidated data to downstream systems.
Sourcing Layer: iHub integrates with multiple SORs to deliver the information as-is to downstream systems (AFS, LoanIQ, LUCAS, and InfoLease).
Integration Layer: maps source data into a consistent, stable structure and conforms data values to establish consistent meaning.
Consuming Layer: iHub conforms data from its sources to provide consistent and accurate information based on subject area. These areas include Credit, Deposit, Treasury Management, and Securities.

Responsibility:

• Wrote multiple Apache Spark jobs for data cleaning and pre-processing.
• Processed and analysed data from the Hadoop file system using the Apache Spark API.
• Loaded data from multiple file formats into HDFS.
• Managed data coming from different sources and applications.
• Wrote Hive queries (HQL) to load data into Hive tables.
• Imported and exported data between Oracle and Hive tables and HDFS files.

Project #4

Project Name: CS (Complimentary Services)
Client: BCI (Beckman Coulter)
Company: Zensar Technologies.
Project Type: E-Commerce Project.
Role: Software Engineer
Technologies: REST Web services, Spring, Amazon S3, AWS SDK, Swagger, Jira, Git.
Duration: Aug 2016 to May 2017.
Description:

Beckman Coulter is a life-science company; its e-commerce website manages the selling of instruments that simplify, automate, and innovate complex biomedical testing.
It serves customers in two segments: Diagnostics and Life Sciences.
Diagnostics customers include hospitals and laboratories around the world that produce information used by physicians to diagnose disease, make treatment decisions, and monitor patients.
Scientists use its life-science research instruments to study complex biological problems, including causes of disease and potential new therapies or drugs.
This project involved redesigning the legacy application into micro web services, covering order processing, invoicing, product registration, etc.

Responsibility:

• Created the TDD (Technical Design Document) and updated it with review comments.
• Developed code as per the Technical Design Document and reviewed others' code against BCI standards.
• Prepared test cases and documented the application workflow.
• Tested modules in development and testing environments.
• Acted as quality awareness coordinator for the project, ensuring internal/external reviews and final inspections.

Project #5

Project Name: ACT (Agility Course Tests)
Client: AKC (American Kennel Club)
Company: Zensar Technologies.
Project Type: Event Management Business Processing.
Role: Software Engineer
Technologies: REST Web services, Spring, Apache CXF, JPA, MySQL, Jira, and Git.
Duration: Jun 2015 to Jul 2016.

Description:

ACT (Agility Course Tests) is a self-service system for setting up event data and submitting ACT event results to AKC so that participants get their points and titles.
Once the organizer logs in, the system shows Events in Progress and Event Create.
Events in Progress: events where the organizer has previously started entering results and saved them for later editing or submission, sorted by date, oldest first.
Event Create: the organizer can create an event with details such as event date, organizer details, event location, and event types.
Once event details are entered and saved, a dog result can be added for a non-registered or registered dog.
Submit and Report Event Results submits the event result and report to AKC so that participants get their points and titles.
The system generates an Event Result ID for each event whose results the organizer has submitted.
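As an illustration of the result-submission step above, an Event Result ID could be derived from the event date and a sequence number; the format shown is purely hypothetical, not AKC's actual scheme.

```java
public class EventResultId {

    // Builds a hypothetical result ID, e.g. "ACT-20160501-0007",
    // from an ISO event date and a per-day sequence number.
    public static String generate(String eventDate, int sequence) {
        return "ACT-" + eventDate.replace("-", "")
                + "-" + String.format("%04d", sequence);
    }
}
```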

Responsibility:

 Reviewed the SRD (Software Requirements Document) and gave review comments.
 Developed code for various modules and reviewed others' code.
 Prepared test cases and documented the application workflow.
 Tested each module independently.
 Acted as quality awareness coordinator for the project, ensuring internal/external reviews and final inspections.

Project #6

Project Name: myIRIS


Client: Frequentz
Company: Zensar Technologies
Project Type: Track and Trace.
Role: Software Engineer
Technologies: Web services, Cassandra, Jira, Git.
Duration: Apr 2014 to May 2015.

Description:

The objective of myIRIS mFishPRO is to allow registered users (Boat Captain / Marbelize / Customer) to achieve traceability from catch to consumption and to capture relevant traceability data.
With secure access and based on the privileges assigned, the user (Boat Captain / Marbelize / Customer) can monitor traceability and ensure compliance from point of catch to sale.

Responsibility:

 Reviewed the SRD (Software Requirements Document) and gave review comments.
 Developed code for various modules and reviewed others' code.
 Prepared test cases and documented the application workflow.
 Tested each module independently.
 Acted as quality awareness coordinator for the project, ensuring internal/external reviews and final inspections.

Project #7

Project Name: eComCat (E-Commerce Catalogue)
Client: Hewlett Packard
Company: Hewlett Packard
Project Type: E-Commerce Project.
Role: Senior Java Developer
Technologies: JSP, Spring, Hibernate, SOAP Web services, JavaScript.
Duration: Oct 2012 to Jan 2014.

Description:

eComCat is an automated catalogue development and management solution enabling region-specific, segment-specific, and customer-specific catalogues for the e-commerce environment.
eComCat aggregates data from many source systems of content and pricing into catalogues specific to regions, segments, and customers; customizes the catalogues to meet customer-specific requirements such as product localization, marketing bundles, configurations, unique marketing hierarchies, descriptions, and part numbers; and creates catalogues in many formats, either for various commerce environments or directly for customers.
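The region- and segment-specific catalogue idea above can be sketched as a simple filter over tagged entries; the product fields and region/segment tags below are hypothetical examples, not the actual eComCat data model.

```java
import java.util.ArrayList;
import java.util.List;

public class CatalogueBuilder {

    // Hypothetical catalogue entry with region and segment tags.
    public static class Product {
        public final String sku, region, segment;
        public Product(String sku, String region, String segment) {
            this.sku = sku;
            this.region = region;
            this.segment = segment;
        }
    }

    // Selects the entries visible to one region/segment combination,
    // i.e. a customer-specific view of the aggregated catalogue.
    public static List<Product> build(List<Product> all, String region, String segment) {
        List<Product> out = new ArrayList<>();
        for (Product p : all) {
            if (p.region.equals(region) && p.segment.equals(segment)) {
                out.add(p);
            }
        }
        return out;
    }
}
```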

Responsibility:
 Understood the requirements and functionalities.
 Involved in code review, unit testing, and writing test scripts.
 Wrote form beans, form validation, and controller classes.
 Developed view and controller components and Hibernate mapping files.
 Implemented both client-side and server-side validation in the project.
 Tested newly developed enhancements, covering all aspects of the functionality, before release to production.
