
8+ Years of Experience in Java

Kannan S
Phone: (+91)-6381648240
kannansbigdata@gmail.com

PROFESSIONAL SUMMARY

• 8+ years of experience in designing, developing, and maintaining large business applications covering data migration, integration, conversion, and testing.
• Involved in the entire Software Development Life Cycle (SDLC): requirement analysis/definition, design, development, testing, implementation, production fixes, documentation, and support of software applications.
• 3 years of experience designing and developing applications in the Hadoop ecosystem.
• Domain experience includes Airline, Banking & Finance, and Insurance.
• Experience designing and developing Big Data solutions using Hadoop ecosystem technologies (HDFS, Hive, Impala, Sqoop, MapReduce, Apache Spark, Cassandra, AWS).
• Worked extensively on a Hadoop migration project and POCs.
• Hands-on experience with data ingestion tools such as Apache NiFi.
• Capable of processing large sets of structured and semi-structured data.
• Drove simplification and optimization initiatives to improve application efficiency.
• Held versatile roles across diverse applications: Data Engineer, Developer, QA engineer, and automation projects.
• Able to assess business rules, collaborate with stakeholders, and perform source-to-target data mapping, design, and review.
• Working on AWS components such as S3, Lambda, EMR, and EC2.
• Automated Spark Structured Streaming jobs with Kubernetes autoscaling using Docker containers (see the sketch after this list).
• Strong database experience with MS SQL Server.
• Worked in Agile methodologies with JIRA for all sprint activities.
• Good problem-solving and analytical skills, with a drive to innovate and improve.
• Strong interpersonal, communication, and people skills for managing a team.
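
Below is a minimal sketch, in Java, of the kind of Spark Structured Streaming job referenced above. The Kafka broker, topic name, bucket, and paths are hypothetical placeholders; the Kubernetes autoscaling and Docker packaging are handled by the deployment configuration rather than by the code itself.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;
    import org.apache.spark.sql.streaming.StreamingQuery;
    import org.apache.spark.sql.streaming.Trigger;

    public class StreamingJobSketch {
        public static void main(String[] args) throws Exception {
            // Executor sizing comes from spark-submit conf; on Kubernetes,
            // the autoscaler adjusts executor counts outside the job code.
            SparkSession spark = SparkSession.builder()
                    .appName("streaming-sketch")
                    .getOrCreate();

            // Hypothetical Kafka source; broker and topic are placeholders.
            Dataset<Row> events = spark.readStream()
                    .format("kafka")
                    .option("kafka.bootstrap.servers", "broker:9092")
                    .option("subscribe", "events")
                    .load();

            // Persist the raw values as Parquet, with checkpointing for recovery.
            StreamingQuery query = events
                    .selectExpr("CAST(value AS STRING) AS value")
                    .writeStream()
                    .format("parquet")
                    .option("path", "s3a://example-bucket/stream-output/")        // placeholder
                    .option("checkpointLocation", "s3a://example-bucket/checkpoints/")
                    .trigger(Trigger.ProcessingTime("1 minute"))
                    .start();

            query.awaitTermination();
        }
    }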

TECHNICAL SKILLS

Data Ecosystem : Hadoop, Sqoop, Hive, Apache Spark, and AWS
Distribution : Cloudera 5.12
Databases : SQL Server, MySQL, PostgreSQL
Languages : Scala, Java, Python
Framework : Spring Boot
Reporting Tools : Jaspersoft Studio
Operating Systems : Linux and Windows

EDUCATION

• MCA from Jayaram College of Engineering and Technology, Anna University

• B.Sc. in Chemistry from Kamaraj College, Manonmaniam Sundaranar University


PROFESSIONAL EXPERIENCE

Project Name: IATA (International Air Transport Association) October 2019 – Present

CASSLink is IATA's internet-based data processing and customer management system, which facilitates interaction and the exchange of information between cargo intermediaries/agents and airlines participating in the IATA Cargo Accounts Settlement System (CASS). It is designed to simplify the billing and settlement of accounts between airlines and freight forwarders. It was developed for processing airline documents, Air Waybills and correction documents (CCAs and DCMs), for billing to IATA Accredited Agents and CASS Associates. It uses global standards in accepting electronic documents from airlines for processing.

Project Role: AWS Cloud Spark Developer

Responsibilities

• Imported and exported remote data to and from AWS S3.
• Imported and exported data into HDFS and Hive using Sqoop and managed data within the environment.
• Optimized Spark SQL queries, reducing compute cost for the project.
• Loaded and transformed large sets of semi-structured data such as XML, JSON, Avro, and Parquet.
• Created multiple Hive tables and ran Hive queries on them; implemented partitioning, dynamic partitioning, and bucketing in Hive for efficient data access.
• Processed web URL data using Scala and converted it to DataFrames for further transformations (see the sketch after this list).
• Generated complex JSON data after the transformations for easy storage and access, per client requirements.
• Developed Spark code and deployed it on EMR.
• Delivered the resulting data to Snowflake.
• Ran containers on AWS ECS to execute Docker images.
• Scheduled scripts on EC2 to integrate with other servers.
• Created procedures in Snowflake, built on AWS S3.
• Created AWS Lambda functions to launch ECS containers.
• Managed the team to ensure smooth project delivery.
• Worked with clients/customers to create technical strategies and frameworks.
• Line management of team members and their professional development.
• Worked with other Delivery Leads to perform impact analysis on key initiatives.
• Executed change and incident management.
• Worked with the Project Manager on producing the Project Work Package for production-related products.
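
A minimal sketch of the kind of Spark transformation and partitioned write described above, assuming semi-structured JSON input on S3 and Parquet output; the bucket, paths, filter condition, and column names are hypothetical.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SaveMode;
    import org.apache.spark.sql.SparkSession;

    public class S3TransformSketch {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("s3-transform-sketch")   // submitted to EMR via spark-submit
                    .getOrCreate();

            // Hypothetical input: semi-structured JSON documents on S3.
            Dataset<Row> raw = spark.read().json("s3://example-bucket/input/*.json");

            // Example transformation: keep valid records and project the
            // fields needed downstream (names are placeholders).
            Dataset<Row> cleaned = raw
                    .filter("status = 'ACTIVE'")
                    .select("id", "event_date", "payload");

            // Write Parquet partitioned by date for efficient access,
            // mirroring the Hive partitioning strategy described above.
            cleaned.write()
                    .mode(SaveMode.Overwrite)
                    .partitionBy("event_date")
                    .parquet("s3://example-bucket/output/");

            spark.stop();
        }
    }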

Technologies: Hadoop, Spark, HDFS, Hive, Sqoop, Spark SQL, AWS S3, EMR, EC2, Lambda

Project Name: iFlight Neo July 2018 – October 2019


iFlight Neo is a new-generation multi-tenant product solving operational and crew management problems in the airline industry. It is intended to replace existing desktop-based airline operations products with a web application built on current technologies such as Spring, Hibernate, and Jaspersoft. The product has two major parts: Operations (Ops) and the Crew Tracking System (CTS).

Project Role: Java Developer


Responsibilities

• Designed, developed, and supported reports using the Jaspersoft reporting platform and related tools.
• Participated in functional and systems requirements modelling sessions to create system requirements for the application.
• Worked on sprint user stories.
• Involved in production support activities.
• Created entity objects and established relationships between them using Hibernate annotations (see the sketch after this list).
• Wrote complex SQL queries and translated them to HQL for static queries.
• Worked on data analysis, data quality, and data profiling to support the business team.
• Performed code and peer reviews of assigned tasks, unit testing, and bug fixing.
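
A minimal sketch of the annotation-based Hibernate entity mapping described above; the entity names, columns, and one-to-many relationship are hypothetical examples, not the actual iFlight Neo model.

    import javax.persistence.*;
    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical entity with a bidirectional one-to-many relationship.
    @Entity
    @Table(name = "flight")
    public class Flight {
        @Id
        @GeneratedValue(strategy = GenerationType.IDENTITY)
        private Long id;

        @Column(name = "flight_number", nullable = false)
        private String flightNumber;

        // One flight has many crew assignments; persistence cascades to children.
        @OneToMany(mappedBy = "flight", cascade = CascadeType.ALL, orphanRemoval = true)
        private List<CrewAssignment> crew = new ArrayList<>();

        // getters/setters omitted for brevity
    }

    @Entity
    @Table(name = "crew_assignment")
    class CrewAssignment {
        @Id
        @GeneratedValue(strategy = GenerationType.IDENTITY)
        private Long id;

        // Owning side of the relationship, mapped via a foreign-key column.
        @ManyToOne(fetch = FetchType.LAZY)
        @JoinColumn(name = "flight_id")
        private Flight flight;
    }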

Technologies: Java, Spring Boot, Oracle, Hibernate, Jaspersoft

Project Name: E-Application for IFBI April 2017 – July 2018


Indiana Farm Bureau Insurance (IFBI) is the largest writer of farm insurance and the second-largest writer of personal lines insurance in the state of Indiana. This E-Application helps an agent convert paper insurance applications to e-applications. It also communicates with many third-party applications through web services to retrieve customer information, facilitating rapid delivery and approval of the desired insurance product to a customer. Thus it serves as an end-to-end, straight-through processing platform for automating new-business processing and underwriting of individual life and group insurance products.

Project Role: Java Developer

Responsibilities
• Developed various modules and worked on enhancements to the application, gaining broad exposure.
• Developed web services for clients to perform Case Import operations, and accessed the QAS and Premium Calculator services through the ESB.
• Wrote triggers and stored procedures for complex tasks; in particular, used triggers to handle asynchronous messages from JMS (see the sketch after this list).
• Created constraints for dynamic UI rendering based on multiple product combinations.
• Created views to display the data in the dashboard.
• Loaded data from disparate data sets and supported high-speed querying.
• Prepared the HBM mappings and POJOs, building with Maven.
• Created services for performing business logic and for storing/retrieving data from the database.
• Wrote test cases using JUnit with Spring IoC.
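
A minimal sketch of asynchronous JMS message handling of the kind described above, using the plain javax.jms API; the queue name and the hand-off to the case-import service are hypothetical.

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.Message;
    import javax.jms.MessageConsumer;
    import javax.jms.MessageListener;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.jms.TextMessage;

    public class CaseImportListener implements MessageListener {

        @Override
        public void onMessage(Message message) {
            try {
                if (message instanceof TextMessage) {
                    // Hypothetical payload: a case-import request document.
                    String body = ((TextMessage) message).getText();
                    // ... hand off to the case-import service here ...
                }
            } catch (Exception e) {
                // A real system would route failures to error handling
                // or a dead-letter queue rather than just logging.
                e.printStackTrace();
            }
        }

        // Wiring sketch: register the listener on a queue for async consumption.
        public static void register(ConnectionFactory factory) throws Exception {
            Connection connection = factory.createConnection();
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            Queue queue = session.createQueue("case.import.queue"); // placeholder name
            MessageConsumer consumer = session.createConsumer(queue);
            consumer.setMessageListener(new CaseImportListener());
            connection.start(); // begin asynchronous delivery
        }
    }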

Technologies: Java, Spring, Hibernate, PostgreSQL, Jaspersoft

Project Name: Toyota Tsusho Electronics May 2016 – Mar 2017


This project aimed to find a suitable method to process big data and produce the relevant information. The Apache Hadoop distributed system is used to process the big data, with Java-based programming to perform the operations. The Apache Hadoop software library is a framework for distributed computing over large data sets across clusters of computers using simple programming models. It is designed to scale from one machine to hundreds of machines, each offering local computation and storage, and to detect and handle failures. The positioning errors of probe taxis depend on the accuracy of the device itself and need to be filtered out as much as possible.
Project Role: Java/Hadoop Developer
Responsibilities

• Designed and developed modules in the SMART web application system.
• Developed the functional components of the system.
• Designed and developed the GIS database for the system.
• Created Hive tables, loaded data, and wrote Hive queries.
• Ran Hadoop MapReduce jobs to process large data sets (see the sketch after this list).
• Imported and exported data into HDFS and Hive using Sqoop and managed data within the environment.
• Managed Hive tables and created child tables based on partitions.
• Loaded and transformed large sets of semi-structured data.
• Performed code and peer reviews of assigned tasks, unit testing, and bug fixing.
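
A minimal sketch of a Hadoop MapReduce job of the kind described above; the CSV record layout and the per-device counting logic are illustrative assumptions, not the project's actual probe-data processing.

    import java.io.IOException;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;

    // Illustrative mapper: emit (deviceId, 1) for each probe record in a CSV line.
    public class ProbeCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text deviceId = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(","); // placeholder CSV layout
            if (fields.length > 0) {
                deviceId.set(fields[0]);
                context.write(deviceId, ONE);
            }
        }
    }

    // Matching reducer: sum the record counts per device.
    class ProbeCountReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }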

Technologies: Hadoop, HDFS, Hive, Sqoop, ArcGIS, Java

Project Name: OGPL Feb 2012 – Jan 2013


The Open Government Platform (OGPL) is envisioned as a platform for creating and hosting Open Data websites. The website is intended to be used by governments and their agencies to publish datasets, documents, services, tools, and applications they have collected, for public use. It enhances transparency in the functioning of government and also opens avenues for many more innovative uses of government data, offering different perspectives on a situation.

Project Role: Java Developer

Responsibilities
• Designed the data portal, which is intended to be used by governments and their agencies to publish datasets, documents, services, tools, and applications collected by them for public use.
• Developed the platform for creating and hosting Open Data websites; responsible for the creation of dataset catalogues.
• Collected open data sets from the corresponding agencies and published them on the OGPL data portal.
• Supported VRM users over the internet using the CAS/OpenID or LDAP authentication protocol, as proposed in the DMS development.
• Maintained system security work related to the firewall.
• Managed feedback and the WCMS for website management.

Technologies: Java, Drupal, Spring, GWT, MySQL.

Project Name: ICAR & NAIP Feb 2012 – Jan 2013


A platform-independent, highly scalable content delivery system for a web-based e-Learning system. The system is implemented in Java using Turbine, an open-source framework, as a secure web application. It is expected to give teachers more time to get up to date with recent advances in the subject matter and to enable students to interact with teachers more effectively and fruitfully to enhance their knowledge and skills. Further, the e-Learning modules provide students an anytime, anywhere learning opportunity.

Project Role: e-Learning Developer


Responsibilities

• Developed a platform-independent, highly scalable content delivery system for the web-based e-Learning system.
• Designed SME and student UI screens for easy interaction with the e-Learning system.
• Coded technical modules such as user and list management, course management, student management, backup and restoration, online registration, and question bank using Java, Moodle, Articulate, and Lectora.
• Implemented the web-based system in Java using Turbine, an open-source framework, as a secure web application.
• Used Maven for packaging and building the application.
• Helped facilitate internet and Wi-Fi connectivity throughout the campus.
• Monitored networks to ensure security and availability for specific users using the firewall.

Technologies: Java, Moodle, MySQL, JavaScript, Articulate.

WORK EXPERIENCE

Organization              Period
IBS Software              Jul 2018 – Present
SNS Financial Service     Apr 2017 – Jul 2018
RIMES                     May 2016 – Mar 2017
NIC                       Feb 2012 – Jan 2013
FCRI                      Dec 2009 – Dec 2011
