Kannan S
Phone: (+91)-6381648240
kannansbigdata@gmail.com
PROFESSIONAL SUMMARY
8+ years of experience in designing, developing, and maintaining large business applications covering data migration, integration, conversion, and testing.
Involved in the entire Software Development Life Cycle (SDLC): requirement analysis/definition, design, development, testing, implementation, production fixes, documentation, and support of software applications.
3 years of experience designing and developing applications in the Hadoop ecosystem.
Domain experience includes Airline, Banking & Finance, and Insurance.
Experienced in the design and development of Big Data solutions using Hadoop ecosystem technologies (HDFS, Hive, Impala, Sqoop, MapReduce, Apache Spark, Cassandra, AWS).
Worked extensively on Hadoop migration projects and POCs.
Hands-on experience with data ingestion tools such as Apache NiFi.
Capable of processing large sets of structured and semi-structured data.
Drove simplification and optimization initiatives to bring efficiency into applications.
Handled versatile roles across diverse applications and automation projects, including Data Engineer, Developer, and QA Engineer.
Able to assess business rules, collaborate with stakeholders, and perform source-to-target data mapping, design, and review.
Working with AWS components such as S3, Lambda, EMR, and EC2.
Automated Spark Structured Streaming jobs with a Kubernetes autoscaling mechanism using Docker containers (a minimal job sketch follows this summary).
Strong database experience with MS SQL Server.
Worked in Agile methodologies, using JIRA for all sprint activities.
Good problem-solving and analytical skills, with a drive to innovate in order to perform better.
Strong interpersonal, communication, and people skills for managing a team.
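A minimal sketch of the kind of Spark Structured Streaming job referenced in the summary above, assuming a Kafka source, a simple windowed count, and placeholder broker, topic, and checkpoint values; the Kubernetes autoscaling itself is deployment configuration and is not shown here.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Minimal Structured Streaming job; broker, topic, and checkpoint values are placeholders.
object StreamingJobSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("structured-streaming-sketch")
      .getOrCreate()

    // Read a stream of records from Kafka (the source details are assumptions).
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events-topic")
      .load()
      .selectExpr("CAST(value AS STRING) AS value", "timestamp")

    // Windowed count as a stand-in for the real business logic.
    val counts = events
      .withWatermark("timestamp", "10 minutes")
      .groupBy(window(col("timestamp"), "5 minutes"))
      .count()

    // Write results; on Kubernetes, executor pods can be scaled by the
    // cluster's autoscaling mechanism without changing this code.
    counts.writeStream
      .outputMode("update")
      .format("console")
      .option("checkpointLocation", "/tmp/checkpoints/streaming-sketch")
      .start()
      .awaitTermination()
  }
}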
TECHNICAL SKILLS
Data Ecosystem : Hadoop, Sqoop, Hive, Apache Spark, and AWS
Distribution : Cloudera 5.12
Databases : SQL Server, MySQL, PostgreSQL
Languages : Scala, Java, Python
Framework : Spring Boot
Reporting Tools : Jaspersoft Studio
Operating Systems : Linux and Windows
EDUCATION
Project Name: IATA (International Air Transport Association) October 2019 – Present
CASSLink is IATA's internet-based data processing and customer management system, which facilitates interaction and exchange of information between cargo intermediaries/agents and airlines participating in the IATA Cargo Accounts Settlement System (CASS). It is designed to simplify the billing and settlement of accounts between airlines and freight forwarders. It processes airline documents (Air Waybills and correction documents such as CCAs and DCMs) for billing to IATA Accredited Agents and CASS Associates, and it uses global standards for accepting electronic documents from airlines for processing.
Responsibilities
Technologies: Hadoop, Spark, HDFS, Hive, Sqoop, Spark SQL, AWS S3, EMR, EC2, Lambda
Design, develop, and support reports using the Jaspersoft reporting platform and related tools (a simplified Spark SQL data-preparation sketch follows this list).
Participated in Functional and Systems Requirements Modelling sessions to create System Requirements
for the application
Involved in working on sprint user stories
Involved in production support activities.
Responsible for creating various entity objects and establishing relationships between them using Hibernate annotations.
Wrote complex SQL queries and transformed them to HQL for static queries.
Worked on data analysis, data quality, and data profiling to support the business team.
Code and peer review of assigned tasks; unit testing and bug fixing.
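A simplified sketch of preparing report-ready billing data from Hive with Spark SQL, as mentioned above; the database, table, column, and S3 bucket names are hypothetical placeholders, not the actual CASSLink schema.

import org.apache.spark.sql.SparkSession

// Sketch of preparing report-ready data from Hive with Spark SQL.
// Database, table, column, and bucket names are placeholders.
object BillingReportPrep {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("billing-report-prep")
      .enableHiveSupport()
      .getOrCreate()

    // Aggregate charges per agent and billing period (illustrative query only).
    val summary = spark.sql(
      """
        |SELECT agent_code,
        |       billing_period,
        |       SUM(total_charge) AS total_billed
        |FROM cass.air_waybills
        |GROUP BY agent_code, billing_period
      """.stripMargin)

    // Persist to S3 as Parquet so a downstream reporting tool can pick it up.
    summary.write
      .mode("overwrite")
      .parquet("s3://example-bucket/reports/billing_summary/")

    spark.stop()
  }
}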
Responsibilities
Involved in developing different modules and enhancements of the application, gaining good exposure to it.
Involved in developing web services for clients to perform Case Import operations, and accessed QAS and Premium Calculator services through the ESB.
Wrote triggers and procedures to handle complex tasks; in particular, used triggers to handle asynchronous call messages from JMS.
Created Constraints for Dynamic UI rendering based on multiple product combinations.
Created Views to display the data in the Dashboard.
Loading data from disparate data sets and enabling high-speed querying.
Preparing the HBMs and POJOs using Maven.
Creating services for performing business logic or storing/ retrieving data from database.
Wrote test cases using JUnit with the help of Spring IoC.
Designed and developed modules in the SMART web application system.
Involved in developing the functional components of the system.
Designed and developed the GIS database for the system.
Involved in creating Hive tables, data loading and writing hive queries.
Handled Hadoop MapReduce jobs to process large data sets.
Performed Import and Export of data into HDFS and Hive using Sqoop and managed data within the
environment.
Managed Hive tables and created child tables based on partitions (a simplified sketch follows this list).
Loaded and transformed large sets of semi-structured data.
Code and peer review of assigned tasks; unit testing and bug fixing.
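A simplified sketch of the Hive table management referenced above, using Spark's Hive support; the database, table, column, and HDFS path names are illustrative assumptions.

import org.apache.spark.sql.SparkSession

// Sketch of creating and loading a partitioned Hive table from Spark.
// Names and paths are illustrative placeholders.
object HivePartitionSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("hive-partition-sketch")
      .enableHiveSupport()
      .getOrCreate()

    // Create a Hive table partitioned by load date.
    spark.sql(
      """
        |CREATE TABLE IF NOT EXISTS staging.transactions (
        |  txn_id STRING,
        |  amount DOUBLE
        |)
        |PARTITIONED BY (load_date STRING)
        |STORED AS PARQUET
      """.stripMargin)

    // Allow dynamic partition inserts.
    spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")

    // Append data landed in HDFS (for example by a Sqoop import); the landed
    // data is assumed to already carry load_date as its last column.
    spark.read
      .parquet("hdfs:///landing/transactions/")
      .write
      .mode("append")
      .insertInto("staging.transactions")

    spark.stop()
  }
}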
Responsibilities
Designed the data portal intended to be used by governments and their agencies to publish datasets, documents, services, tools, and applications collected by them for public use.
Involved in developing a platform for creating and hosting Open Data websites; responsible for the creation of dataset catalogues.
Collected open data sets from the corresponding agencies and published them in the OGPL data portal.
Supported VRM users over the internet using CAS/OpenID or LDAP authentication protocols, as proposed in the DMS development.
Maintained system security work related to the firewall.
Managed feedback and the WCMS for website management.
Involved in developing a platform-independent, highly scalable content delivery system for a web-based eLearning system.
Designed SME and student UI screens for easy interaction with the eLearning system.
Coded technical modules such as User and List Management, Course Management, Student Management, Backup and Restoration, Online Registration, and Question Bank using Java, Moodle, Articulate, and Lectora for the application.
The web-based system is implemented in Java using Turbine, an open-source framework, as a secure web application.
Used Maven for packaging and building the application.
Involved in facilitating Internet and Wi-Fi connectivity throughout the campus.
Monitored networks to ensure security and availability to specific users using the firewall.
WORK EXPERIENCE
Organization Period
IBS Software Jul 2018 – Present
SNS Financial Service Apr 2017 – Jul 2018
RIMES May 2016 – Mar 2017
NIC Feb 2012 – Jan 2013
FCRI Dec 2009 – Dec 2011