
CONTENTS

Chapter 1  INTRODUCTION
1.1 Introduction
1.2 Need for the study
1.3 Objective of the study
1.4 Scope of the study
1.5 Limitations of the study

Chapter 2  CONCEPTS AND REVIEW
2.1 Industry profile
2.2 Company profile
2.3 Review of literature

Chapter 3  METHODOLOGY
3.1 Research design
3.2 Data collection details

Chapter 4  DATA ANALYSIS
4.1 Data design
4.2 Data implementation
4.3 Data testing

Chapter 5  RESULTS AND DISCUSSION
5.1 Output
5.2 Sample coding
5.3 Conclusion
5.4 Concepts and review

Appendix and Bibliography
Remote Monitoring System


CHAPTER-1

ABSTRACT:
The assets of the remote sensing digital world generate massive volumes of real-time data daily (commonly referred to by the term Big Data), whose insight information has potential significance if collected and aggregated effectively. In today's era, there is a great deal more to real-time remote sensing Big Data than it seems at first, and extracting the useful information in an efficient manner leads a system toward major computational challenges, such as analyzing, aggregating, and storing data that are remotely collected. Keeping the above-mentioned factors in view, there is a need to design a system architecture that accommodates both real-time and offline data processing. Therefore, in this paper we propose a real-time Big Data analytical architecture for remote sensing satellite applications. The proposed architecture comprises three main units: 1) the remote sensing Big Data acquisition unit (RSDU); 2) the data processing unit (DPU); and 3) the data analysis and decision unit (DADU). First, the RSDU acquires data from the satellite and sends them to the Base Station, where initial processing takes place. Second, the DPU plays a vital role in the architecture by providing filtration, load balancing, and parallel processing for the efficient handling of real-time Big Data. Third, the DADU is the upper-layer unit of the proposed architecture, responsible for compilation, storage of the results, and generation of decisions based on the results received from the DPU. The proposed architecture has the capability of dividing, load balancing, and parallel processing of only the useful data; thus, it efficiently analyzes real-time remote sensing Big Data in an earth observatory system. Furthermore, the proposed architecture can store incoming raw data to perform offline analysis on large stored dumps when required. Finally, a detailed analysis of remotely sensed earth observatory Big Data for land and sea areas is provided using Hadoop. In addition, various algorithms are proposed for each level of the RSDU, DPU, and DADU to detect land as well as sea areas and to elaborate the working of the architecture.
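For concreteness, the land/sea analysis with Hadoop can be pictured as a MapReduce job. The sketch below is not the paper's own algorithm: it assumes each input line carries a pixel identifier and a reflectance value as "pixelId,reflectance", and it uses a hypothetical threshold to label pixels LAND or SEA before counting each class.

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Hypothetical job: classify each pixel record as LAND or SEA and count both classes.
public class LandSeaCount {

    public static class PixelMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        // Assumed threshold separating land from sea reflectance; not from the paper.
        private static final double THRESHOLD = 0.35;
        private static final IntWritable ONE = new IntWritable(1);

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // Assumed input format: "<pixelId>,<reflectance>"
            String[] fields = value.toString().split(",");
            if (fields.length != 2) return; // skip malformed records (filtration step)
            double reflectance = Double.parseDouble(fields[1].trim());
            context.write(new Text(reflectance > THRESHOLD ? "LAND" : "SEA"), ONE);
        }
    }

    public static class CountReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) sum += v.get();
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "land-sea-count");
        job.setJarByClass(LandSeaCount.class);
        job.setMapperClass(PixelMapper.class);
        job.setCombinerClass(CountReducer.class);
        job.setReducerClass(CountReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Thresholding a single band this way is only illustrative; the per-unit algorithms the paper proposes for the RSDU, DPU, and DADU are more involved.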

1.1 INTRODUCTION:
A great deal of interest in the field of Big Data and its analysis has arisen, mainly driven by the extensive number of research challenges strongly related to bona fide applications, such as modeling, processing, querying, mining, and distributing large-scale repositories.
The term Big Data classifies specific kinds of data sets comprising formless (unstructured) data, which dwell in the data layer of technical computing applications.
The data stored in the underlying layer of all these technical computing application scenarios have some precise characteristics in common, such as 1) large-scale data, which refers to the size of the data and the data warehouse; 2) scalability issues, which refer to applications likely to run at large scale; 3) sustaining the extraction-transformation-loading (ETL) method from low-level, raw data to well-thought-out data up to a certain extent; and 4) development of uncomplicated, interpretable analytics over Big Data warehouses with a view to delivering intelligent and meaningful knowledge from them.
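Of these characteristics, the ETL step in point 3 is the one most directly expressible in code. The following is a minimal sketch under stated assumptions: the raw comma-separated sensor record and the SensorReading target type are hypothetical, not taken from this report (Java 16+ for the record syntax).

import java.util.Locale;

// Minimal ETL sketch: extract a raw record, transform it, load it into a typed store.
public class EtlSketch {

    // Hypothetical structured target of the transformation step.
    record SensorReading(String sensorId, double celsius) {}

    // Extract: pull one raw line from the low-level source (here, a literal).
    static String extract() {
        return "sensor-17,98.6,F";
    }

    // Transform: parse, normalize units, and build the structured record.
    static SensorReading transform(String raw) {
        String[] f = raw.split(",");
        double value = Double.parseDouble(f[1]);
        double celsius = f[2].equals("F") ? (value - 32) * 5.0 / 9.0 : value;
        return new SensorReading(f[0], celsius);
    }

    // Load: hand the structured record to the warehouse (stubbed as console output).
    static void load(SensorReading r) {
        System.out.printf(Locale.ROOT, "%s -> %.2f C%n", r.sensorId(), r.celsius());
    }

    public static void main(String[] args) {
        load(transform(extract()));
    }
}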
Big Data are usually generated by online transactions, video/audio, email, numbers of clicks, logs, posts, social network data, scientific data, remote-access sensory data, mobile phones, and their applications.
These data are accumulated in databases that grow extraordinarily and become complicated to capture, form, store, manage, share, process, analyze, and visualize via typical database software tools.
Advancement in Big Data sensing and computer technology revolutionizes the way remote data are collected, processed, analyzed, and managed.
In particular, the most recently designed sensors used in earth and planetary observatory systems generate continuous streams of data.

Moreover, a majority of the work has been done in various fields of remote sensing satellite image data, such as change detection, gradient-based edge detection, region-similarity-based edge detection, and intensity gradient techniques for efficient intra-prediction.

1.3 OBJECTIVE OF THE STUDY

To make it easy to explore and filter a particular image from a specific database using SQL Server (a minimal query sketch follows this list).
To optimize the search for data, places, and unique pictures through a simplified user interface.
To secure the network link with a fast and safe gateway.
To study the factors that influence the filtration process.
To find suitable measures to reduce conflicts in the searching process.
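The sketch below illustrates the first objective, assuming a hypothetical Images table (columns Id, Name, Location, CapturedAt) in a SQL Server database reached through the Microsoft JDBC driver on the classpath; the connection string, schema, and credentials are illustrative, not taken from this report.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// Hypothetical image-filtering query against SQL Server via JDBC.
public class ImageFilter {
    public static void main(String[] args) throws Exception {
        // Assumed connection string; host, database, and credentials are placeholders.
        String url = "jdbc:sqlserver://localhost:1433;databaseName=RemoteMonitoring;user=app;password=secret";
        try (Connection conn = DriverManager.getConnection(url);
             PreparedStatement ps = conn.prepareStatement(
                     // Assumed schema: Images(Id, Name, Location, CapturedAt)
                     "SELECT Id, Name FROM Images WHERE Location = ? ORDER BY CapturedAt DESC")) {
            ps.setString(1, args.length > 0 ? args[0] : "Chennai");
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getInt("Id") + ": " + rs.getString("Name"));
                }
            }
        }
    }
}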

1.4 SCOPE OF THE STUDY:

When data are defragmented, collecting a particular item becomes complex; the proposed technique avoids these complications and the complexity of the network so as to optimize the information the system acquires.
Easy to implement using ASP.NET and a well-defined platform.
Extends to related areas of the system, such as monitoring, robotics, and networking processes.

1.5 LIMITATIONS OF THE STUDY:

Sometimes we have to deal with erroneous data, or some of the data might be imprecise.
Large-scale data, which refers to the size of the data and the data warehouse.
Scalability issues, which refer to applications likely to run at large scale.

CHAPTER-2
2.1 INDUSTRY PROFILE

The software industry includes businesses for the development, maintenance, and publication of software using different business models, mainly either "license/maintenance based" (on-premises) or "Cloud based" (such as SaaS, PaaS, IaaS, MaaS, AaaS, etc.). The industry also includes software services, such as training, documentation, and consulting.
History
The word "software" was coined as a prank as early as 1953, but did not
appear in print until the 1960s. Before this time, computers were
programmed either by customers, or the few commercial computer vendors
of the time, such as UNIVAC and IBM. The first company founded to provide
software products and services was Computer Usage Company in 1955.
The software industry expanded in the early 1960s, almost immediately after computers were first sold in mass-produced quantities. Universities, government, and business customers created a demand for software. Many of these programs were written in-house by full-time staff programmers. Some were distributed freely between users of a particular machine for no charge. Others were done on a commercial basis, and other firms such as Computer Sciences Corporation (founded in 1959) started to grow. The computer/hardware makers started bundling operating systems, systems software and programming environments with their machines.
Web development
Web development is a broad term for the work involved in developing a
web site for the Internet (World Wide Web) or an intranet (a private network).
Web development can range from developing the simplest static single page
of plain text to the most complex web-based internet applications, electronic
businesses, and social network services. A more comprehensive list of tasks
to which web development commonly refers may include web design, web content development, client liaison, client-side/server-side scripting, web server and network security configuration, and e-commerce development.


Among web professionals, "web development" usually refers to the main
non-design aspects of building web sites: writing markup and coding.
For larger organizations and businesses, web development teams can consist
of hundreds of people (web developers). Smaller organizations may only
require a single permanent or contracting webmaster, or secondary
assignment to related job positions such as a graphic designer and/or
information systems technician. Web development may be a collaborative
effort between departments rather than the domain of a designated
department.
Web development as an industry
Since the commercialization of the web, web development has been a
growing industry. The growth of this industry is being pushed especially by
businesses wishing to sell products and services to online customers.
For tools and platforms, the public can use many open source systems to aid
in web development. A popular example, the LAMP (Linux, Apache, MySQL,
PHP) stack is available for download online free of charge. This has kept the
cost of learning web development to a minimum. Another contributing factor
to the growth of the industry has been the rise of easy-to-use WYSIWYG web-development software, most prominently Adobe Dreamweaver, WebDev, and
Microsoft Expression Studio. Using such software, virtually anyone can
relatively quickly learn to develop a very basic web page. Knowledge of
HyperText Markup Language (HTML) or of programming languages is still
required to use such software, but the basics can be learned and
implemented quickly with the help of help files, technical books, internet
tutorials, or face-to-face training.
An ever-growing set of tools and technologies has helped developers build more dynamic and interactive websites. Web developers now help to deliver applications as web services which were traditionally only available as applications on a desk-based computer.
Web application development
Web application development is the process and practice of developing
web applications.
Just as with a traditional desktop application, web applications have varying
levels of risk. A personal home page is much less risky than, for example, a
stock trading web site. For some projects, security, software bugs, etc. are
major issues. If time to market or technical complexity is a concern,
documentation, test planning, change control, requirements analysis,
architectural description and formal design and construction practices can
mitigate risk.
Technologies

Ajax

ASP

ASP.NET

CSS

ColdFusion

Java

JavaScript

Perl

PHP

Ruby, including Ruby on Rails

CGI

Python

Django

HTML5

Wt Web toolkit

WebObjects

Xojo

Lifecycle Model
Time to market, company growth, and requirements churn, three things that
are emphasized in web-based business, coincide with the principles of
Agile practices. Some agile lifecycle models are:

Extreme programming

Scrum

Timebox development

Feature-driven development

Testing
Web applications undergo the same unit, integration and system testing as
traditional desktop applications. But because web application clients vary so
greatly, teams might perform some additional testing, such as:

Security

Performance, Load, and Stress

HTML/CSS validation

Accessibility

Usability

Cross-browser

Many types of tests are automatable. At the component level, one of the
xUnit packages can be a helpful tool. Or an organization can create its own
unit testing framework. At the GUI level, Watir or iMacros are useful.
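As a concrete illustration of component-level testing with an xUnit package, here is a minimal JUnit 4 test; the PriceCalculator class under test is hypothetical and included inline only to keep the sketch self-contained.

import static org.junit.Assert.assertEquals;
import org.junit.Test;

// Minimal xUnit-style component test using JUnit 4.
// PriceCalculator is a hypothetical class under test, not from this report.
public class PriceCalculatorTest {

    static class PriceCalculator {
        double withTax(double net, double rate) {
            return net * (1 + rate);
        }
    }

    @Test
    public void addsTaxToNetPrice() {
        PriceCalculator calc = new PriceCalculator();
        assertEquals(110.0, calc.withTax(100.0, 0.10), 1e-9);
    }
}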
Tools
In the case of ASP.NET, a developer can use Microsoft Visual Studio to write
code. But, as with most other programming languages, he/she can also use a
text editor. Notepad++ is an example. WebORB Integration Server for .NET
can be used to integrate .NET services, data and media with any web client.
It includes developer productivity tools and APIs for remoting, messaging and
data management.
For ColdFusion and the related open source CFML engines, there are several
tools available for writing code. These include Adobe Dreamweaver CS4, the
CFEclipse plugin for Eclipse, and Adobe CF Builder. You can also use any
text editor such as Notepad++ or TextEdit.

For PHP, the Zend Development Environment provides numerous debugging


tools and provides a rich feature set to make a PHP developer's life easier.
WebORB Integration Server for PHP can be used to integrate PHP classes and
data with any web client. It includes developer productivity tools and APIs for
remoting, messaging and data management. Tools such as Hammerkit
abstract PHP into a visual programming environment and utilise component-based software methods to accelerate development.
For Java, there are many tools. The most popular is
Apache Tomcat, but there are many others. One very specific one is WebORB
Integration Server which can be used to integrate Java services, data and
media with any web client. It includes developer productivity tools and APIs
for remoting, messaging and data management.
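For context, the artifact one deploys to a container such as Tomcat is, at minimum, a servlet class. A sketch follows; the class name and /hello mapping are illustrative, and the javax.servlet namespace assumes Tomcat 9 or earlier (Tomcat 10+ uses jakarta.servlet instead).

import java.io.IOException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Minimal servlet as deployed to a container such as Apache Tomcat.
// The /hello mapping is illustrative.
@WebServlet("/hello")
public class HelloServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        resp.setContentType("text/plain");
        resp.getWriter().println("Hello from Tomcat");
    }
}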
Several code generation tools such as nuBuilder, dbQwikSite or M-Power are
available to automate the development of code. Using such tools, non-technical
users can produce working code, and experienced coders can accelerate the
development cycle.

2.2 COMPANY PROFILE


FHAPL Technologies, a website designing company in Chennai, India, is expert
in website design and website development. We provide a wide range of
designing services, inclusive of visually appealing and user-friendly websites.

Our website designers are proficient in Photoshop, HTML 5.0, XHTML, CSS,
and Web 2.0 standards. We serve our global clients with a highly trained and
professional team of experts who have years of experience in this field.
Our web developers have the expertise to deliver a well-planned and executed
web solution for your business. They also have the ability to improve your
existing website to ensure a successful online business application. With our
easy-to-use CMS solution, you can update your website anytime, anywhere.
FHAPL Technologies specializes in providing Joomla, WordPress, mobile
websites, and several other applications.
FHAPL Technologies, a web development company, employs an experienced
team of professional PHP programmers and talented designers who have been
delivering excellent projects to our customers since 2007. FHAPL Technologies
is a successful web design and development company established in Chennai,
providing customized services to clients located throughout the world.
Professional SEO (Search Engine Optimization) and SEM (Search Engine
Marketing) are some of our core services for our esteemed clients.
FHAPL Technologies is a proven outsourcing partner that transforms the
delivery and economics of integrated IT solutions to achieve superior quality
and real value.

FHAPL is a software development solutions brand providing full-featured
custom software applications and web solutions, and acting as an offshore
development center for overseas development firms.
We provide optimum custom software solutions with perfect end-user
satisfaction. Each project is tailored to meet every aspect of your business
needs. We have very well-defined IT processes, which result in excellent
project planning and time-bound execution. We provide a seamless approach
to business, technology, and professional services by combining excellence
in each.
All projects deliver predictable outcomes, clear status visibility, critical
functionality early, and continuous control of cost and schedule. All of our
services are delivered under our collaborative delivery model, which has
proven to be a perfect and cost-effective solution for our clients. We
custom-build software solutions that enlarge the vision of your business
value and lead it along a promising path.

2.3 LITERATURE SURVEY
1. An Architecture to Support the Collection of Big Data in the Internet of Things

The Internet of Things (IoT) relies on physical objects interconnected with each other, creating a mesh of devices producing information. In this context, sensors surround our environment (e.g., cars, buildings, smart phones) and continuously collect data about our living environment.
Thus, the IoT is a prototypical example of Big Data. The contribution of this paper is to define a software architecture supporting the collection of sensor-based data in the context of the IoT. The architecture goes from the physical dimension of sensors to the storage of data in a cloud-based system.
It supports the Big Data research effort, as its instantiation supports a user while collecting data from the IoT for experimental or production purposes. The results are instantiated and validated on a project named SMARTCAMPUS, which aims to equip the SophiaTech campus with sensors to build innovative applications that support end-users.

2. BIG Data Analytics: A Framework for Unstructured Data Analysis

Nowadays, most of the information saved in companies is in unstructured models. Retrieval and extraction of this information are essential tasks of great importance in semantic web areas. Many of these requirements depend on unstructured data analysis. More than 80% of all potentially useful business information is unstructured data, in the form of sensor readings, console logs, and so on.
The large number and complexity of unstructured data open up many new possibilities for the analyst. Text mining and natural language processing are two techniques, with their methods for knowledge discovery from textual context in documents. This is an approach to organize complex unstructured data and to retrieve necessary information.
The paper aims to find an efficient way of storing unstructured data and an appropriate approach to fetching them. The unstructured data targeted for organization in this work are the public tweets of Twitter. The pragmatic approach of this project is to build a Big Data application that gets a stream of public tweets from Twitter, which is later stored in HBase using a Hadoop cluster, followed by analysis of the data retrieved from HBase by REST calls.
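For orientation, the storage step described above can be sketched with the HBase Java client as follows; the tweets table, the cf column family, and the field values are assumed names for illustration, not details from the surveyed paper.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

// Sketch: store one public tweet in HBase. Table and column names are assumed.
public class TweetStore {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf("tweets"))) {
            // Row key: tweet id; column family "cf" holds the tweet fields.
            Put put = new Put(Bytes.toBytes("tweet-1024"));
            put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("text"),
                    Bytes.toBytes("sample tweet body"));
            put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("user"),
                    Bytes.toBytes("someUser"));
            table.put(put);
        }
    }
}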

3. Challenges and Opportunities with Big Data

The promise of data-driven decision-making is now being recognized broadly, and there is growing enthusiasm for the notion of Big Data, including the recent announcement from the White House about new funding initiatives across different agencies that target research on Big Data.
While the promise of Big Data is real (for example, it is estimated that Google alone contributed 54 billion dollars to the US economy in 2009), there is no clear consensus on what Big Data is.
In fact, there have been many controversial statements about Big Data, such as "Size is the only thing that matters." In this panel we will try to explore the controversies and debunk the myths surrounding Big Data.

4. Change Detection in Synthetic Aperture Radar Images Based on Fuzzy Active Contour Models and Genetic Algorithms

This paper presents an unsupervised change detection approach for synthetic aperture radar images based on a fuzzy active contour model and a genetic algorithm. The aim is to partition the difference image, which is generated from multitemporal satellite images, into changed and unchanged regions.
The fuzzy technique is an appropriate approach to analyze the difference image, where regions are not always statistically homogeneous.
Since interval type-2 fuzzy sets are well suited for modeling various uncertainties in comparison to traditional fuzzy sets, they are combined with active contour methodology for properly modeling uncertainties in the difference image. The interval type-2 fuzzy active contour model is designed to provide preliminary analysis of the difference image by generating intermediate change detection masks.
Each intermediate change detection mask has a cost value. A genetic algorithm is employed to find the final change detection mask with the minimum cost value by evolving the realization of intermediate change detection masks. Experimental results on real synthetic aperture radar images demonstrate that change detection results obtained by the improved fuzzy active contour model exhibit less error than previous approaches.
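The first step the paper describes, generating the difference image from two co-registered multitemporal images, can be sketched as a plain per-pixel absolute difference. This is only illustrative: SAR change detection often uses a log-ratio operator instead, and the fuzzy and genetic stages of the paper are not reproduced here.

// Sketch of difference-image generation from two co-registered
// multitemporal grayscale images, the input to change detection.
// The fuzzy active contour and genetic algorithm stages are not shown.
public class DifferenceImage {

    static double[][] difference(double[][] before, double[][] after) {
        int rows = before.length, cols = before[0].length;
        double[][] diff = new double[rows][cols];
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                // High values suggest changed regions, low values unchanged ones.
                diff[r][c] = Math.abs(after[r][c] - before[r][c]);
            }
        }
        return diff;
    }

    public static void main(String[] args) {
        double[][] t1 = {{0.2, 0.2}, {0.8, 0.8}};
        double[][] t2 = {{0.2, 0.9}, {0.8, 0.1}};
        double[][] d = difference(t1, t2);
        System.out.println(d[0][1] + " " + d[1][1]); // prints 0.7 0.7
    }
}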

5. A Big Data Architecture for Large Scale Security Monitoring

Network traffic is a rich source of information for security monitoring. However, the increasing volume of data to process raises issues, rendering holistic analysis of network traffic difficult.
In this paper we propose a solution to cope with the tremendous amount of data to analyse for security monitoring purposes. We introduce an architecture dedicated to security monitoring of local enterprise networks.
The application domain of such a system is mainly network intrusion detection and prevention, but it can be used as well for forensic analysis.
This architecture integrates two systems, one dedicated to scalable distributed data storage and management and the other dedicated to data exploitation.
DNS data, NetFlow records, HTTP traffic, and honeypot data are mined and correlated in a distributed system that leverages state-of-the-art big data solutions.
Data correlation schemes are proposed, and their performance is evaluated against several well-known big data frameworks, including Hadoop and Spark.
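To give a flavor of the correlation being evaluated, here is a minimal Spark (Java API) sketch that joins NetFlow-style records with DNS answers on the destination IP; the record shapes and inline values are assumed for illustration, and real input would come from distributed storage rather than parallelized literals.

import java.util.Arrays;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;

// Sketch: correlate NetFlow destination IPs with DNS answers in Spark.
// The inline records are illustrative; real input would come from HDFS.
public class FlowDnsJoin {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("flow-dns-join").setMaster("local[*]");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            // (dstIp, bytes) pairs from NetFlow records.
            JavaPairRDD<String, Long> flows = sc.parallelizePairs(Arrays.asList(
                    new Tuple2<>("93.184.216.34", 5120L),
                    new Tuple2<>("10.0.0.7", 300L)));
            // (answerIp, queriedName) pairs from DNS responses.
            JavaPairRDD<String, String> dns = sc.parallelizePairs(Arrays.asList(
                    new Tuple2<>("93.184.216.34", "example.com")));
            // Join on IP: flows whose destination appeared in a DNS answer, with the name.
            flows.join(dns).foreach(t ->
                    System.out.println(t._1 + " -> " + t._2._2 + " (" + t._2._1 + " bytes)"));
        }
    }
}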

CHAPTER-3
RESEARCH METHODOLOGY

Research methodology refers to the methods that the researcher uses in studying the research problem, along with the logic behind them.

RESEARCH OBJECTIVES
To build a Remote Monitoring System that will simplify and predict the retrieval of particular images from the web server effectively and monitor the exclusive system requirements.

TYPE OF RESEARCH DESIGN
The research design involved is of the experimental type, used to realize the project as a monitoring system.
Quasi-experimental design: one-shot design (after only).

DATA
In this research methodology, the data source is collected entirely through an observation process and technique.
