
MISBEHAVE USERS ANALYSIS WITH MACHINE

LEARNING FOR WEB REPOSITORY RESULTS

ABSTRACT:

Information sharing is the key goal of cloud storage servers. They allow the storage
of sensitive and large volumes of data at limited cost and with high access benefits.
Security must be given due importance for cloud data, with utmost care for the data
and confidence for the data owner. Encryption, however, limits the utilization of the
data through plain-text search. Hence an effective methodology is required to match
the search keywords with encrypted cloud data. The proposed approach, the similarity
measure of “coordinate matching” combined with “inner product similarity”,
quantitatively evaluates and matches all relevant data with the search keywords to
arrive at the best results. In this approach, each document is associated with a binary
vector that represents the keywords contained in the document. The search query is
also described as a binary vector, so the similarity can be exactly measured by the
inner product of the query vector with the data vector. The inner product computation
and the two multi-keyword ranked search over encrypted data (MRSE) schemes ensure
data privacy, provide detailed information about dynamic operations on the data set
and index, and hence improve the search experience of the user.
TABLE OF CONTENTS

TITLE

ABSTRACT

LIST OF TABLES

LIST OF FIGURES

LIST OF ABBREVIATIONS

1. INTRODUCTION

1.1 Synopsis

1.2 Objective

2. SYSTEM ANALYSIS

2.1 Existing System

2.2 Proposed System

3. REQUIREMENTS AND SPECIFICATION

3.1 System Requirement Specification

3.2 System Requirement

3.2.1 Hardware Requirements


3.2.2 Software Requirements

4. SOFTWARE DESCRIPTION

4.1. Features of Dot net

4.2. The Dot Net framework

4.3. Languages supported by Dot Net

4.4. Objectives of Dot Net Framework

4.5. Features of SQL Server

5. SYSTEM DESIGN

5.1 Use case diagram

5.2 Activity diagram

5.3 Sequence diagram

5.4 Collaboration diagram

5.5 Data flow diagram

5.6 Architecture diagram

5.7 Entity Relationship diagram

5.8 Class diagram


6. IMPLEMENTATION

6.1 Module description

1. User authentication module

2. Name search module

3. Stop word removal module

4. Multi-keyword search module

5. Individual page updation module

6. Priority based search module

7. Performance evaluation module

7. SYSTEM TESTING

7.1 Testing Objectives

7.2 Types of tests

7.2.1 Unit testing

7.2.2 Integration testing

7.2.3 Functional testing

7.3 System Test

7.4 Test Strategy and Approaches

8. CONCLUSION
8.1 Summary

8.2 Future Enhancement

APPENDIX 1

APPENDIX 2

LITERATURE REVIEW

REFERENCES

1. INTRODUCTION
1.1 Synopsis:
The term Cloud refers to a network or the Internet. In other words, we can say
that the Cloud is something that is present at a remote location. The Cloud can
provide services over a network. Service models are the reference models on which
Cloud Computing is based. These can be classified into three basic service
models as listed below:
Infrastructure as a Service (IaaS)
Platform as a Service (PaaS)
Software as a Service (SaaS)
There are many other service models, all of which can take a form like SaaS,
i.e., Anything as a Service (XaaS). This can be Network as a Service, Business
as a Service, Identity as a Service, Information as a Service or Strategy as a
Service. Infrastructure as a Service (IaaS) is the most basic level of service,
and each of the service models makes use of the underlying service model. The
proposed approach, the similarity measure of “coordinate matching” combined with
“inner product similarity”, quantitatively evaluates and matches all relevant
information with the search keywords to arrive at the best results. The user can
then upload the same document with changes, and the modified words of that
document are updated in the individual page.

1.2 OBJECTIVE:

The proposed concept mainly focuses on the individual page updation of multi-page
documents. We introduce a system that updates only the particular modified page or
word in a document when the document is uploaded again.

Scope:

SYSTEM STUDY

FEASIBILITY STUDY

The feasibility of the project is analyzed in this phase, and a business proposal
is put forth with a very general plan for the project and some cost estimates.
During system analysis the feasibility study of the proposed system is carried
out. This is to ensure that the proposed system is not a burden to the company. For
feasibility analysis, some understanding of the major requirements for the system
is essential.

Three key considerations involved in the feasibility analysis are

 ECONOMICAL FEASIBILITY
 TECHNICAL FEASIBILITY
 SOCIAL FEASIBILITY

ECONOMICAL FEASIBILITY

This study is carried out to check the economic impact that the system
will have on the organization. The amount of funds that the company can pour into
the research and development of the system is limited. The expenditures must be
justified. Thus the developed system is well within the budget, and this was
achieved because most of the technologies used are freely available. Only the
customized products had to be purchased.

TECHNICAL FEASIBILITY
This study is carried out to check the technical feasibility, that is, the
technical requirements of the system. Any system developed must not place a high
demand on the available technical resources, as this would lead to high demands
being placed on the client. The developed system must have modest requirements, as
only minimal or no changes are required for implementing this system.

SOCIAL FEASIBILITY
This aspect of the study is to check the level of acceptance of the system by
the user. This includes the process of training the user to use the system efficiently.
The user must not feel threatened by the system; instead, the user must accept it as a
necessity. The level of acceptance by the users solely depends on the methods that
are employed to educate the users about the system and to make them familiar with
it. Their level of confidence must be raised so that they are also able to make some
constructive criticism, which is welcomed, as they are the final users of the system.
2. SYSTEM ANALYSIS

System Analysis is a combined process of dissecting the system
responsibilities based on the problem domain characteristics and user
requirements.
2.1 EXISTING SYSTEM:

 With the large number of data users and documents in the cloud, it is crucial
for the search service to allow multi-keyword queries and to provide
result similarity ranking to meet the effective data retrieval need.
 Existing searchable encryption focuses on single-keyword search or
Boolean keyword search, and rarely differentiates the search results.
 Through the stop word concept, unwanted keywords are removed.
 Documents are searched by name and not by content, so both relevant
and irrelevant information is retrieved.
 The existing system uses the MD5 algorithm.

Drawbacks of the Existing System:


 Single-keyword search without ranking
 Boolean keyword search without ranking
2.2 PROPOSED SYSTEM:
 We define and solve the challenging problem of privacy-preserving
multi-keyword ranked search over encrypted cloud data (MRSE), and
establish a set of strict privacy requirements for such a secure cloud
data utilization system to become a reality.
 Among various multi-keyword semantics, we choose the efficient
principle of “coordinate matching”.
 In the proposed system, each page is defined as public or private and
stored accordingly.
 Individual page updation is supported in this system.
 Documents (e.g., abc.doc) are ranked using the multi-keyword concept.
 A checksum value is maintained for each page.
Merits of the Proposed System:
 Multi-keyword ranked search over encrypted cloud data (MRSE)
 “Coordinate matching” by inner product similarity.
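
As an illustration of how coordinate matching with inner product similarity can rank
documents, a minimal C# sketch follows. It shows only the plain-text scoring idea and
is not the project's actual code: in the real MRSE scheme the vectors and the index
are encrypted, and the keyword dictionary, document names and keyword lists used
here are hypothetical.

using System;
using System.Collections.Generic;
using System.Linq;

class CoordinateMatchingDemo
{
    // Builds a binary vector over a fixed keyword dictionary:
    // 1 if the document (or query) contains the keyword, otherwise 0.
    static int[] ToBinaryVector(IEnumerable<string> words, string[] dictionary)
    {
        var set = new HashSet<string>(words, StringComparer.OrdinalIgnoreCase);
        return dictionary.Select(k => set.Contains(k) ? 1 : 0).ToArray();
    }

    // The inner product of the query vector and a document vector counts
    // the number of matched query keywords (coordinate matching).
    static int InnerProduct(int[] query, int[] doc)
    {
        int score = 0;
        for (int i = 0; i < query.Length; i++)
            score += query[i] * doc[i];
        return score;
    }

    static void Main()
    {
        string[] dictionary = { "cloud", "search", "privacy", "ranking", "index" };

        var documents = new Dictionary<string, string[]>
        {
            { "doc1.doc", new[] { "cloud", "privacy", "index" } },
            { "doc2.doc", new[] { "search", "ranking" } }
        };

        int[] queryVector = ToBinaryVector(new[] { "cloud", "privacy" }, dictionary);

        // Rank documents by descending similarity score.
        var ranked = documents
            .Select(d => new
            {
                Name = d.Key,
                Score = InnerProduct(queryVector, ToBinaryVector(d.Value, dictionary))
            })
            .OrderByDescending(d => d.Score);

        foreach (var d in ranked)
            Console.WriteLine("{0}: score {1}", d.Name, d.Score);
    }
}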
3. REQUIREMENTS AND SPECIFICATION

3.1 SOFTWARE REQUIREMENTS SPECIFICATION


The software requirements specification is produced at the culmination of the
analysis task. The function and performance allocated to software as part of system
engineering are refined by establishing a complete information description, a
functional representation of system behavior, an indication of performance
requirements and design constraints, and appropriate validation criteria.

3.2 SYSTEM REQUIREMENTS


3.2.1 HARDWARE REQUIREMENTS:
• System : Pentium IV or above
• Hard Disk : 40 GB or above
• Monitor : VGA or high-resolution monitor
• Mouse : Logitech
• RAM : 1 GB or above

3.2.2 SOFTWARE REQUIREMENTS:


• Operating System : Windows Vista or above
• Front End : Microsoft Visual Studio 2010 (.NET Framework 4.0)
• Coding Language : C# (C Sharp)
• Back End : SQL Server 2008 R2
4. SOFTWARE DESCRIPTION
4.1. FEATURES OF .NET

Microsoft .NET is a set of Microsoft software technologies for rapidly
building and integrating XML Web services, Microsoft Windows-based
applications, and Web solutions. The Microsoft .NET Framework version 4.0
redistributable package (released 2010-04-12) installs the .NET Framework runtime
and associated files that are required to run and develop applications that target
the .NET Framework 4.

The Microsoft .NET Framework 4 provides the following new features and
improvements:

 The .NET Framework 4 works side by side with the Framework version 3.5
SP1. Applications that are based on earlier versions of the Framework will
continue to run on that version. Just a subset of functionality is shared by all
versions of the Framework.
 Innovations in the Visual Basic and C# languages, for example statement
lambdas, implicit line continuations, dynamic dispatch, and named/optional
parameters.
 The ADO.NET Entity Framework, which simplifies how developers
program against relational databases by raising the level of abstraction, has
many new features. These include persistence ignorance and POCO
support, lazy loading, test-driven development support, functions in the model,
and new LINQ operators.
 Enhancements to ASP.NET:
o New JavaScript UI Templates and databinding capabilities for
AJAX.
o New ASP.NET chart control.
 Improvements in WPF:
o Added support in Windows Presentation Foundation (WPF) for
Windows 7 multi-touch, ribbon controls, and taskbar extensibility
features.
o Added support in WPF for the Surface 2.0 SDK.
o New line-of-business controls, including a charting control, smart edit,
data grid, and others that improve the experience for developers who
build data-centric applications.
o Improvements in performance and scalability.
o Visual improvements in text clarity, layout pixel snapping,
localization, and interoperability.
 Improvements to Windows Workflow Foundation (WF) that let developers better
host and interact with workflows. These include an improved activity
programming model, an improved designer experience, a new flowchart
modeling style, an expanded activity palette, workflow-rules integration, and
new message correlation features. The .NET Framework 4 also offers
significant performance gains for WF-based workflows.
 Improvements to Windows Communication Foundation (WCF) such as
support for WCF Workflow Services enabling workflow programs with
messaging activities, correlation support, durable two-way communication
and rich hosting capabilities. Additionally, .NET Framework 4 provides new
WCF features such as service discovery, router service, simplified
configuration and a number of improvements to queuing, REST support,
diagnostics, and performance.
 Innovative new parallel programming features such as parallel loop support,
Task Parallel Library (TPL), Parallel LINQ (PLINQ), and coordination data
structures which let developers harness the power of multi-core processors.

4.2. THE .NET FRAMEWORK

The .NET Framework has two main parts:

1. The Common Language Runtime (CLR).

2. A hierarchical set of class libraries.

The CLR is described as the “execution engine” of .NET. It provides the


environment within which programs run. The most important features are

 Conversion from a low-level assembler-style language, called


Intermediate Language (IL), into code native to the platform being
executed on.
 Memory management, notably including garbage collection.
 Checking and enforcing security restrictions on the running code.
 Loading and executing programs, with version control and other such
features.
 The following features of the .NET Framework are also worth
describing:
Managed Code

Managed code is code that targets .NET and contains certain extra information -
“metadata” - to describe itself. Whilst both managed and unmanaged code can run
in the runtime, only managed code contains the information that allows the CLR to
guarantee, for instance, safe execution and interoperability.
Managed Data

With managed code comes managed data. The CLR provides memory
allocation and deallocation facilities, and garbage collection. Some .NET
languages use managed data by default, such as C#, Visual Basic.NET and
JScript.NET, whereas others, namely C++, do not. Targeting the CLR can, depending
on the language you’re using, impose certain constraints on the features available.
As with managed and unmanaged code, one can have both managed and
unmanaged data in .NET applications - data that doesn’t get garbage collected but
instead is looked after by unmanaged code.

Common Type System

The CLR uses something called the Common Type System (CTS) to strictly
enforce type-safety. This ensures that all classes are compatible with each other, by
describing types in a common way. The CTS defines how types work within the
runtime, which enables types in one language to interoperate with types in another
language, including cross-language exception handling. As well as ensuring that
types are only used in appropriate ways, the runtime also ensures that code doesn’t
attempt to access memory that hasn’t been allocated to it.

Common Language Specification

The CLR provides built-in support for language interoperability. To ensure


that you can develop managed code that can be fully used by developers using any
programming language, a set of language features and rules for using them called
the Common Language Specification (CLS) has been defined. Components that
follow these rules and expose only CLS features are considered CLS-compliant.
THE CLASS LIBRARY

.NET provides a single-rooted hierarchy of classes, containing over 7000


types. The root of the namespace is called System; this contains basic types like
Byte, Double, Boolean, and String, as well as Object. All objects derive from
System.Object. As well as objects, there are value types. Value types can be
allocated on the stack, which can provide useful flexibility. There are also efficient
means of converting value types to object types if and when necessary.

The set of classes is pretty comprehensive, providing collections, file,


screen, and network I/O, threading, and so on, as well as XML and database
connectivity.

The class library is subdivided into a number of sets (or namespaces),


each providing distinct areas of functionality, with dependencies between the
namespaces kept to a minimum.

4.3. LANGUAGES SUPPORTED BY .NET

The multi-language capability of the .NET Framework and Visual Studio


.NET enables developers to use their existing programming skills to build all types
of applications and XML Web services. The .NET framework supports new
versions of Microsoft’s old favorites Visual Basic and C++ (as VB.NET and
Managed C++), but there are also a number of new additions to the family.

Visual Basic .NET has been updated to include many new and improved
language features that make it a powerful object-oriented programming language.
These features include inheritance, interfaces, and overloading, among others.
Visual Basic also now supports structured exception handling, custom attributes
and also supports multi-threading.

Visual Basic .NET is also CLS compliant, which means that any CLS-
compliant language can use the classes, objects, and components you create in
Visual Basic .NET.

Managed Extensions for C++ and attributed programming are just


some of the enhancements made to the C++ language. Managed Extensions
simplify the task of migrating existing C++ applications to the new .NET
Framework.

C# is Microsoft’s new language. It’s a C-style language that is


essentially “C++ for Rapid Application Development”. Unlike other languages, its
specification is just the grammar of the language. It has no standard library of its
own, and instead has been designed with the intention of using the .NET libraries
as its own.

Microsoft Visual J# .NET provides the easiest transition for Java-language


developers into the world of XML Web Services and dramatically improves the
interoperability of Java-language programs with existing software written in a
variety of other programming languages.

Active State has created Visual Perl and Visual Python, which enable .NET-
aware applications to be built in either Perl or Python. Both products can be
integrated into the Visual Studio .NET environment. Visual Perl includes support
for Active State’s Perl Dev Kit.

Other languages for which .NET compilers are available include

 FORTRAN
 COBOL
 Eiffel

Fig. 1: .NET Framework


C#.NET is also compliant with CLS (Common Language Specification) and
supports structured exception handling. CLS is set of rules and constructs that
are supported by the CLR (Common Language Runtime). CLR is the runtime
environment provided by the .NET Framework; it manages the execution of the
code and also makes the development process easier by providing services.

C#.NET is a CLS-compliant language. Any objects, classes, or components
that are created in C#.NET can be used in any other CLS-compliant language. In
addition, we can use objects, classes, and components created in other CLS-
compliant languages in C#.NET. The use of the CLS ensures complete
interoperability among applications, regardless of the languages used to create
the application.

CONSTRUCTORS AND DESTRUCTORS:

Constructors are used to initialize objects, whereas destructors are used to
destroy them. In other words, destructors are used to release the resources
allocated to the object. In C#.NET a Finalize procedure (destructor) is available.
The Finalize procedure is used to complete the tasks that must be performed
when an object is destroyed, and it is called automatically when an object is
destroyed. In addition, the Finalize procedure can be called only from the class
it belongs to or from derived classes.
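
The following C# sketch illustrates a constructor and a destructor (finalizer). The
FileLogger class and its members are hypothetical examples, not part of the project
code; in production code the IDisposable pattern is usually preferred over relying
on the finalizer alone.

class FileLogger
{
    private System.IO.StreamWriter writer;

    // Constructor: initializes the object and acquires the resource.
    public FileLogger(string path)
    {
        writer = new System.IO.StreamWriter(path);
    }

    public void Log(string message)
    {
        writer.WriteLine(message);
    }

    // Destructor (finalizer): compiled to a Finalize override and called
    // automatically by the garbage collector when the object is destroyed.
    ~FileLogger()
    {
        if (writer != null)
        {
            writer.Close();
        }
    }
}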

GARBAGE COLLECTION

Garbage Collection is another new feature in C#.NET. The .NET


Framework monitors allocated resources, such as objects and variables. In
addition, the .NET Framework automatically releases memory for reuse by
destroying objects that are no longer in use.
In C#.NET, the garbage collector checks for the objects that are not currently in
use by applications. When the garbage collector comes across an object that is
marked for garbage collection, it releases the memory occupied by the object.

OVERLOADING

Overloading is another feature in C#. Overloading enables us to define


multiple procedures with the same name, where each procedure has a different
set of arguments. Besides using overloading for procedures, we can use it for
constructors and properties in a class.
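
A small illustrative sketch of method overloading in C# follows; the Calculator class
and its Add methods are hypothetical examples.

class Calculator
{
    // Overloaded Add methods: same name, different sets of arguments.
    public int Add(int a, int b) { return a + b; }
    public double Add(double a, double b) { return a + b; }
    public int Add(int a, int b, int c) { return a + b + c; }
}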

MULTITHREADING:

C#.NET also supports multithreading. An application that supports
multithreading can handle multiple tasks simultaneously; we can use
multithreading to decrease the time taken by an application to respond to user
interaction.
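
A minimal C# sketch of multithreading follows; the worker loop is a hypothetical
example of offloading work to a background thread so that the main thread stays
responsive to the user.

using System;
using System.Threading;

class MultithreadingDemo
{
    static void Main()
    {
        // Start a worker thread so the main thread remains free for user interaction.
        Thread worker = new Thread(() =>
        {
            for (int i = 0; i < 3; i++)
            {
                Console.WriteLine("Worker step {0}", i);
                Thread.Sleep(100);
            }
        });
        worker.Start();

        Console.WriteLine("Main thread continues to handle user interaction...");
        worker.Join(); // wait for the worker to finish before exiting
    }
}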

STRUCTURED EXCEPTION HANDLING

C#.NET supports structured exception handling, which enables us to detect and
remove errors at runtime. In C#.NET, we use try…catch…finally
statements to create exception handlers. Using try…catch…finally statements,
we can create robust and effective exception handlers to improve the
performance of our application.
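
A short C# sketch of structured exception handling with try…catch…finally follows;
the out-of-range array access is a hypothetical example of an error detected at runtime.

using System;

class ExceptionHandlingDemo
{
    static void Main()
    {
        try
        {
            int[] numbers = { 1, 2, 3 };
            Console.WriteLine(numbers[5]); // throws IndexOutOfRangeException
        }
        catch (IndexOutOfRangeException ex)
        {
            // Handle the specific error detected at runtime.
            Console.WriteLine("Error: " + ex.Message);
        }
        finally
        {
            // Runs whether or not an exception was thrown.
            Console.WriteLine("Cleanup completed.");
        }
    }
}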

THE .NET FRAMEWORK

The .NET Framework is a new computing platform that simplifies


application development in the highly distributed environment of the Internet.
4.4. OBJECTIVES OF .NET FRAMEWORK

1. To provide a consistent object-oriented programming environment whether
object code is stored and executed locally, executed locally but distributed over
the Internet, or executed remotely.

2. To provide a code-execution environment that minimizes software deployment
and guarantees safe execution of code.

3. To eliminate performance problems.

4. To provide a consistent developer experience across different types of
applications, such as Windows-based applications and Web-based applications.

4.5. FEATURES OF SQL-SERVER

The OLAP Services feature available in SQL Server version 7.0 is now
called SQL Server 2000 Analysis Services. The term OLAP Services has been
replaced with the term Analysis Services. Analysis Services also includes a new
data mining component. The Repository component available in SQL Server
version 7.0 is now called Microsoft SQL Server 2000 Meta Data Services.
References to the component now use the term Meta Data Services. The term
repository is used only in reference to the repository engine within Meta Data
Services.

A SQL-SERVER database consists of the following types of objects:

1. TABLE

2. QUERY

3. FORM

4. REPORT
5. MACRO

TABLE:

A table is a collection of data about a specific topic.

VIEWS OF TABLE:

We can work with a table in two views:

1. Design View

2. Datasheet View

Design View

To build or modify the structure of a table, we work in the table
design view. We can specify what kind of data the table will hold.

Datasheet View

To add, edit or analyse the data itself, we work in the table's datasheet
view mode.

QUERY:

A query is a question that has to be asked of the data. Access gathers data that
answers the question from one or more tables. The data that makes up the answer is
either a dynaset (if you can edit it) or a snapshot (which cannot be edited). Each time
we run a query, we get the latest information in the dynaset. Access either displays
the dynaset or snapshot for us to view, or performs an action on it, such as deleting
or updating.
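
As a rough illustration of how such a query might be issued from the C# front end
against the SQL Server back end, the following sketch uses ADO.NET. The connection
string, table and column names are hypothetical and not taken from the project code.

using System;
using System.Data.SqlClient;

class QueryDemo
{
    static void Main()
    {
        // Hypothetical connection string and table name, for illustration only.
        string connectionString =
            "Data Source=.;Initial Catalog=CloudSearchDB;Integrated Security=True";

        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();

            // Parameterized query: asks the data a question and returns matching rows.
            string sql = "SELECT DocumentName, UploadDate FROM Documents WHERE OwnerName = @owner";
            using (SqlCommand command = new SqlCommand(sql, connection))
            {
                command.Parameters.AddWithValue("@owner", "alice");

                using (SqlDataReader reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        Console.WriteLine("{0} uploaded on {1}",
                            reader["DocumentName"], reader["UploadDate"]);
                    }
                }
            }
        }
    }
}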
5. SYSTEM DESIGN

System Design involves the identification of classes, their relationships, as well as
their collaboration. In Objectory, classes are divided into entity classes and control
classes. The Computer Aided Software Engineering (CASE) tools that are
available commercially do not provide any assistance in this transition. CASE tools
take advantage of meta modeling, which is helpful only after the construction of the
class diagram. In the FUSION method, some object-oriented approaches like the Object
Modeling Technique (OMT), Class-Responsibility-Collaborator (CRC) cards,
etc., are used. Objectory used the term "agents" to represent some of the hardware
and software systems. In the Fusion method, there is no requirement phase, where a
user will supply the initial requirement document. Any software project is worked
out by both the analyst and the designer. The analyst creates the use case diagram.
The designer creates the class diagram. But the designer can do this only after the
analyst creates the use case diagram. Once the design is over, it is essential to
decide which software is suitable for the application.


5.1 UML DIAGRAM OF THE PROJECT:

UML is a standard language for specifying, visualizing, and documenting
software systems, created by the Object Management Group (OMG) in 1997. The
three important types of UML modeling are the structural model, the behavioral
model, and the architectural model. To model a system, the most important aspect is to
capture the dynamic behavior, which involves internal or external factors driving
the interaction. These internal or external agents are known as actors. A use case
model consists of actors, use cases and their relationships. The figure below shows
the use case diagram for our project.
Use case Diagram:

A use case is a set of scenarios describing an interaction between a user
and a system. A use case diagram displays the relationship among actors and use
cases. Its two main components are actors and use cases; an actor is a user or another
system that will interact with the system being modeled. A use case is an external
view of the system that represents some action the user might perform in order to
complete a task.

Figure: Use case diagram with actors USER and cloud, and use cases: search
documents, stop word removal, stemming technique, MRSE scheme, checksum value,
priority (sorting), final document, DB.
5.2 ACTIVITY DIAGRAM:

Activity diagrams are typically used for business process modeling, for
modeling the logic captured by a single use case or usage scenario, or for modeling
the detailed logic of a business rule. Although UML activity diagrams could
potentially model the internal logic of a complex operation, it would be far better to
simply rewrite the operation so that it is simple enough that you don't require an
activity diagram. In many ways UML activity diagrams are the object-oriented
equivalent of flow charts and data flow diagrams (DFDs) from structured
development.

Figure: Activity diagram (search documents, stop word removal, stemming technique,
MRSE scheme, checksum value, priority pages).
5.3 SEQUENCE DIAGRAM:

The Sequence Diagram models the collaboration of objects based on a time
sequence. It shows how the objects interact with each other in a particular scenario of
a use case.

Figure: Sequence diagram between USER and CLOUD: the user searches documents;
the cloud checks the database, sends verified documents, performs stop word and
stemming word removal, verifies the database, sends the list of words, performs the
multi-keyword document search using the MRSE scheme, returns files in a ranked
order, updates the particular page using the checksum value, and returns the
priority-wise sorted pages.

5.4 COLLABORATION DIAGRAM:

A collaboration diagram describes interactions among objects in terms of


sequenced messages. Collaboration diagrams represent a combination of
information taken from class, sequence, and use case diagrams describing both the
static structure and dynamic behavior of a system.

Figure: Collaboration diagram between USER and CLOUD with messages: 1: search
documents, 2: check db(), 3: verified documents sent, 4: stop word removal,
5: stemming words removal, 6: db verified, 7: list of words sent, 8: multi-keyword
document search, 9: MRSE scheme, 10: return files in a ranked order, 11: updation of
particular page, 12: checksum value, 13: pages updated, 14: priority-wise pages sorted.
5.5 DATA FLOW DIAGRAM:

Figure: Data flow diagram (remove unwanted words).

5.6 ARCHITECTURE DIAGRAM:


5.7 ENTITY RELATIONSHIP DIAGRAM

Figure: Entity relationship diagram (priority).
5.8 CLASS DIAGRAM

Figure: Class diagram with three classes: cloud users (attributes: name, address, id;
operations: send a data(), set a timer(), generate a key()), cloud system (attributes:
name, id; operations: process the timer(), delete the document(), handling the
malware()), and receiver (attributes: name, id, address; operations: view the
document(), download the document()).
6. IMPLEMENTATION

6.1 Module description

1) USER AUTHENTICATION MODULE

 The user's information is stored in the database to check whether the user is
an authenticated or unauthenticated user.
 Only when the user is an authorized person is the document searched.
 The user's information is looked up by name and password in the database;
only when both the name and password match the database are documents
searched in the database (see the sketch below).
 If the user is an unauthorized person, documents are not searched and errors
are produced.

Figure: User authentication check.
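
A minimal C# sketch of the authentication check described above follows. The Users
table, its column names and the connection string are hypothetical, and a real system
would store and compare a password hash rather than the plain password.

using System;
using System.Data.SqlClient;

class AuthenticationDemo
{
    // Returns true when the supplied name and password match a row in the
    // (hypothetical) Users table; otherwise the user is treated as unauthorized.
    static bool IsAuthenticated(string connectionString, string name, string password)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();
            string sql = "SELECT COUNT(*) FROM Users WHERE UserName = @name AND Password = @password";
            using (SqlCommand command = new SqlCommand(sql, connection))
            {
                command.Parameters.AddWithValue("@name", name);
                command.Parameters.AddWithValue("@password", password);
                int matches = (int)command.ExecuteScalar();
                return matches > 0;
            }
        }
    }
}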

2) NAME SEARCH MODULE

 After the verification of the user's information in the database, the user is
allowed to search for a document using the document name only.
 Each and every word is sorted, and output is produced based on the name
search in the database.
 Content-wise search is also allowed in this proposed system.

Figure: Search by name.

3) STOP WORD REMOVAL MODULE

 Documents are searched in the database in the order of the document's
name and its content.
 The document's content is sorted by using the stop word removal
technique in the database.
 The stemming technique is used to list the words after the removal of
stemming words in the document (see the sketch below).
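
A minimal C# sketch of the stop word removal and stemming steps follows. The stop
word list and the crude suffix-stripping stemmer are simplified, hypothetical stand-ins
for whatever word lists and stemming technique the project actually uses.

using System;
using System.Collections.Generic;
using System.Linq;

class StopWordDemo
{
    // Hypothetical stop word list for illustration; the real system may use a larger list.
    static readonly HashSet<string> StopWords = new HashSet<string>(
        new[] { "a", "an", "the", "is", "of", "in", "and", "to" },
        StringComparer.OrdinalIgnoreCase);

    // Very simple suffix-stripping stand-in for a stemming technique.
    static string Stem(string word)
    {
        if (word.EndsWith("ing", StringComparison.OrdinalIgnoreCase) && word.Length > 5)
            return word.Substring(0, word.Length - 3);
        if (word.EndsWith("s", StringComparison.OrdinalIgnoreCase) && word.Length > 3)
            return word.Substring(0, word.Length - 1);
        return word;
    }

    static void Main()
    {
        string content = "The searching of the encrypted documents in a cloud";

        var keywords = content
            .Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries)
            .Where(w => !StopWords.Contains(w))        // stop word removal
            .Select(w => Stem(w.ToLowerInvariant()))   // stemming
            .Distinct()
            .ToList();

        // Prints: search, encrypted, document, cloud
        Console.WriteLine(string.Join(", ", keywords));
    }
}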
4) MULTI-KEYWORD SEARCH MODULE

 Users search for the document by using multiple keywords in the database.
 The removal of words in the database finally sorts out the multi-keywords for
the document.
 These multi-keywords are sorted on a priority basis.
 Multi-keyword ranked search over encrypted data (MRSE) is the technique
involved, returning files in a ranked keyword order according to certain
relevance criteria (e.g., keyword frequency).

Figure: Multi-keyword concept.

5) INDIVIDUAL PAGE UPDATION MODULE


 Once the user checks the documents, the contents of various pages are
obtained as output.
 When the user wants to update a particular page in a document, each page is
searched in the database, the particular page is sorted out first, and it is
modified by the user.
 The modification of the particular page is detected by using the checksum
technique in the database (see the sketch below).
 It checks the pages and gives priority to the content.

Figure: Updation of individual page.
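
A minimal C# sketch of the per-page checksum idea follows, using the MD5 hash
mentioned earlier; the page contents are hypothetical. Only a page whose checksum
differs from the stored value needs to be updated in the cloud copy.

using System;
using System.Security.Cryptography;
using System.Text;

class PageChecksumDemo
{
    // Computes an MD5 checksum of a page's text; if the stored checksum differs
    // from the new one, only that page needs to be updated.
    static string ComputeChecksum(string pageText)
    {
        using (MD5 md5 = MD5.Create())
        {
            byte[] hash = md5.ComputeHash(Encoding.UTF8.GetBytes(pageText));
            return BitConverter.ToString(hash).Replace("-", "");
        }
    }

    static void Main()
    {
        string storedChecksum = ComputeChecksum("Original page content");
        string newChecksum = ComputeChecksum("Modified page content");

        if (storedChecksum != newChecksum)
        {
            Console.WriteLine("Page has changed; update only this page in the cloud copy.");
        }
    }
}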
6) PRIORITY BASED SEARCH MODULE

 The user searches for the document using multi-keywords, and the outputs
produced are based on the pages with a large amount of content.
 Each and every content item on a page in the document is analyzed, and
priority is given to larger pages in the database.
 Pages are sorted priority-wise in this module, and cost is reduced in the
cloud server because of this technique (see the sketch below).

Figure: Priority-wise sorted pages.
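
As a rough illustration, assuming that priority here means the amount of content on a
page, the pages can be sorted priority-wise as in the following C# sketch; the page
contents are hypothetical.

using System;
using System.Collections.Generic;
using System.Linq;

class PrioritySortDemo
{
    static void Main()
    {
        // Hypothetical page contents; priority is taken as the amount of content per page.
        var pages = new Dictionary<int, string>
        {
            { 1, "short page" },
            { 2, "a much longer page with many more words and content" },
            { 3, "medium sized page content" }
        };

        // Sort pages priority-wise, i.e. by descending content length.
        var prioritised = pages.OrderByDescending(p => p.Value.Length);

        foreach (var page in prioritised)
            Console.WriteLine("Page {0} (length {1})", page.Key, page.Value.Length);
    }
}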

7) PERFORMANCE EVALUATION MODULE


 The document is searched by document name, and outputs are produced
based on the multi-keyword search in the document.
 Finally, documents are saved in the database using the public or private
methodology.
 When a document is saved as public, it is viewable by everyone.
 When it is saved as private, the output is visible to the authorized user
only.

Figure: Public/Private storage.
7. SYSTEM TESTING

7.1 TESTING OBJECTIVES


The purpose of testing is to discover errors. Testing is the process of trying
to discover every conceivable fault or weakness in a work product. It provides a
way to check the functionality of components, sub-assemblies, assemblies and/or a
finished product. It is the process of exercising software with the intent of ensuring
that the software system meets its requirements and user expectations and does not
fail in an unacceptable manner. There are various types of test, and each test type
addresses a specific testing requirement.

7.2 TYPES OF TESTS


7.2.1 Unit testing
Unit testing involves the design of test cases that validate that the internal
program logic is functioning properly, and that program inputs produce valid
outputs. All decision branches and internal code flow should be validated. It is the
testing of individual software units of the application. It is done after the
completion of an individual unit before integration. This is structural testing that
relies on knowledge of its construction and is invasive. Unit tests perform basic
tests at component level and test a specific business process, application, and/or
system configuration. Unit tests ensure that each unique path of a business process
performs accurately to the documented specifications and contains clearly defined
inputs and expected results.

7.2.2 Integration testing
Integration tests are designed to test integrated software components to
determine if they actually run as one program. Testing is event driven and is more
concerned with the basic outcome of screens or fields. Integration tests
demonstrate that although the components were individually satisfactory, as shown
by successful unit testing, the combination of components is correct and
consistent. Integration testing is specifically aimed at exposing the problems that
arise from the combination of components.

7.2.3 Functional test


Functional tests provide systematic demonstrations that functions tested are
available as specified by the business and technical requirements, system
documentation, and user manuals.

Functional testing is centred on the following items:

Valid Input : identified classes of valid input must be accepted.

Invalid Input : identified classes of invalid input must be rejected.

Functions : identified functions must be exercised.

Output : identified classes of application outputs must be exercised.

Systems/Procedures: interfacing systems or procedures must be invoked.

Organization and preparation of functional tests is focused on requirements, key
functions, or special test cases. In addition, systematic coverage pertaining to
identified business process flows, data fields, predefined processes, and successive
processes must be considered for testing. Before functional testing is complete,
additional tests are identified and the effective value of current tests is determined.


7.3 SYSTEM TEST


System testing ensures that the entire integrated software system meets
requirements. It tests a configuration to ensure known and predictable results. An
example of system testing is the configuration oriented system integration test.
System testing is based on process descriptions and flows, emphasizing pre-driven
process links and integration points.

White Box Testing


White Box Testing is testing in which the software tester has
knowledge of the inner workings, structure and language of the software, or at least
its purpose. It is used to test areas that cannot be reached from a black
box level.

Black Box Testing


Black Box Testing is testing the software without any knowledge of the inner
workings, structure or language of the module being tested. Black box tests, like
most other kinds of tests, must be written from a definitive source document, such
as a specification or requirements document. It is testing in which the software
under test is treated as a black box: you cannot "see" into it. The test provides
inputs and responds to outputs without considering how the software works.
Unit Testing:

Unit testing is usually conducted as part of a combined code and unit test
phase of the software lifecycle, although it is not uncommon for coding and unit
testing to be conducted as two distinct phases.

7.4 Test strategy and approach


Field testing will be performed manually and functional tests will be written
in detail.

Test objectives

 All field entries must work properly.


 Pages must be activated from the identified link.
 The entry screen, messages and responses must not be delayed.
Features to be tested

 Verify that the entries are of the correct format


 No duplicate entries should be allowed
 All links should take the user to the correct page.
Integration Testing
Software integration testing is the incremental integration testing of two or
more integrated software components on a single platform to produce failures
caused by interface defects.

The task of the integration test is to check that components or software


applications, e.g. components in a software system or – one step up – software
applications at the company level – interact without error.

Test Results: All the test cases mentioned above passed successfully. No defects
encountered.
Acceptance Testing
User Acceptance Testing is a critical phase of any project and requires
significant participation by the end user. It also ensures that the system meets the
functional requirements.

Test Results:

All the test cases mentioned above passed successfully. No defects


encountered.
8. CONCLUSION

8.1 SUMMARY:
The main goal of this project is to make storing data in the cloud more secure. This
section includes the various tests conducted on data stored in the cloud; these tests
are conducted on the basis of various parameters. To deal with loss and damage
during data transmission, we proposed one concept: a technique for transferring the
image or video from source to destination without any loss or leakage of data.

8.2 FUTURE ENHANCEMENT:

As a future enhancement, the user will get an alert from the cloud admin to approve
other users' requests. In the future, the system can also be used to store and view
files such as images, video and audio.

APPENDIX 1

Sample Code:

APPENDIX 2
Screen Shots:

LITERATURE REVIEW
REFERENCES
