www.chmag.in

June2013 | Page - 1

Oracle Hardening
Part-1

Introduction
Oracle and SQL databases are among the most widely used databases in enterprises. I will be taking you through Oracle hardening to make it hard for malicious users to break into the system. The focus will be on the parameters you need to consider, with an explanation of what each parameter does, why it should be changed, and how that can be done. This will be covered in multiple parts, as it is a huge topic.

Abstract
The following template will be used for each parameter:
WHAT: Explains what the parameter is used for and where it can be found.
WHY: The reason you should consider changing it.
VERSION: Versions of Oracle it is applicable for.
COMMAND: The command to help you make the changes (wherever applicable).
Thumb-rule: The information security clichés (wherever applicable).
Recommended settings: Table of recommended settings, mostly combined for multiple parameters of a similar type.

Solution
Firstly, a general but very important check:

WHAT: NEVER run Oracle as


super-user (i.e. root for UNIX and
Local System for Windows)
WHY: Obvious reasons, you don't
want your operating system to be
compromised through an already
exploited Oracle database.
Thumb-rule: Always, run your
applications with privileges less than
that of system.
VERSION: ALL
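On a UNIX system, one quick way to verify this is to scan the process table for database processes owned by root. A minimal sketch; the process names checked (ora_pmon, ora_smon, tnslsnr) are typical Oracle names but illustrative, so adjust them for your installation:

```python
# Sketch: flag Oracle-looking processes that run as the superuser.
# Process names below are illustrative; adjust for your install.

def superuser_processes(ps_lines, names=("ora_pmon", "ora_smon", "tnslsnr")):
    """Given lines of `ps -eo user,comm` output, return the
    command names that look like Oracle but are owned by root."""
    flagged = []
    for line in ps_lines:
        parts = line.split()
        if len(parts) < 2:
            continue
        user, comm = parts[0], parts[1]
        if user == "root" and any(comm.startswith(n) for n in names):
            flagged.append(comm)
    return flagged

sample = [
    "oracle ora_pmon_ORCL",
    "root   tnslsnr",
    "root   sshd",
]
print(superuser_processes(sample))  # flags ['tnslsnr']
```

In practice you would feed this the real output of `ps -eo user,comm`; an empty result is what you want to see.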

Before we get into the database parameters, let's look at some required OS-level file permissions.
Oracle uses several file systems to place a variety of files. Most of the Oracle software files are stored in the Oracle home, labeled ORACLE_HOME. Oracle recommends that files be placed in a layout called Optimal Flexible Architecture (OFA), but it increases the security risk.

WHAT: Execute and write permissions for others on the $ORACLE_HOME directory to be revoked.
WHY: They give everyone unauthorized access to the critical files of the installation.
Thumb-rule: Revoke execute permissions on executables from end users.
VERSION: ALL
Recommended settings:

File/Directory/Parameter | Permission/Value
$ORACLE_HOME directory | rwxr-x--- and/or rwxr-xr-x
$ORACLE_HOME directory (most files) | rwxr-x--- or rwxr-xr-x
$ORACLE_HOME/bin/* | rwxr-x--- or rwxr-xr-x; not world writable (no write for everyone)

Check permissions on the files again.

Next are some more important directories. Audit and log directories may contain important information about the system. Some errors in Oracle lead to the generation of trace files; we can also generate them forcefully after enabling the SQL_TRACE parameter. In general, all trace files have read and write permission for the Oracle software owner, and the group of the Oracle installation has read-only permission.

WHAT: Access to important directories like the audit, log or network/trace directories to be restricted from others.
WHY: Unauthorized access to log information and disclosure of information, as any user can access these directories.
VERSION: ALL
Recommended settings:

File/Directory/Parameter | Permission/Value
$ORACLE_HOME/rdbms/audit directory | No access to others (everyone)
$ORACLE_HOME/rdbms/log directory | No access to others (non-oracle users)
$ORACLE_HOME/network/trace directory | No access to others (everyone)

WHAT: Files/executables owned by root should not have setUID and/or setGID permissions.
WHY: A process that runs such a file is granted access based on the owner of the file (usually root), rather than the user who is running it, leading to unauthorized changes to the system.
VERSION: ALL
Recommended settings:

File/Directory/Parameter | Permission/Value
System files/executables (especially owned by root) | Set-UID and Set-GID permissions should not be set.

And again:

WHAT: Files owned by the Oracle Inventory Group (oinstall) to be assigned permissions as per requirement.
WHY: The oinstall group owns the Oracle inventory, which is a catalog of all Oracle software installed on the system. Inappropriate permissions allow unauthorized changes to the system.
VERSION: ALL
Recommended settings:

File/Directory/Parameter | Permission/Value
Files/executables owned by oracle:oinstall | Set-UID and Set-GID permissions should not be set.

One more OS permission setting and we are done:

WHAT: Scheduled scripts should not be accessible by group or others.
WHY: Scheduled scripts are meant to be executed by the owner at regular intervals; access for group or others may lead to unauthorized scripts being executed on the system.
VERSION: ALL (UNIX only)
Recommended settings:

File/Directory/Parameter | Permission/Value
Scheduled scripts | Not readable to group or others
Scheduled scripts of oracle users | Owned by oracle & root

And again and again... these permissions are not going to let you go ;)

WHAT: Permissions on data files, control files and redo log files to be restricted to owners.
WHY:
1. Data files contain all the database data. If data files are accessible to the public, they can be read by any user.
2. Control files are binary configuration files that control access to data files. Public write access to these files may pose a serious threat.
3. Redo log files should be of appropriate size and accessible only to owners. Small redo logs cause system checkpoints to continuously put a high load on the buffer cache and I/O system.
VERSION: ALL
Recommended settings:

File/Directory/Parameter | Permission/Value
Data files (owner: oracle:oinstall) | -rw-------
Control files (owner: oracle:oinstall) | -rw-r-----
Redo log files (owner: oracle:oinstall) | -rw-------
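Checks of this kind can be automated with a short script. Below is a minimal sketch using only the Python standard library; it demonstrates the idea on a throwaway temporary file rather than a real $ORACLE_HOME, so the paths are illustrative:

```python
import os
import stat
import tempfile

def world_access(path):
    """Return the rwx permission bits granted to 'others' on path."""
    mode = os.stat(path).st_mode
    return stat.S_IMODE(mode) & 0o007

def is_setuid_or_setgid(path):
    """True if the file carries the Set-UID or Set-GID bit."""
    mode = os.stat(path).st_mode
    return bool(mode & (stat.S_ISUID | stat.S_ISGID))

# Demonstrate on a throwaway file standing in for a data file.
fd, datafile = tempfile.mkstemp()
os.close(fd)
os.chmod(datafile, 0o600)           # -rw-------: owner only
assert world_access(datafile) == 0  # no access for others
assert not is_setuid_or_setgid(datafile)
os.remove(datafile)
```

Pointing `world_access` and `is_setuid_or_setgid` at the directories and files from the tables above (on a UNIX system) gives a quick pass/fail audit.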

Finally, we are all set to configure some important database parameters. This post is getting bigger, so let's take a logical break here. In the next post, we will focus on general database parameters.

Ajinkya Patil
http://avsecurity.in
Ajinkya is an Information Security professional with experience in conducting web application security, IT governance reviews, network security, and database and OS security reviews of approximately 500 servers.
He holds a CISA (Associate of ISACA) certification and an Information Security Management certification, and has a Bachelor's degree in Information Technology from Mumbai University.
He is also listed in the Hall of Fame of BlackBerry (RIM).

Sarbanes Oxley Act


Part 2
Introduction
In an effort to focus on core competencies, reduce costs and increase efficiency, organizations today are increasingly outsourcing business processes, data transactions, IT and network systems, and other support services. Further, there has been an ever-growing emphasis on governance, risk and compliance, with the result that user organizations are seeking assurance on the effectiveness of internal controls at the service organizations. Traditionally, organizations relied on Statement on Auditing Standards No. 70 (SAS 70) reports for assurance on the internal controls of a service organization. However, the SAS 70 report primarily focuses on financial reporting controls and not on other areas such as security and system availability.
In June 2011, the SAS 70 report was replaced by the Service Organization Control (SOC) reports: SOC 1, SOC 2 and SOC 3. The SOC 2/SOC 3 reports are prepared in accordance with Statement on Standards for Attestation Engagements No. 16 (SSAE 16) AT Section 101 and are based upon the Trust Services Principles (TSP), as opposed to SOC 1 reports, which are prepared in accordance with SSAE 16 focusing on the internal controls over financial reporting (ICFR).

SOC Report Principles
SOC 1 is fundamentally similar to a SAS 70 report. It reports on controls at a service organization relevant to user organizations' internal control over financial reporting (ICFR). Service organizations are required to provide a description of their systems and define the controls relevant to the user organizations' financial reporting. Though the report may cover some IT general controls, there is no specific focus on security, privacy or availability in a SOC 1 report.
SOC 2 and SOC 3 reports are based on the Trust Services Principles developed by the American Institute of Certified Public Accountants (AICPA) and the Canadian Institute of Chartered Accountants (CICA).

The Trust Services Principles are built around four areas (Policies, Communications, Procedures and Monitoring) and address controls relevant to Security, Availability, Processing Integrity, Confidentiality or Privacy. A service provider is not required to report on all the above principles and can limit the review only to those that are relevant to the outsourced service being performed.

SOC Report Types
The SOC reports can be Type I or Type II. A Type I report typically contains descriptions of the service organization's systems and a point-in-time ("as on date") assessment of the design effectiveness of the controls.
A Type II report contains descriptions of the service organization's systems and a period-of-time assessment of the design and operating effectiveness of the controls. The period of time is usually 12 months; however, it may also cover a shorter period, such as 6 months.

SOC Report Structure
The SOC 1 and SOC 2 reporting structure is similar in most parts to a SAS 70 report. A SOC 3 report is less detailed in terms of the testing performed by the auditor. One of the key differences between SOC 2 and SOC 3 reports is that the SOC 2 report provides a detailed description of the tests of controls, the testing results, and the auditor's opinion of the service organization's system description, whereas the SOC 3 report is a short report on whether the trust principles are met. It does not have details of the auditor's test procedures, results or opinion.
SOC 1/SOC 2 report sections:
Auditor's Opinion
Management Assertion
System Description
Control Objectives and Activities (SOC 1)/Trust Service Principles and Criteria (SOC 2)
Test Procedures
Testing Results

Applicability of SOC reports
It is important for user and service organizations to understand which report to obtain, based on the type of services being outsourced and regulatory or user entity requirements. Where the services are clearly financial in nature, such as processing payroll, healthcare and transactions, SOC 1 reports would be requested. Where the services are more technical in nature and the focus is on addressing security, availability or privacy, such as cloud service providers, data center colocations etc., SOC 2 reports are more applicable. However, it is important to note that a cloud-based ERP service may need to provide a SOC 1 report because it provides financial services, as well as SOC 2/SOC 3 reports to address key cloud service aspects such as security and availability. SOC 2 and SOC 3 reports are based on the same fundamental criteria. However, a SOC 3 report is a less detailed version of the SOC 2 report that can be made publicly available to anyone. It is most often used as marketing material.

Comparison: SOC 1, SOC 2, SOC 3
The table below provides a brief comparison of the three SOC reports.


Neelima is an Information Security professional with more than 7 years of experience. She has been focused on Information Security & Technology Risk, Business Continuity Management, server and network security reviews, wireless security reviews, vulnerability assessments, and compliance assessments such as SOX and SAS 70 (SSAE 16, ISAE 3402). She enjoys travelling and photography.

Neelima Rao
CISSP, CISA, CCNA
neelima.rao.g@gmail.com

Big Brother is Watching You: NSA PRISM Surveillance Program
The NSA (USA) is collecting data across the world, and according to the USA there is nothing wrong with that. In the process of spying on non-USA citizens, they also spy on USA citizens. It is really difficult to differentiate between citizens and non-citizens when you are spying at massive scale on all types of communications.
Goal of the NSA: to have a profile of each person on earth who is connected to a network via any device. It contains information about each person, such as:

Name, home address and office address.
How many bank accounts he owns and how much money he has in each of them.
What kind of job he does: researcher? Developer? Politician? etc.
His family and his relationships with other people.
His photo and his family photos.
What does he speak about more often? What is the content of his talks? Whom does he speak to more often?
Where does he travel?

This list is just to name a few. Each piece of information is organized per person primarily; later they can establish the relationships among people across the world.
Let's understand this interesting story...
Let's understand this interesting story...

How will they spy through the internet on a large scale?
Ask all local companies to provide almost all the data they have stored, in plain text. Using national-security-related acts, they can legally get data from all companies that are based in the USA. For e.g., if the USA asks Google and Facebook for all data about non-citizens, why would they not give it?
A privacy law which exists in the USA has nothing to do with anyone outside the USA. In this way they gather huge amounts of data in plain text, including pictures, behaviour of users, etc. Facebook in its IPO declared that it can share data for business reasons. The USA government pays money for all this personal information. If you don't like to use Facebook, don't use it. If you don't like to use Google, don't use it.
Google and Facebook are not forcing any country to use them; citizens from different countries are going and hooking themselves into these companies' products, and that is not the problem of Google, Facebook or the other large companies from the USA.
Legally, the USA can always ask for any data about USA citizens using a court order; this job is very easy. (Collect massive data here.)
Cloud infrastructure. Billions of dollars have been invested by USA companies and their investors. Moving to the cloud has business advantages for many organizations. Every big corporate company has invested billions of dollars in setting up infrastructure at cheap rates. This is going to go up further, and many organizations will use cloud infrastructure in the coming years. If you follow discussions related to the cloud, the only point stopping adoption to some extent is security; however, the companies investing billions are able to convince other companies with different clouds (private, public, hybrid, blah blah blah) and make sure organizations adopt them.
This is the USA saying, in effect: "Why do you keep data inside your own country and organizations? Keep it in our country, on some USA companies' servers. You are safe." The USA is succeeding in doing that, and we can watch its success continue in the future too. Again, collect data from here, same as in the previous point.
Intercept data at the ISP level. Collect all plain-text data from all ISPs and store it in a data centre for further analysis. If there is a specific target, they can collect the encrypted data from all ISPs for later decryption. They have been working on breaking encryption for a long time, and they have been building massive hardware using ASICs for very long; they will use every possibility available to break encryption and will improve further. (Collect massive data here.)
NSA (USA) spying divisions in friendly countries. The USA can help friendly (weak) countries with its capabilities and in return get all the ISP data in the respective countries; they can easily set up these centres, which helps both countries. In this way they collect both plain text and specific (or all) encrypted traffic. (Collect massive data here.)
Using 3rd parties to collect data. This is the very interesting part. Private companies do work to provide information to the NSA; these private companies can do whatever they like, and as long as they give information the NSA is happy. The 3rd party can be companies or individuals with whom the NSA might have one-time contacts, for e.g. contacts with botnet masters. This is a powerful method, since the many botnets across the world can collect specific data of all types from each PC. These bot masters do not even need to know whom they are selling to; as long as they make money, they are happy. No bot master will have a direct connection with the NSA, only an indirect one through other channels. If some company legally tries to shut down a botnet, the NSA does not care about the bot masters: they know they can get another botnet, and somebody else will always build one more; botnets always exist in the underground, and that business is again a multi-billion-dollar business. This is a very good option for the NSA to collect specific data from a specific PC/mobile/tablet. All plain-text data.

Data breaches. No matter where in the world a data breach happens, the NSA can again get the data via a 3rd party and build a profile about a specific person or organization. All plain-text data.
Individual blackhat hackers. If somebody gets important data from any critical infrastructure, they might want to sell it to other countries; the NSA can buy all this data, again through 3rd-party channels. All plain-text data. (Collect massive data here.)
There are also other methods to gather data by different means; I have collected a few here.

What are the various types of data collection that can happen at massive scale?

Confidential documents of organizations across the globe.
All types of user credentials from every PC, mobile, tablet or other device that will come into existence tomorrow.
Documents like PDF, DOC, XLS, PPT, source code, personal documents, images of your family, etc.
All finance-related documents: credentials to log in to your bank, credit card information, trading accounts, where you stay, your digital certificates, your organization's digital certificates, etc.

What does the NSA do with this massive data? Do you expect them to use a local Google search engine or a Google appliance, do the search using keywords, and expect humans to navigate the information manually? Nope; that would be a waste of time and of all the money they have invested in collecting information.
The story goes further. Consider that you are searching the internet using a keyword and some very important information exists on the 100th page of results; will you go there to look for it? Nope. But for the NSA, all critical information is important, no matter where it exists.
The NSA has a couple of problems to solve; the same problems do not yet exist for the public or even for the internet. A few are listed here:

Searching for meaningful information in huge amounts of data. USA investors, and everyone else on this earth, are speaking about Big Data, and there has been massive investment in it. The more the technology improves, the more the NSA benefits when searching for information in its massive data. Everyone is investing in big data like mad.
Search needs to improve beyond keywords. Many efforts are on the way, including context-based search within information plus big data analytics. Whatever further improvements happen in these areas, the same technology is going to be used within the NSA as the basis to solve its own specific problems.
Image-based search needs a lot of improvement. A lot of research is happening in this area, and some of it is available to the public too. Do you remember Google image search? Yes, you remember it.
Searching for keywords within different formats of files like PNG, JPEG, DOC, XLS, PDF, etc. is not easy in an integrated way. I am pretty sure they have invested enough to progress in this area as well. Integrated search across all types of formats is most important before they apply other technologies like analytics.
Have you heard of the "global intelligence" offered by all the software-based security companies these days? Yes, you have heard about it. The more these private companies improve this domain, the more the same technology can be used by the NSA as a basis to develop further within.
Hardware speed: there is only one target here, i.e. breaking AES and other encryption technologies. ASICs play a larger role here, and I am pretty sure they have improved in this area a lot.

Once you have collected all this data by some of these methods, you can use it to prevent some problems for the country in terms of national security; at the same time, privacy is also going to be a problem for the good citizens of that same country.
If the USA can put strong controls in place against misusing the information gathered on the good citizens of its country, they will win the game. According to the USA and its citizens, collecting information globally is perfectly acceptable. The PRISM program will continue; I expect no change in that. It will only become more secret, and more controls will be placed on it.
If I look at it from my country's point of view, I will not accept the USA collecting information from my country.
Spying on other countries is an ancient art (well documented by Chanakya in the Arthashastra more than 2000 years ago); spying using technology is just an extension of the existing methods of spying. Every country knows about it, and there is nothing to feel bad about, apart from getting our own house in order based on India's national interests.
I don't mind if India spies on all the countries in the world using a program similar to PRISM; I would be happier for it. But stronger controls need to be put on the people who have access to the collected data, against misuse within our country for political reasons or for harassing citizens due to personal rivalry.

Yash K. S.
yashks@gmail.com

Android Framework
for Exploitation
Android is a mobile operating system platform developed by Andy Rubin, Rich Miner, Nick Sears and Chris White, which was later acquired by Google Inc. and is right now developed and maintained by Google itself. Android covers more than 50% of the smartphone market share, much more than iOS and other mobile platforms such as BlackBerry, Windows and Symbian.

Due to the popularity of Android among both the consumer and developer base, it is also the platform most targeted by malware authors. That is the reason we see a lot of Android malware applications on 3rd-party app stores and even, sometimes, on Google Play.
The question arises: how is malware for an Android device created? To show how easy it is to create malware for the Android platform, we will be using a well-known framework called Android Framework for Exploitation, also known as AFE, which you can use to create automated Android malware, botnets, application vulnerability exploits and so on.
Android Framework for Exploitation can be downloaded from http://github.com/xysec/; download both AFE and AFE-server.
AFE is a Python-based tool which runs on Unix-based OSes, and AFE-server is an application which runs on the phone and connects to the Python interface. So, we launch AFE, and the interface looks something like this.
To do this, after downloading, you need to type in
./afe.py
and go to the menu, and then to modules. You can list the modules by typing in "list" and get a list of all the currently available modules.

To use a particular module in AFE, you need to type in "run module" to start it:
Afe/menu/modules$ run malware
Once you have started the malware module, you need to set the reverse IP and the items you would like to steal. Once you have set up the options, proceed to the Build It option and generate an APK (Android Package, the Android application file), which you can use on any Android device. The IP address you specified will receive all the call logs, contacts and messages from the victim's phone.

It is really surprising that when this framework was released, none of the antivirus products detected it. You could also go ahead and build proof-of-concept botnets, which are operated entirely via SMS.
In this article, we will be focusing more on the Leaking Content Provider vulnerability in Android applications.
A content provider in Android allows applications to exchange and share data with other applications, and also provides them with a better interface to query the data and perform various functions on it.
In Android, when you define a content provider, its exposure is set as public by default. So, if the application developer has not put special permissions in place to protect the content provider, any other application can access that vulnerable application's data. Let's take an example: say you are using a banking application, and the user is allowed to store his credit card information and other sensitive information in the banking application for easier and faster transactions. But in the banking application, the application developer has not properly defined the permissions for the content provider. So what an attacker could do here is create his own malicious application (which won't need any permissions upon installation) and use it to steal the data stored in the banking application's storage. The attacker's application now has unrestricted access to all the sensitive information in the banking application, and the user's security and privacy are put at risk.
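Outside of AFE, the same exposure can be probed with Android's built-in `content` command-line tool over adb (present on recent Android versions). A minimal sketch; the URI below is illustrative, and `adb` must be on your PATH for the final call to actually work:

```python
import subprocess  # used only if you actually run the command

def content_query_cmd(uri):
    """Build the adb command that queries a content provider URI
    via Android's built-in `content` shell tool."""
    return ["adb", "shell", "content", "query", "--uri", uri]

cmd = content_query_cmd(
    "content://com.threebanana.notes.provider.NotePad/notes")
print(" ".join(cmd))
# To actually run it against a connected device/emulator:
# subprocess.run(cmd, check=True)
```

If the provider is exported without permissions, the query dumps its rows; a "Permission Denial" error means the provider is protected.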
For the present article, we will take the example of an Android application named Catch Notes:
https://play.google.com/store/apps/details?id=com.threebanana.notes&hl=en
The application has around 5,000,000 - 10,000,000 downloads so far and is one of the top 100 applications on Android. Catch Notes allows the user to store his private and personal notes on his phone.
So, the first thing we need to do is go to the AFE menu, and then to modules. From the modules section, we will select the option get_content_provider.
Afe/menu/modules$ list
apk_inject
dbstealer
decompile_apk
get_content_provider
malware
rageagainstthecage
Afe/menu/modules$ run get_content_provider

After selecting get_content_provider, it will show a list of all the Android APKs present in the AFE Input folder. It will also ask the user to enter the name of the APK he wishes to decompile.
Here we select catch.apk, which is nothing but the APK file of the Catch application. After entering the APK name, it will automatically decompile it and parse out all the content providers for us.
Enter the name of the apk you want to check the content query: catch.apk
Here we analyze all the content providers, and one content provider looks interesting to us:-
At this point, we just select the content provider we are interested in, and go back to the menu.
Now, we will use the query functionality in AFE to query this content provider. For this functionality, we need the phone and the Python interface to be connected to each other. Therefore, we will be using the AFE-Server app, available at afe-framework.com/afeserver.apk. The application screen looks something like this.
Here, we select a port, for example 8899, and in the Python interface we forward the port using adb forward:
Afe$ !adb forward tcp:8899 tcp:8899
Thereafter, we need to connect using the connect command, specifying the IP address and port number:
Afe$ connect 127.0.0.1 -p 8899

Now, it will show us a "connected" message and an asterisk sign before the afe prompt. This means we can now directly interact with the phone and its applications.
Coming back to the content provider vulnerability we were discussing, we will be using the query command. To get more info about this command and the various options on how to use it, type in ?query.
Make sure you have the Catch application installed on your device/emulator and that you have created a note. My application screen looks like this:-
So, to query that particular content provider and find out whether it is vulnerable or not, we type in:
*Afe/menu$ query "get --url content://com.threebanana.notes.provider.NotePad/notes"
You will get a screen similar to the one shown below.

This is the entire information stored in the content providers of the Catch Notes application, and it can be accessed by any other application. In the middle, if you look carefully, you will see the text field corresponding to the value "Hello Clubhack - from Aditya".
That is how you find and exploit content provider vulnerabilities. But what if you want to share it with someone, or publish your exploit online? For this, you can use the sploit module present in AFE. Using this module, you can automate the entire process of typing in the content provider name and extracting information from the application on the device.
If you go to the exploits folder in the AFE directory structure, you will notice a file named exp-appname.xml. This is a sample exploit for a Local File Inclusion/Directory Traversal vulnerability in the Adobe Reader Android application. So, keeping that as a base, we will build our own exploit for Catch Notes.

In this file, we change a few things: the <name> tag becomes
<name>com.threebanana.notes</name>
and the <query> tag will contain the query which needs to be executed:
<query>
"get --url content://com.threebanana.notes.provider.NotePad/notes"
</query>
So, my final exploit looks like this:-
Now, we should save this exploit as catchexploit.xml at the same location as exp-appname.xml, and go back to the Python interface. There, we type in exploit to go to the exploit menu.
*Afe/menu$ exploit
*Afe/menu/exploit$
Type in a ? to see all the available options.
Now, if you type in the command sploit [exploitname].xml, it will automate the entire exploitation process for you.
So, if you wish, you can now share it with the community, or publish it in the Github repo of AFE, and so on.

Conclusion
In this article, we saw how to find and exploit a basic content provider vulnerability and even write an exploit for it. A lot of famous applications are vulnerable to this particular vulnerability; the only drawback is that very few people are looking into Android application vulnerabilities right now. So, it's a great time to start looking into app vulnerabilities and write exploits for them. To prevent these kinds of vulnerabilities, all the application developer needs to do is set the permissions for the content provider in AndroidManifest.xml, and also set the android:exported value to false. That's all for this article, guys. Hope you enjoyed it.
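For instance, a provider declared along the following lines in AndroidManifest.xml is no longer reachable by other applications. This is a sketch: the class name and authority are illustrative, not from the Catch Notes app:

```xml
<!-- Illustrative: keep the provider private to the app itself -->
<provider
    android:name=".provider.NotePad"
    android:authorities="com.example.notes.provider"
    android:exported="false" />
```

If the provider genuinely must be shared, defining a custom android:permission on it is the alternative to exporting it wide open.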


Aditya Gupta
adityagupta1991@gmail.com
Aditya Gupta is the co-founder of XY Securities, an information security firm based in India. His main expertise includes exploiting web applications, evading firewalls, breaking mobile security, and exploit research. Aditya has been a frequent speaker at many conferences, including ClubHack, Nullcon, BlackHat and ToorCon.

Network Security
Basics Part-1

Part 1 - ISO/OSI Model and TCP/IP Model

Introduction
With this article series we will go through network security as a whole, from basic to expert level. It will help you get a better idea of network security. It is just a reference for people who are interested in network security but don't know where to start; there is more to do on your own.
Always remember that "Defense in Depth" is the key to NETWORK SECURITY.

In this network basics article series, we will have a brief overview of the following topics:
1. Basics of the ISO/OSI and TCP/IP models.
2. Getting to know the Information Security OSI Model better.

Let's begin with the 1st part, to understand the basics of the ISO/OSI and TCP/IP models.
What are reasons for layered model?
Change:When changes are made to
one layer, the impact on the other
layers is minimized. If the model
consists
of
a
single,
allencompassing layer, any change
affects the entire model.

Design: A layered model defines each layer separately. As long as the interconnections between layers remain constant, protocol designers can specialize in one area (layer) without worrying about how any new implementations affect other layers.
Learning: The layered approach reduces a very complex set of topics, activities, and actions into several smaller, interrelated groupings. This makes learning and understanding the actions of each layer, and the model generally, much easier.

Troubleshooting: The protocols, actions, and data contained in each layer of the model relate only to the purpose of that layer. This enables troubleshooting efforts to be pinpointed on the layer that carries out the suspected cause of the problem.

Standards: Probably the most important reason for using a layered model is that it establishes a prescribed guideline for interoperability between the various vendors developing products that perform different data communications tasks. Remember, though, that layered models, including the OSI model, provide only a guideline and framework, not a rigid standard, for manufacturers to use when creating their products.

TCP/IP Network Model
Transmission Control Protocol/Internet Protocol (TCP/IP) is a suite of protocols that governs the way data travels from one device to another. Each layer performs a specific function and is transparent to the layer above it and the layer below it.
The TCP/IP network model consists of four layers:
1. Application Layer
2. Transport Layer
3. Internet Layer
4. Network Access Layer
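To make the four layers concrete, here is a minimal sketch (my own illustration, not from the article) of a loopback TCP exchange in Python, with comments noting which TCP/IP layer is involved at each step. The Network Access Layer is handled entirely by the operating system and the loopback interface.

```python
import socket
import threading

# Transport Layer: a TCP (SOCK_STREAM) socket over Internet Layer (IPv4).
# Port 0 lets the OS pick any free port on the loopback interface.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))       # Internet Layer: IP address + port
server.listen(1)
host, port = server.getsockname()

def serve():
    conn, _ = server.accept()       # Transport Layer: accept the connection
    data = conn.recv(1024)          # Transport Layer: receive the segment
    conn.sendall(b"pong:" + data)   # Application Layer: our tiny "protocol"
    conn.close()

t = threading.Thread(target=serve)
t.start()

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect((host, port))        # Transport Layer: TCP three-way handshake
client.sendall(b"ping")             # Application Layer: request payload
reply = client.recv(1024)
client.close()
t.join()
server.close()

print(reply)  # b'pong:ping'
```

Everything below the socket API (framing, the physical medium) belongs to the Network Access Layer and is invisible to the application.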

Application Layer: The Application Layer provides the user with the interface to communication. This could be your web browser, e-mail client (Outlook, Eudora or Thunderbird), or a file transfer client. The Application Layer is where your web browser, telnet, ftp, e-mail or other client application runs; basically, any application that rides on top of TCP and/or UDP and uses a pair of virtual network sockets and a pair of IP addresses. The Application Layer sends data to, and receives data from, the Transport Layer.

Transport Layer: The Transport Layer provides the means for the transport of data segments across the Internet Layer, and is concerned with end-to-end (host-to-host) communication. Transmission Control Protocol provides reliable, connection-oriented transport of data between two endpoints (sockets) on two computers that use Internet Protocol to communicate. User Datagram Protocol provides unreliable, connectionless transport of data between two endpoints (sockets) on two computers that use Internet Protocol to communicate. The Transport Layer sends data to the Internet Layer when transmitting and sends data to the Application Layer when receiving.

Internet Layer: The Internet Layer provides connectionless communication across one or more networks, a global logical addressing scheme, and packetization of data. (Note: packetization is the act or process of bundling data into packets according to a specific protocol.) The Internet Layer is concerned with network-to-network communication and is responsible for packetization, addressing and routing of data on the network. Internet Protocol provides the packetization, logical addressing and routing functions that forward packets from one computer to another. The Internet Layer communicates with the Transport Layer when receiving and sends data to the Network Access Layer when transmitting.

Network Access Layer: The Network Access Layer provides access to the physical network. This is your network interface card. Ethernet, FDDI, Token Ring, ATM, OC, HSSI, and even Wi-Fi are all examples of network interfaces. The purpose of a network interface is to allow your computer to access the wire, wireless or fiber-optic network infrastructure and send data to other computers. The Network Access Layer transmits data on the physical network when sending and transmits data to the Internet Layer when receiving.

1.1 ISO/OSI Network Model

Layer 7 - Application Layer

The Application layer provides services to the software through which the user requests network services. Your computer application software is not itself on the Application layer; this layer isn't about applications and doesn't contain any applications. In other words, programs such as Microsoft Word or Corel are not at this layer, but browsers, FTP clients, and mail clients are.

A few of the most popular Application layer protocols are:
File Transfer Protocol (FTP): A protocol that enables a client to send and receive complete files from a server.
Hypertext Transfer Protocol (HTTP): The core protocol of the World Wide Web.
Telnet: The protocol that lets you connect to another computer on the Internet in a terminal emulation mode.
Simple Mail Transfer Protocol (SMTP): One of several key protocols that are used to provide e-mail services.
Domain Name System (DNS): The protocol that allows you to refer to other host computers by using names rather than numbers.
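The DNS role described above, turning names into addresses, is what applications use every time they open a connection. A small sketch (my own, not from the article) using Python's standard resolver API; resolving "localhost" avoids depending on an external DNS server:

```python
import socket

# Ask the resolver for the addresses behind a host name.
# Each result entry is (family, type, proto, canonname, sockaddr).
results = socket.getaddrinfo("localhost", 80, type=socket.SOCK_STREAM)

# Extract the distinct IP address strings from the sockaddr tuples.
addresses = sorted({info[4][0] for info in results})
print(addresses)  # typically includes '127.0.0.1' and/or '::1'
```

A browser does exactly this step before it can even begin an HTTP conversation.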
Layer 6 - Presentation Layer
This layer is concerned with data representation and code formatting. It:
Masks the differences of data formats between dissimilar systems.
Specifies an architecture-independent data transfer format.
Encodes and decodes data; encrypts and decrypts data; compresses and decompresses data.
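The Presentation layer's jobs listed above, encoding, compression (and, in real stacks, encryption via protocols such as TLS), can be sketched with standard library functions. This is an illustration of the concepts, not a claim about how any particular stack implements them:

```python
import base64
import zlib

text = "data representation " * 20          # repetitive, so it compresses well
raw = text.encode("utf-8")                  # encode: characters -> bytes
compressed = zlib.compress(raw)             # compress the byte stream
wire = base64.b64encode(compressed)         # format for a text-safe channel

# The receiving side reverses each presentation step in order.
restored = zlib.decompress(base64.b64decode(wire)).decode("utf-8")

print(restored == text, len(compressed) < len(raw))
```

The sender and receiver must agree on these transformations, which is exactly the "architecture-independent transfer format" idea.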
Layer 5 - Session Layer
The Session layer establishes, maintains, and manages the communication session between computers. It:
Manages user sessions and dialogues.
Controls establishment and termination of logical links between users.
Reports upper-layer errors.

Layer 4 - Transport Layer
Manages end-to-end message delivery in a network.
Provides reliable and sequential packet delivery through error recovery and flow control mechanisms.
Provides connectionless packet delivery.
Two core protocols are found in this layer:
Transmission Control Protocol (TCP): Provides reliable, connection-oriented transmission between two hosts. TCP establishes a session between hosts, and then ensures delivery of packets between the hosts.
User Datagram Protocol (UDP): Provides connectionless, unreliable, one-to-one or one-to-many delivery.
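The TCP/UDP distinction maps directly onto the socket API: TCP uses SOCK_STREAM, UDP uses SOCK_DGRAM. A minimal UDP exchange over loopback (my own sketch, not from the article) shows the connectionless style: no handshake and no delivery guarantee, although a loopback datagram is not lost in practice:

```python
import socket

# UDP: connectionless datagram sockets on the loopback interface.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))      # port 0: let the OS pick a free port
addr = receiver.getsockname()

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"hello", addr)        # no connect(), no handshake

data, peer = receiver.recvfrom(1024)
sender.close()
receiver.close()

print(data)  # b'hello'
```

With TCP you would instead call connect() and accept() first, which is the three-way handshake establishing the session the text describes.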
Layer 3 - Network Layer
Determines how data is transferred between network devices.
Routes packets according to unique network device addresses.
Provides flow and congestion control to prevent depletion of network resources.
The Network layer is where data is addressed, packaged, and routed among networks.

Several important Internet protocols operate at the Network layer, such as:
Internet Protocol (IP): A routable protocol that uses IP addresses to deliver packets to network devices. IP is an intentionally unreliable protocol, so it doesn't guarantee delivery of information.
Address Resolution Protocol (ARP): Resolves IP addresses to hardware MAC addresses, which uniquely identify hardware devices.
Internet Control Message Protocol (ICMP): Sends and receives diagnostic messages. ICMP is the basis of the ubiquitous ping command.
Internet Group Management Protocol (IGMP): Used to multicast messages to multiple IP addresses at once.
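The most basic Network-layer routing decision, taken by every IP host, is whether a destination address is on the local subnet (deliver directly) or not (forward to a gateway). A small sketch using Python's ipaddress module and RFC 5737 documentation addresses:

```python
import ipaddress

# The local subnet, as a host or router would know it from its configuration.
network = ipaddress.ip_network("192.0.2.0/24")   # RFC 5737 documentation range

local = ipaddress.ip_address("192.0.2.42")       # same /24 prefix
remote = ipaddress.ip_address("198.51.100.7")    # different network

# Prefix match decides: deliver directly, or hand off to the gateway.
print(local in network, remote in network)  # True False
```

This longest-prefix matching on addresses is the core mechanism behind "routes packets according to unique network device addresses."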
Layer 2 - Data Link Layer
As its name suggests, this layer is concerned with the linkages and mechanisms used to move data about the network, including the topology, such as Ethernet or Token Ring, and deals with the ways in which data is reliably transmitted.
Defines procedures for operating the communication links.
Frames packets.
Detects and corrects packet transmission errors.
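Error detection at this layer typically works by appending a frame check sequence to each frame; Ethernet uses CRC-32. A sketch of the idea using the standard library's zlib.crc32 (an illustration, not Ethernet's exact frame format):

```python
import zlib

# Sender: compute a CRC-32 checksum over the frame payload.
payload = b"example frame payload"
fcs = zlib.crc32(payload)

# Simulate a single-bit error during transmission.
received = bytearray(payload)
received[3] ^= 0x01

# Receiver: recompute the CRC and compare against the transmitted FCS.
ok = zlib.crc32(bytes(received)) == fcs
print(ok)  # False: the corrupted frame is detected and can be discarded
```

Strictly speaking, a CRC detects errors; correction is usually achieved by discarding the bad frame and relying on retransmission at a higher layer.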

Layer 1 - Physical Layer
The Physical layer's name says it all. This layer defines the electrical and physical specifications for the networking media that carry the data bits across a network.
Interfaces between network medium and devices.
Defines optical, electrical and mechanical characteristics.
The second part, Know the Information Security OSI Model better, will be covered in the next article of this Networking Basics series. Till then, Happy and Safe Hacking!

Anagha Devale-Vartak
http://avsecurity.in
Anagha is an Information Security professional with experience in Vulnerability Assessment, Web Application Audit, Database Audit, Antivirus Review, and Compliance Audit. She holds CCNA and CEH certifications.
