WEB CRAWLER
SUBMITTED BY :
FEASIBILITY STUDY
QUESTION ARISES
What is a Web Crawler?
Definition:
A web crawler (also known as a web spider or web robot) is a program or automated script that browses the World Wide Web in a methodical, automated manner. This process is called web crawling or spidering.
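As a sketch of the definition above, the "methodical, automated" browsing can be shown as a breadth-first loop in Java (the deck's chosen platform). This is a minimal illustration, not the deck's actual design: the `fetch` stub and the regex-based link extraction are assumptions a real crawler would replace with an HTTP client and an HTML parser.

```java
import java.util.*;
import java.util.regex.*;

public class SimpleCrawler {
    // Naive link extraction: pulls absolute http(s) hrefs with a regex.
    // A production crawler would use a real HTML parser instead.
    static List<String> extractLinks(String html) {
        List<String> links = new ArrayList<>();
        Matcher m = Pattern.compile("href=\"(http[^\"]+)\"").matcher(html);
        while (m.find()) {
            links.add(m.group(1));
        }
        return links;
    }

    // Breadth-first crawl: visit each URL once, enqueue newly found links.
    static Set<String> crawl(String seed, int maxPages) {
        Set<String> visited = new HashSet<>();
        Deque<String> frontier = new ArrayDeque<>(List.of(seed));
        while (!frontier.isEmpty() && visited.size() < maxPages) {
            String url = frontier.poll();
            if (!visited.add(url)) continue;      // skip already-visited pages
            String html = fetch(url);             // download the page
            frontier.addAll(extractLinks(html));  // enqueue outgoing links
        }
        return visited;
    }

    // Placeholder download; swap in java.net.http.HttpClient for real use.
    static String fetch(String url) {
        return "";
    }

    public static void main(String[] args) {
        String html = "<a href=\"http://example.com/a\">A</a>"
                    + "<a href=\"http://example.com/b\">B</a>";
        System.out.println(extractLinks(html));
    }
}
```

The visited set is what makes the traversal "methodical": it guarantees each page is fetched at most once even when pages link to each other.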
USE CASES OF WEB CRAWLER
❖ Search engine indexing
❖ Link checking and site validation
❖ Archiving and mirroring websites
FEASIBILITY STUDY
1. TECHNICAL FEASIBILITY: Can the crawler be built with the available technology and skills (here, Java and modest hardware)?
2. FINANCIAL FEASIBILITY: Is the cost of developing and running the system justified by its expected benefit?
3. OPERATIONAL FEASIBILITY: Will the system work reliably and be usable once deployed?
OPERATING ENVIRONMENT
Hardware requirements:
❖ Hard Disk - at least 20 GB
❖ RAM - at least 1 GB
PLATFORM - JAVA
FUTURE SCOPE OF THE SYSTEM
Scaling toward large-scale crawling, as done by GOOGLE's Googlebot
Thanks for listening