Introduction To Internet

Module 1: Internet Overview – Networks – WWW – Web Protocols – Web Organization and Addressing – Internet Service Providers, DNS Servers, Connection Types, Internet Addresses – Web Browsers and Web Servers – Security and Vulnerability – Web System Architecture – URL – Domain Name – Web Content Authoring – Webserver Administration – Search Engines

Dr. Mareeswari V
Assistant Professor (Sr)
SITE, VIT, Vellore
Cabin No: 210-A30
The Internet

The Internet is a global network comprised of smaller networks that are interconnected using standardized communication protocols. The Internet standards describe a framework known as the Internet protocol suite. This model divides methods into a layered system of protocols.

It is a network of networks that consists of millions of private, public, academic, business and government networks of local to global scope.

Internet services were launched in India on 15th August, 1995 by Videsh Sanchar Nigam Limited. In November, 1998, the Government opened up the sector for providing Internet services by private operators.

Web Organization

No one person, company, organization or government runs the Internet. It is a globally distributed network comprising many voluntarily interconnected autonomous networks.

ISOC, the Internet Society, is a voluntary membership organization whose purpose is to promote global information exchange through Internet technology. ISOC appoints the IAB (Internet Architecture Board), which meets regularly to review standards and allocate resources, like addresses. The IETF (Internet Engineering Task Force) is another volunteer organization that meets regularly to discuss operational and technical problems.

Dr.Mareeswari V/ AP(Sr) / SITE / VIT 01-03-2021
MODERN USES OF INTERNET

The internet can be accessed almost anywhere by numerous means, including mobile internet services. The Internet allows computer users to remotely access other computers and information stores easily, wherever they may be.

Web Protocols

A protocol is simply a standard for enabling the connection, communication, and data transfer between two places on a network. Here are some of the key protocols that are used for transferring data across the Internet.

HTTP: Hypertext Transfer Protocol. It is the standard protocol for transferring web pages (and their content) across the Internet.

HTTPS: Hypertext Transfer Protocol over Secure Socket Layer (SSL). For a website to use HTTPS, it needs to have an SSL certificate installed on the server. These are usually issued by a trusted 3rd party, referred to as a Certificate Authority (CA). When you browse a web page using HTTPS, you can check the details of the SSL certificate, for example its validity.

FTP: File Transfer Protocol. It is used to transfer files across the Internet. FTP is commonly used by web developers to publish updates to a website (i.e. to upload a new version of the website).

Every HTTP request also uses TCP and IP. The Web is just one of the applications built on top of the Internet protocols.
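To make the HTTP protocol concrete, here is a minimal sketch of the plain-text message a browser sends for a simple page request. The host name `www.example.com` and the helper function name are illustrative, not part of the slides; a real browser would also send many more headers.

```python
def build_get_request(host: str, path: str = "/") -> str:
    """Build a raw HTTP/1.1 GET request as plain text."""
    lines = [
        f"GET {path} HTTP/1.1",  # request line: method, path, protocol version
        f"Host: {host}",         # the Host header is mandatory in HTTP/1.1
        "Connection: close",     # ask the server to close after responding
        "",                      # a blank line ends the header section
        "",
    ]
    return "\r\n".join(lines)   # HTTP lines are separated by CRLF

print(build_get_request("www.example.com"))
```

This text, sent over a TCP connection to port 80 (or inside a TLS session on port 443 for HTTPS), is all an HTTP GET request is.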
DNS

The domain name system (DNS) gives us humans an easy way to identify where we want to go on the Internet. Behind the scenes, each domain name maps to an IP address. When we type a URL in the address bar of our browser, the computer has to figure out its IP address.

Exercise: find the IP addresses of our VIT university website, Google, and other popular websites (refer ipaddress.com).

Consider the website blog.gardeningknowhow.com. That URL leads to the blog of a gardening tips & tricks website.

First-level domain – com
Second-level domain – gardeningknowhow
Third-level domain – blog
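The domain levels above can be read mechanically from the name itself: labels are separated by dots, and levels count from the right. A small sketch (the function name is invented for illustration):

```python
def domain_levels(hostname: str) -> dict:
    """Split a hostname into its domain levels, counted right to left."""
    labels = hostname.split(".")
    # The rightmost label is the first (top) level, and so on leftwards.
    return {f"level {i}": label
            for i, label in enumerate(reversed(labels), start=1)}

print(domain_levels("blog.gardeningknowhow.com"))
# {'level 1': 'com', 'level 2': 'gardeningknowhow', 'level 3': 'blog'}
```

Resolving such a name to an IP address is a separate step, which in Python could be done with `socket.gethostbyname(hostname)` on a machine with network access.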
Now this file is converted to binary code by the browser. If we are connected through Ethernet, it is sent down the wires; if we are using WiFi, it is first converted into a radio signal, which the router decodes at a very low level, converts back to binary, and then sends on to the servers.
RISKS !!!

RISK PREVENTION

What do you need to do to minimise some or all of the risks when accessing the internet?
Web hosting service

A web hosting service (often shortened to web host) is a type of Internet hosting service that allows individuals and organizations to make their website accessible via the World Wide Web. Web hosts are companies that provide space on a server, owned or leased for use by clients, as well as providing Internet connectivity, typically in a data center.

Refer more in: https://en.wikipedia.org/wiki/Web_hosting_service

Search engine

A search engine is a software system that is designed to carry out web searches (Internet searches), which means to search the World Wide Web in a systematic way for particular information specified in a textual web search query. The search results are generally presented in a line of results, often referred to as search engine results pages (SERPs). The information may be a mix of links to web pages, images, videos, infographics, articles, research papers, and other types of files.

Some search engines also mine data available in databases or open directories. Unlike web directories, which are maintained only by human editors, search engines also maintain real-time information by running an algorithm on a web crawler. Internet content that is not capable of being searched by a web search engine is generally described as the deep web.
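The core data structure behind searching is the inverted index: a map from each word to the set of pages that contain it. A toy sketch, with page names and text invented for illustration:

```python
# A toy inverted index: maps each word to the pages containing it.
pages = {
    "page1": "gardening tips for spring",
    "page2": "spring cleaning tips",
}

index: dict[str, set[str]] = {}
for name, text in pages.items():
    for word in text.lower().split():
        index.setdefault(word, set()).add(name)

# A query simply looks up the pages that contain the search term.
print(sorted(index["tips"]))    # ['page1', 'page2']
print(sorted(index["spring"]))  # ['page1', 'page2']
```

Real engines add ranking, stemming, and phrase handling on top, but lookup by term is the same idea.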
A search engine maintains the following processes in near real time:
- Web crawling
- Indexing
- Searching

Web search engines get their information by web crawling from site to site. The "spider" checks for the standard filename robots.txt, addressed to it. The robots.txt file contains directives for search spiders, telling them which pages to crawl. After checking for robots.txt and either finding it or not, the spider sends certain information back to be indexed depending on many factors, such as the titles, page content, JavaScript, Cascading Style Sheets (CSS), headings, or its metadata in HTML meta tags. After a certain number of pages crawled, amount of data indexed, or time spent on the website, the spider stops crawling and moves on.

Google describes its own version of these processes as follows.

Crawling: Google searches the web with automated programs called crawlers, looking for pages that are new or updated. Google stores those page addresses (or page URLs) in a big list to look at later. It finds pages by many different methods, but the main method is following links from pages it already knows about.

Indexing: Google visits the pages that it has learned about by crawling and tries to analyze what each page is about. Google analyzes the content, images, and video files in the page, trying to understand what the page is about. This information is stored in the Google index, a huge database spread across many computers.

Serving search results: When a user performs a Google search, Google tries to determine the highest quality results. The "best" results depend on many factors, including the user's location, language, device (desktop or phone), and previous queries. For example, searching for "bicycle repair shops" would show different answers to a user in Paris than to a user in Hong Kong. Google doesn't accept payment to rank pages higher; ranking is done algorithmically.
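Python's standard library ships a robots.txt parser that implements the spider-side check described above. A minimal sketch, using an invented robots.txt and a hypothetical crawler name (normally the file would be fetched from the site's /robots.txt URL rather than given as a string):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, not taken from any real site.
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())  # normally loaded via parser.set_url(...); parser.read()

# A polite crawler checks each URL before fetching it.
print(parser.can_fetch("MyCrawler", "https://example.com/index.html"))  # True
print(parser.can_fetch("MyCrawler", "https://example.com/private/x"))   # False
```

Directives are matched per user agent, so a site can give different rules to different spiders.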