Assignment 1 IT Workshop
Prepared For:
Md Maswud Ul Hassan
Course teacher
Ulab School of Business
Contents
Analytics
Big data
ERP
Antivirus
Processor Speed
Clock Speed
AI (artificial intelligence)
Internet of things (IoT)
RAM
Random access memory, or RAM, is one of the most important components of not only desktop
PCs, but laptops, tablets, smartphones, and gaming consoles. Without it, doing just about
anything on any system would be much, much slower. Even not having enough for the
application or game you’re trying to run can bring things to a crawl, or make it so they can’t
even run at all.
But what is RAM? In a nutshell, it’s an extremely fast type of computer memory which
temporarily stores all of the information your PC needs right now and in the near future. It’s
where your computer loads up all of the things it thinks it’ll need to find out soon, so that when
it does, it can read it super fast. It’s quite different from your system’s storage, like its hard
drive, where information is stored long term.
If this is all semantics to you and you just need to know how to install some RAM, or want to
find out how much RAM you need, we have guides for that too.
RAM is a bit of a catch-all term, like “memory,” and actually covers a few different types. Most
of the time when people are discussing RAM or memory, what they’re actually talking about is
technically DRAM (dynamic random access memory), or more accurately for modern
systems, SDRAM (synchronous dynamic random access memory). The terminology doesn’t
really matter beyond technicalities, but it’s useful to know that the terms are relatively
interchangeable colloquially.
The most common type of RAM sold today is DDR4, though older systems may use DDR2 or DDR3. These names denote the generation of RAM used in that particular system, with each successive generation offering faster speeds through greater bandwidth, meaning a higher megahertz (MHz) rating. Each generation also saw physical changes, so modules from different generations are not interchangeable.
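The gap between fast working memory and long-term storage can be sketched in a few lines of Python. This is only a rough illustration with made-up sizes and pass counts; actual timings vary by machine, and the operating system's own caching narrows the difference:

```python
import os
import tempfile
import time

data = bytes(range(256)) * (16 * 1024)    # about 4 MiB held in RAM

# Write the same bytes to disk, standing in for long-term storage.
fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "wb") as f:
    f.write(data)

# Time repeated samples of the in-memory copy.
t0 = time.perf_counter()
for _ in range(20):
    checksum = sum(data[::4096])
ram_time = time.perf_counter() - t0

# Time repeated reads of the on-disk copy (re-opened each pass).
t0 = time.perf_counter()
for _ in range(20):
    with open(path, "rb") as f:
        checksum = sum(f.read()[::4096])
disk_time = time.perf_counter() - t0

print(f"RAM passes:  {ram_time:.4f} s")
print(f"disk passes: {disk_time:.4f} s")  # usually slower, though OS caching blurs it
os.remove(path)
```

The exact numbers matter less than the pattern: data already in RAM is available to the processor far sooner than data that must be fetched from storage.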
CPU
The central processing unit (CPU) is the unit which performs most of the processing inside a
computer. To control instructions and data flow to and from other parts of the computer, the
CPU relies heavily on a chipset, which is a group of microchips located on the motherboard.
The CPU has two primary components:
Control Unit: extracts instructions from memory and decodes and executes them
Arithmetic Logic Unit (ALU): handles arithmetic and logical operations
To function properly, the CPU relies on the system clock, memory, secondary storage, and data
and address buses.
The CPU contains internal memory units, which are called registers. These registers contain
data, instructions, counters and addresses used in the ALU's information processing.
Some computers utilize two or more processors. These consist of separate physical CPUs
located side by side on the same board or on separate boards. Each CPU has an independent
interface, separate cache, and individual paths to the system front-side bus. Multiple processors
are ideal for intensive parallel tasks requiring multitasking. Multicore CPUs are also common, in
which a single chip contains multiple CPUs.
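The idea of spreading an intensive parallel task across multiple CPUs or cores can be sketched with Python's standard multiprocessing module. The prime-counting workload below is just a stand-in for any CPU-bound job, and the chunk sizes are arbitrary:

```python
from multiprocessing import Pool
import os

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division; deliberately CPU-bound."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # Split the range into chunks so each core works on its own piece.
    chunks = [(i * 20_000, (i + 1) * 20_000) for i in range(4)]
    with Pool(processes=os.cpu_count()) as pool:  # one worker per logical CPU
        parts = pool.map(count_primes, chunks)    # chunks run in parallel
    print("primes below 80,000:", sum(parts))
```

On a multicore machine the four chunks are processed at the same time rather than one after another, which is exactly the benefit multicore CPUs provide for parallelizable work.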
What is IP?
An Internet Protocol address (IP address) is a logical numeric address that is assigned
to every single computer, printer, switch, router or any other device that is part of a
TCP/IP-based network.
The IP address is the core component on which the networking architecture is built; no
network exists without it. An IP address is a logical address that is used to uniquely
identify every node in the network. Because IP addresses are logical, they can change.
They are similar to addresses in a town or city because the IP address gives the
network node an address so that it can communicate with other nodes or networks, just
like mail is sent to friends and relatives.
The numerals in an IP address are divided into two parts:
The network part specifies which network the address belongs to, and
The host part further pinpoints the exact location of the node within that network.
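The network/host split can be demonstrated with Python's standard ipaddress module. The address and /24 prefix below are made up for illustration:

```python
import ipaddress

# A hypothetical host address with a /24 prefix: the first 24 bits name the
# network, the remaining 8 bits identify the host on that network.
iface = ipaddress.ip_interface("192.168.1.42/24")

print("address:     ", iface.ip)        # 192.168.1.42
print("network part:", iface.network)   # 192.168.1.0/24

# Masking with the host mask keeps only the host bits.
host_part = int(iface.ip) & int(iface.network.hostmask)
print("host part:   ", host_part)       # 42
```

Routers use the network part to deliver traffic to the right network, much as a postal service uses a street address to reach the right building, and the host part to reach the specific machine.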
Firewall
A firewall is a network security device or program that monitors incoming and outgoing traffic and decides whether to allow or block it based on a set of security rules. While the two main types of firewalls are host-based and network-based, there are many variations found in different places and controlling different activities. A host-based firewall is installed on individual servers and monitors incoming and outgoing signals. A network-based firewall can be built into the cloud's infrastructure, or it can be a virtual firewall service.
Types of firewalls
A packet-filtering firewall examines packets in isolation and does not know the packet's
context.
A stateful inspection firewall examines network traffic to determine whether one packet
is related to another packet.
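A stateless packet filter of the kind described above can be sketched as a first-match rule list. The rules and port numbers here are illustrative only, not a real firewall:

```python
# Each rule is (action, protocol, destination port); "*" matches anything.
RULES = [
    ("allow", "tcp", 443),   # HTTPS in
    ("allow", "tcp", 22),    # SSH in
    ("deny",  "*",   "*"),   # default deny everything else
]

def filter_packet(protocol, dst_port):
    """Return the action of the first matching rule. Stateless: each packet
    is judged on its own headers, with no memory of earlier packets."""
    for action, proto, port in RULES:
        if proto in ("*", protocol) and port in ("*", dst_port):
            return action
    return "deny"

print(filter_packet("tcp", 443))   # allow
print(filter_packet("udp", 53))    # deny
```

A stateful firewall would additionally keep a table of open connections, so that, for example, a reply packet is allowed because it belongs to a conversation the firewall already saw begin.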
The exponential growth of the internet and the resulting increase in network connectivity, however, meant that filtering network traffic by IP address alone was no longer enough. Even so, static packet-filtering firewalls, which examine packet headers and use rules to decide what traffic to let through, had arguably become the most important part of every network security initiative by the end of the last century.
Encryption
In computing, encryption is the method by which plaintext or any other type of data is
converted from a readable form to an encoded version that can only be decoded by another
entity if they have access to a decryption key. Encryption is one of the most important methods
for providing data security, especially for end-to-end protection of data transmitted across
networks.
Encryption is widely used on the internet to protect user information being sent between a
browser and a server, including passwords, payment information and other personal
information that should be considered private. Organizations and individuals also commonly use
encryption to protect sensitive data stored on computers, servers and mobile devices like
phones or tablets.
Symmetric-key ciphers, also referred to as "secret key" ciphers, use a single key, sometimes called a shared secret because the system doing the encrypting must share it with any entity it intends to be able to decrypt the data. The most widely used symmetric-key cipher is the Advanced Encryption Standard (AES), which was designed to protect classified government information.
Symmetric-key encryption is usually much faster than asymmetric encryption, but the sender
must exchange the key used to encrypt the data with the recipient before the recipient can
perform decryption on the cipher text. The need to securely distribute and manage large
numbers of keys means most cryptographic processes use a symmetric algorithm to efficiently
encrypt data, but use an asymmetric algorithm to securely exchange the secret key.
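The shared-secret idea can be illustrated with a toy cipher in Python. The XOR scheme below is deliberately simple and insecure (real systems use AES, not this); it only shows that one key performs both the encryption and the decryption:

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR each byte with the repeating key.
    Illustrative only -- NOT secure, and not how AES works internally."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

shared_key = os.urandom(16)                      # the secret both parties must hold

plaintext  = b"wire the payment on Friday"
ciphertext = xor_cipher(plaintext, shared_key)   # sender encrypts
recovered  = xor_cipher(ciphertext, shared_key)  # receiver decrypts with the SAME key

print(recovered == plaintext)   # True
```

The key-distribution problem the text mentions is visible here: the recipient cannot decrypt anything until the sender has somehow delivered `shared_key` securely, which is exactly the gap asymmetric algorithms are used to close.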
Asymmetric cryptography, also known as public key cryptography, uses two different but
mathematically linked keys, one public and one private. The public key can be shared with
everyone, whereas the private key must be kept secret. The RSA encryption algorithm is the
most widely used public key algorithm, partly because both the public and the private keys can
encrypt a message; the opposite key from the one used to encrypt a message is used to
decrypt it. This attribute provides a method of assuring not only confidentiality, but also the integrity, authenticity and non-repudiation of electronic communications and data at rest through the use of digital signatures.
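Textbook RSA with tiny primes shows how the two mathematically linked keys work. The numbers below are classroom-sized for readability; real deployments use keys of 2048 bits or more together with padding schemes:

```python
# Textbook RSA with tiny primes (teaching sizes only).
p, q = 61, 53
n = p * q                      # 3233: the modulus, part of both keys
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent
d = pow(e, -1, phi)            # private exponent (2753): e*d = 1 (mod phi); Python 3.8+

message = 65                                   # any number smaller than n
ciphertext = pow(message, e, n)                # encrypt with the PUBLIC key
decrypted  = pow(ciphertext, d, n)             # decrypt with the PRIVATE key
print(decrypted == message)                    # True

# Swapping the roles gives a digital signature: sign with the private key,
# and anyone holding the public key can verify it.
signature = pow(message, d, n)
print(pow(signature, e, n) == message)         # True
```

The second half is the property the paragraph describes: because either key undoes the other, the private key can produce a signature that the freely shared public key verifies.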
Cloud computing
Cloud computing boasts several attractive benefits for businesses and end users. Among the main benefits of cloud computing are:
Self-service provisioning: End users can spin up compute resources for almost any type
of workload on demand. This eliminates the traditional need for IT administrators to
provision and manage compute resources.
Elasticity: Companies can scale up as computing needs increase and scale down again
as demands decrease. This eliminates the need for massive investments in local
infrastructure, which may or may not remain active.
Pay per use: Compute resources are measured at a granular level, enabling users to pay
only for the resources and workloads they use.
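Pay-per-use billing can be sketched as a small calculation. The instance sizes and hourly rates below are invented for illustration, since real cloud pricing varies by provider and region:

```python
# Hypothetical per-hour rates; real cloud pricing differs by provider.
RATE_PER_HOUR = {"small": 0.02, "medium": 0.08, "large": 0.32}

def monthly_bill(usage):
    """usage: list of (instance_size, hours_run) pairs.
    You pay only for the hours each instance actually ran."""
    return sum(RATE_PER_HOUR[size] * hours for size, hours in usage)

# Two small instances all month, plus one large instance for a 48-hour batch job.
usage = [("small", 720), ("small", 720), ("large", 48)]
print(f"${monthly_bill(usage):.2f}")   # $44.16
```

Elasticity shows up in the same model: scaling down simply means fewer (size, hours) entries next month, with no idle local hardware to keep paying for.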
Analytics
Analytics is the scientific process of discovering and communicating the meaningful patterns
which can be found in data.
It is concerned with turning raw data into insight for making better decisions. Analytics relies on the application of statistics, computer programming, and operations research to quantify data and gain insight into its meaning. It is especially useful in areas that record a lot of data or information.
Analytics provides us with meaningful information that may otherwise be hidden within large quantities of data. It is something any leader, manager or just about anyone can make use of, especially in today's data-driven world. Information has long been considered a great weapon, and analytics is the forge that creates it. Analytics changes everything, not just in the world of business, but also in science, sports, health care and just about any field where vast amounts of data are collected.
Analytics helps us find the hidden patterns in the world around us, from consumer behavior and athlete and team performance to connections between activities and diseases. This can change how we look at the world, usually for the better. Sometimes we think a process is already working at its best, but the data tells us otherwise; in this way analytics helps us improve our world.
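Finding such a pattern often starts with a correlation calculation, sketched here in plain Python over made-up advertising and sales figures:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up numbers: weekly ad spend vs. weekly sales.
ad_spend = [10, 20, 30, 40, 50]
sales    = [25, 46, 68, 85, 110]
r = pearson(ad_spend, sales)
print(f"r = {r:.3f}")   # close to 1.0: spend and sales move together
```

A coefficient near +1 or -1 flags a relationship worth investigating; analytics in practice then asks whether the relationship is causal before acting on it.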
In the world of business, organizations usually apply analytics to describe, predict and then improve business performance. Specifically, it helps in the following areas:
Web analytics
Fraud analysis
Risk analysis
Advertisement and marketing
Enterprise decision management
Market optimization
Market modeling
Big data
Big data is a term used to refer to data sets that are too large or complex for traditional data-processing application software to deal with adequately. Data with many cases (rows) offer greater statistical power, while data with higher complexity (more attributes or columns) may lead to a higher false discovery rate. Big data challenges include capturing data, data storage, data analysis, search, sharing, transfer, visualization, querying, updating, information privacy and data source. Big data was originally associated with three key concepts: volume, variety, and velocity. Other concepts later attributed to big data are veracity (i.e., how much noise is in the data) and value.
Current usage of the term "big data" tends to refer to the use of predictive analytics, user
behavior analytics, or certain other advanced data analytics methods that extract value from
data, and seldom to a particular size of data set. "There is little doubt that the quantities of
data now available are indeed large, but that’s not the most relevant characteristic of this new
data ecosystem."[6] Analysis of data sets can find new correlations to "spot business trends,
prevent diseases, combat crime and so on." Scientists, business executives, practitioners of
medicine, advertising and governments alike regularly meet difficulties with large data-sets in
areas including Internet search, fintech, urban informatics, and business informatics. Scientists encounter limitations in e-Science work, including meteorology, genomics,[8] connectomics, complex physics simulations, biology and environmental research.
Data sets grow rapidly, in part because they are increasingly gathered by cheap and numerous information-sensing Internet of things devices such as mobile devices, aerial (remote sensing) platforms, software logs, cameras, microphones, radio-frequency identification (RFID) readers and wireless sensor networks. The world's technological per-capita capacity to store information has roughly doubled every 40 months since the 1980s; as of 2012, every day 2.5 exabytes (2.5×10^18 bytes) of data were generated. Based on an IDC report prediction, the global data volume will grow exponentially from 4.4 zettabytes to 44 zettabytes between 2013 and 2020. By 2025, IDC predicts there will be 163 zettabytes of data. One question for large
enterprises is determining who should own big-data initiatives that affect the entire
organization.
Relational database management systems, desktop statistics tools and software packages used to visualize data often have difficulty handling big data. The work may instead require "massively parallel software running on tens, hundreds, or even thousands of servers." What qualifies as "big data" varies depending on the capabilities of the users and their tools, and expanding capabilities make big data a moving target. "For some organizations, facing hundreds of gigabytes of data for the first time may trigger a need to reconsider data management options. For others, it may take tens or hundreds of terabytes before data size becomes a significant consideration."
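One common workaround when data will not fit in memory is to process records as a stream rather than loading the whole data set at once, which is the same idea big-data frameworks scale out across many servers. A minimal single-machine sketch, using a small in-memory string to stand in for a file far too large to load whole:

```python
import csv
import io

def stream_rows(fileobj):
    """Yield CSV rows one at a time so the whole file never sits in memory --
    the pattern big-data frameworks apply in parallel across many machines."""
    for row in csv.DictReader(fileobj):
        yield row

# Stand-in for a huge sensor log.
raw = io.StringIO("sensor,reading\na,10\nb,35\na,12\nb,90\n")

# Aggregate while streaming: only one row is held in memory at a time.
high = sum(1 for row in stream_rows(raw) if int(row["reading"]) > 30)
print("readings over 30:", high)   # 2
```

Because the aggregation touches one row at a time, the same code works whether the source is four lines or four billion; only the time to run changes, not the memory needed.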
ERP
ERP (enterprise resource planning) software typically consists of multiple enterprise software modules that are individually purchased, based on what best meets the specific needs and technical capabilities of the organization. Each ERP module is focused on one area of business processes, such as product development or marketing.
Some of the most common ERP modules include those for product planning, material purchasing, inventory control, distribution, accounting, marketing, finance and HR. A business will typically use a combination of different modules to manage its back-office activities and tasks.
Antivirus
Antivirus software is a program or set of programs that are designed to prevent, search for,
detect, and remove software viruses, and other malicious software like worms, trojans, adware,
and more.
These tools are critical to install and keep up to date, because an unprotected computer can be infected within minutes of connecting to the internet. The bombardment is constant, which means antivirus companies have to update their detection tools regularly to deal with the more than 60,000 new pieces of malware created daily.
Today's malware (an umbrella term that encompasses computer viruses) changes appearance quickly to avoid detection by older, definition-based antivirus software. Viruses can be programmed to damage your device, prevent a user from accessing data, or take control of your computer.
Several different companies build antivirus software, and what each offers can vary, but all perform some essential functions:
Scan specific files or directories for any malware or known malicious patterns
Allow you to schedule scans to automatically run for you
Allow you to initiate a scan of a particular file or your entire computer, or of a CD or
flash drive at any time.
Remove any malicious code detected. Sometimes you will be notified of an infection and asked if you want to clean the file; other programs will do this automatically behind the scenes.
Show you the ‘health’ of your computer
Always be sure you have the best, up-to-date security software installed to protect your
computers, laptops, tablets, and smartphones.
Many antivirus programs still download malware definitions straight to your device and scan your files in search of matches. But since, as mentioned, most malware regularly morphs in appearance to avoid detection, some cloud-based products, such as Webroot, work differently: instead of storing examples of recognized malware on your device, they store malware definitions in the cloud. This lets them take up less space on the device, scan faster, and maintain a more robust threat library.
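Definition-based scanning can be sketched as matching file fingerprints against a list of known-bad signatures. The "definitions" below are invented for illustration, and real products match far richer patterns than whole-file hashes:

```python
import hashlib

# Hypothetical "definitions": SHA-256 digests of known-bad files. A cloud-based
# product would keep this set server-side instead of on the device.
KNOWN_BAD = {
    hashlib.sha256(b"totally-legit-invoice.exe payload").hexdigest(),
}

def scan(file_bytes: bytes) -> str:
    """Fingerprint the file and compare it against the definition set."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return "infected" if digest in KNOWN_BAD else "clean"

print(scan(b"totally-legit-invoice.exe payload"))  # infected
print(scan(b"quarterly_report.xlsx contents"))     # clean
```

The sketch also shows why morphing malware defeats this approach on its own: changing a single byte changes the hash, which is why modern products add behavioral and heuristic detection on top of definitions.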
Processor Speed
The processor of the computer can be referred to as the "brains" of the computer. Therefore, if you want your computer to last a few years without needing to upgrade it, you want to purchase the fastest processor available.
In simple terms, think about what you do in a day, or what you are asking your computer to do. If you are doing one thing at a time, a single-core processor is fine, as the computer will process one thing before trying to process another. However, if you want your computer to think about multiple things at the same time (you are asking it to multitask), then you need a multi-core processor.
I see people looking at dual-core processors and thinking they are fine. You will be, for a while, if you are using the computer for "regular" applications. However, in today's world, what is "regular"? Why not look at a quad-core? There is nothing worse than waiting for your computer
to do what you want it to. Remember that the computer processor controls the entire machine and its operations. The efficiency of the processor directly determines how well your computer will operate.
Both business people AND the IT people who recommend configurations for business applications always seem to undersell in this area, and I have never understood why. Have you ever looked at the teenagers today who have gaming programs loaded and are flipping through screens of high-end graphics in seconds? How are their computers configured? They have FAST processors and LOTS of RAM.
Even if you are using your computer for business, I would still recommend that you tell your IT person you want the fastest processor (the one they recommend for gaming). Why? It will last longer, and you will be the envy of everyone else in the office when your work is finished while they are still waiting to save and print something!
Clock Speed
Clock speed is the rate at which a processor executes instructions and is measured in gigahertz (GHz). Once, a higher number meant a faster processor, but advances in technology have made processor chips more efficient, so now they do more with less.
For example, an Intel Core i5 running at 3.46 GHz is not necessarily faster than an Intel Core i7 running at 3.06 GHz.
Comparing the speed of an old Pentium 4 CPU (an old scale that peaked at 3.4 GHz) with the speed of a current i-series CPU (a new scale that started at 1.6 GHz) is a bit like comparing the Fahrenheit and Celsius temperature scales. An i-series CPU at 1.6 GHz runs faster and outperforms the old-style Pentium 4, so a new i5 or i7 running at 3.0 GHz or more cannot be compared with any older-generation Pentium 4 hardware.
Remember: don't compare computers based on clock speed unless they use the same line of processors, such as Intel Core i3, i5 or i7.
You may come across Intel's Turbo Boost Technology while shopping for a new PC. This is one of the many features built into Core i5 and Core i7 processors: it automatically speeds up the processor for a burst of heavy-duty activity when your PC needs extra performance. For example, a Core i5 chip rated at 2.5 GHz can kick up to 3 GHz on demand without stressing the processor or running the risk of overheating.
AI (artificial intelligence)
AI (artificial intelligence) is the simulation of human intelligence processes by machines,
especially computer systems. These processes include learning (the acquisition of information
and rules for using the information), reasoning (using rules to reach approximate or definite
conclusions) and self-correction. Particular applications of AI include expert systems, speech
recognition and machine vision. AI is ubiquitous today, used to recommend what you should
buy next online, to understand what you say to virtual assistants such as Amazon's Alexa and
Apple's Siri, to recognise who and what is in a photo, to spot spam, or detect credit card fraud.
At a very high level artificial intelligence can be split into two broad types: narrow AI and
general AI.
Narrow AI is what we see all around us in computers today: intelligent systems that have been
taught or learned how to carry out specific tasks without being explicitly programmed how to
do so. This type of machine intelligence is evident in the speech and language recognition of
the Siri virtual assistant on the Apple iPhone, in the vision-recognition systems on self-driving
cars, in the recommendation engines that suggest products you might like based on what you
bought in the past. Unlike humans, these systems can only learn or be taught how to do specific tasks, which is why they are called narrow AI.
Artificial general intelligence is very different: it is the type of adaptable intellect found in humans, a flexible form of intelligence capable of learning how to carry out vastly different tasks, anything from haircutting to building spreadsheets, or reasoning about a wide variety of topics based on its accumulated experience. This is the sort of AI more commonly seen in movies, the likes of HAL in 2001 or Skynet in The
Terminator, but which doesn't exist today; AI experts are fiercely divided over how soon it will become a reality.
Internet of things (IoT)
The internet of things, or IoT, is a system of interrelated computing devices, mechanical and digital machines, objects, animals or people that are provided with unique identifiers (UIDs) and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction. A thing in the internet of things can be a person with a heart monitor implant, a farm animal with a biochip transponder, an automobile with built-in sensors that alert the driver when tire pressure is low, or any other natural or man-made object that can be assigned an IP address and is able to transfer data over a network.
Increasingly, organizations in a variety of industries are using IoT to operate more efficiently,
better understand customers to deliver enhanced customer service, improve decision-making
and increase the value of the business.
An IoT ecosystem consists of web-enabled smart devices that use embedded processors,
sensors and communication hardware to collect, send and act on data they acquire from their
environments. IoT devices share the sensor data they collect by connecting to an IoT gateway
or other edge device where data is either sent to the cloud to be analyzed or analyzed locally.
Sometimes, these devices communicate with other related devices and act on the information
they get from one another. The devices do most of the work without human intervention,
although people can interact with the devices -- for instance, to set them up, give them
instructions or access the data.
The connectivity, networking and communication protocols used with these web-enabled
devices largely depend on the specific IoT applications deployed.
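A device-to-gateway exchange can be sketched as packaging a sensor reading into JSON, a format many IoT deployments use for this step. The device name and fields below are hypothetical:

```python
import json
import time

def read_sensor():
    """Stand-in for an embedded sensor read; a real device samples hardware."""
    return {"device_id": "greenhouse-7", "temp_c": 21.4, "humidity": 0.63}

def to_gateway_payload(reading):
    """Timestamp a reading and package it as the JSON a device might push
    to an IoT gateway or edge device for local or cloud-side analysis."""
    reading["timestamp"] = int(time.time())
    return json.dumps(reading)

payload = to_gateway_payload(read_sensor())
print(payload)
```

From the gateway's side the same structure is parsed back with json.loads, after which the data can be analyzed locally at the edge or forwarded to the cloud, as described above.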