Medical Informatics Information
Group - 113
Topic – Basic Technology For Converting Text Information
Using MS Word
TASK- 6
TASK-2
2) In table form:
The keyboard shortcuts (hot keys)
Combination                    Action
Ctrl + Shift or Alt + Shift    Switch between Russian/English layouts
Ctrl + NUM 5                   Select all
Ctrl + Insert                  Copy
Shift + Del                    Cut
Shift + Insert                 Paste
Alt + Backspace                Undo
Ctrl + O                       Open a document
\lim_{x \to 0} \frac{\sin x}{x} = 1

\int_a^b f(x)\,dx = \lim_{n \to \infty} \sum_{i=1}^{n} f(x_i)\,\Delta x
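Both formulas can be checked numerically. The sketch below is an illustration, not part of the original task: it evaluates sin(x)/x at a small x to approximate the limit, and uses a left-endpoint Riemann sum to approximate a definite integral.

```python
import math

# Numeric check of lim_{x -> 0} sin(x)/x = 1: evaluate at a small x.
print(math.sin(1e-6) / 1e-6)  # close to 1

def riemann_sum(f, a, b, n):
    """Left-endpoint Riemann sum of f over [a, b] with n subintervals."""
    dx = (b - a) / n
    return sum(f(a + i * dx) for i in range(n)) * dx

# The sum approaches the exact integral as n grows;
# the integral of sin(x) over [0, pi] equals 2.
print(riemann_sum(math.sin, 0.0, math.pi, 100_000))  # close to 2
```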
INFORMATICS (COMPUTER SCIENCE)
Mathematics
Natural Science
Quantities of information
Information theory is based on probability theory and statistics. The most important
quantities of information are entropy, the information in a random variable, and mutual
information, the amount of information in common between two random variables. The former
quantity indicates how easily message data can be compressed, while the latter can be used to
find the communication rate across a channel.
The choice of logarithmic base in the following formulae determines the unit of information
entropy that is used. The most common unit of information is the bit, based on the binary
logarithm. Other units include the nat, which is based on the natural logarithm, and the
hartley, which is based on the common logarithm.
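Since the unit depends only on the base of the logarithm, the same computation yields bits, nats, or hartleys by switching the base. A minimal sketch (the function name `entropy` is mine, not from the text):

```python
import math

def entropy(probs, base=2.0):
    """Shannon entropy; base 2 gives bits, e gives nats, 10 gives hartleys."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

fair_coin = [0.5, 0.5]
print(entropy(fair_coin, 2))        # 1 bit
print(entropy(fair_coin, math.e))   # ln 2, about 0.693 nats
print(entropy(fair_coin, 10))       # log10 2, about 0.301 hartleys
```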
Quantity of information (the entropy, H):

H(X) = -\sum_{x \in X} p(x) \log p(x)

Here, X is a random variable and p(x) is the probability that X takes the value x.
An important property of entropy is that it is maximized when all the messages in the
message space are equiprobable, p(x) = 1/n, i.e. most unpredictable, in which case
H(X) = log n.
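The maximization property can be illustrated with a short sketch (the names below are mine, not from the text): for n = 4 equiprobable messages, H reaches log2(4) = 2 bits, while any skewed distribution over the same outcomes scores lower.

```python
import math

def entropy_bits(probs):
    """H(X) = -sum over x of p(x) * log2 p(x), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25] * 4           # equiprobable: p(x) = 1/n with n = 4
skewed = [0.7, 0.1, 0.1, 0.1]  # same outcomes, unequal probabilities

print(entropy_bits(uniform))   # 2.0, the maximum log2(4)
print(entropy_bits(skewed))    # strictly less than 2.0
```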
TASK 5: use Home-Numbering
Test Word
Q.1 A Q.16 C
Q.2 A Q.17 D
Q.3 C Q.18 A
Q.4 A Q.19 A
Q.5 D Q.20 A
Q.6 A Q.21 A
Q.7 A Q.22 C
Q.8 D Q.23 A
Q.9 B Q.24 C
Q.10 D Q.25 C
Q.11 A Q.26 A
Q.12 B Q.27 D
Q.13 D Q.28 A
Q.14 A Q.29 B
Q.15 C Q.30 D
By Sakshi Shahapurkar