
International Journal of Scientific & Engineering Research, Volume 1, Issue 3, December 2010
ISSN 2229-5518

General Pseudoadditivity of Kapur’s Entropy
prescribed by the existence of equilibrium
Priti Gupta, Vijay Kumar
Abstract—In this paper, the concept of composability of the total channel capacity of a system of channels composed of two
independent subsystems of channels is discussed. The entropy of this system of channels is a function of the entropies of the
subsystems. Here, we show that the generalized entropies, namely the Tsallis entropy, the normalized Tsallis entropy and
Kapur's entropy, are compatible with the composition law, are defined consistently with the condition of equilibrium, and satisfy
pseudoadditivity.

Key words— Entropy, Kapur’s Entropy, Pseudo-additivity


1 INTRODUCTION
In the process of transmission of signals, a communication system has to do with the amount of information, the capacity of the communication channel and the effects of noise. Communication is a continuous process, and a channel is the 'pipe' along which information is conveyed. There is a wide variety of communication channels available, from basic face-to-face conversation, through telecommunication channels like the telephone or e-mail, to computational channels like the medical record. Channels have attributes like capacity and noise, which determine their suitability for different tasks. When two parties exchange information across a channel at the same time, this is known as synchronous communication.

For example: Telephones are one of the commonest two-way synchronous channels. It is the nature of synchronous communication that it is interruptive, and these interruptions may have a negative impact on individuals who carry high cognitive loads.

On the contrary, when individuals can be separated in time, they may use an asynchronous channel to support their interaction. Since there can be no simultaneous discussion, conversations occur through a series of message exchanges. These can range from Post-it notes left on a colleague's desk to sophisticated electronic information systems. One of the benefits of asynchronous communication is that it is not inherently interruptive, and if a communication is not urgent, asynchronous channels may be a preferred way of communicating with otherwise busy individuals.

A communication system is thus a bundle of different components, and the utility of the overall system is determined by the appropriateness of all the components together. If even one element of the system bundle is inappropriate to the setting, the communication system can underperform. A communication system provides only a channel for mutual information exchange; it is not a priori dedicated to certain categories of information only.

The channels can be modeled physically by trying to calculate the physical processes which modify the transmission of signals.

For example: In wireless communications the channel is often modeled by a random attenuation (known as fading) of the transmitted signal, followed by additive noise. It can also be modeled by calculating the reflection off every object in the environment. The attenuation term is a simplification of the underlying physical processes and captures the change in signal power over the course of the transmission. The noise in the model captures external interference and/or electronic noise in the receiver. If the attenuation term is complex, it also describes the relative time a signal takes to get through the channel. The statistics of the random attenuation are decided by previous measurements or physical simulations. Channel models may be continuous channel models, in that there is no limit to how precisely their values may be defined. A short simulation sketch of this fading model is given at the end of this section.

It is now possible to assign multiple custom channels to a single ad unit. This feature enables us to track our ad performance with greater flexibility and view more granular information. When generating our ad code, we will be able to add up to five custom channels to a specific instance of ad code. Multiple channels can be very useful when we want to track one ad unit across several different metrics simultaneously.

For example: Suppose we run a sports website and place a leader board at the top and bottom of every page. To track the performance of the ad placement, we create two custom channels, 'Top Leader board' and 'Bottom Leader board', and regenerate our ad code appropriately. If we also want to compare our football pages and our baseball pages at the same time, this is not a problem: we simply create two more custom channels called 'Football Pages' and 'Baseball Pages' and add them to the appropriate ad units. With multiple custom channels, each leader board will then be tagged with two custom channels that let us know the position it is in (top or bottom) and the type of page on which it appears (football or baseball). Ad units tagged with multiple custom channels will log impressions or clicks in each channel, so we will see a larger number of impressions and clicks when we view channel reports than when we view aggregate reports.
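To make the fading-plus-noise channel model described above concrete, here is a minimal simulation sketch. It is an illustration added alongside this reconstruction rather than material from the paper; the Rayleigh-fading assumption, the QPSK symbols and the 10 dB SNR are arbitrary illustrative choices.

import numpy as np

rng = np.random.default_rng(0)

def fading_channel(x, snr_db=10.0):
    # Flat-fading channel: y = h * x + n, following the model in the text.
    # h is a random complex attenuation (Rayleigh-distributed magnitude),
    # n is additive white Gaussian noise scaled by the requested SNR.
    h = (rng.normal(size=x.shape) + 1j * rng.normal(size=x.shape)) / np.sqrt(2)
    noise_power = 10.0 ** (-snr_db / 10.0)
    n = np.sqrt(noise_power / 2.0) * (rng.normal(size=x.shape) + 1j * rng.normal(size=x.shape))
    return h, h * x + n

# A few QPSK pulses as the transmitted signal (illustrative only).
bits = rng.integers(0, 4, size=8)
x = np.exp(1j * (np.pi / 4 + np.pi / 2 * bits))

h, y = fading_channel(x)
print("attenuation |h|:", np.round(np.abs(h), 3))
print("received y:    ", np.round(y, 3))

Averaging the attenuation power over many realizations recovers the mean signal-power change that the simplified model summarizes in a single term.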

From a thermodynamic viewpoint, heat disturbs the signal, making it noisy; you can hear the static hiss on a radio or see it as snow on a television screen. Shannon realized that the noise added to one digital pulse would generally make the overall amplitude different from that of another, otherwise identical, pulse. Moreover, the noise amplitudes for two pulses are independent.

Channels exhibit complex systems/processes and are attracting great attention. A common feature of these systems is that they stay in non-equilibrium stationary states for significantly long periods. Generally, channels are inhomogeneous.

Here, we are trying to discuss the additivity (noise) of subsystems of channels. Abe [9] discusses the thermodynamic system, but here we are taking the system of channels for the same entropies. In this paper, we discuss a general composition law for the system of channels, considering two independent subsystems of a channel, A and B. Three generalized entropies, the Tsallis entropy, the normalized Tsallis entropy and Kapur's entropy, are shown to be consistent with the equilibrium condition and satisfy pseudoadditivity.

2 COMPOSABILITY OF VARIOUS ENTROPIES PRESCRIBED BY THE EXISTENCE OF EQUILIBRIUM

The additive requirement for the system (channel) puts constraints on the symmetry of the system under consideration: it is indivisibly connected with the homogeneity of the system.

Let X(A, B) be the total channel capacity of the system of channels, divided into two independent subsystems of channels A and B. For independent variables A and B, X(A, B) satisfies additivity as

X(A, B) = X(A) + X(B)

The joint probability of the total system of channels is

P_{ij}(A, B) = P_i(A) \, P_j(B)    (2.1)

which is a familiar additional assumption in ordinary circumstances. Here P_i(A) is the probability of finding the subsystem A in its i-th state and P_j(B) is the probability of finding the subsystem B in its j-th state.

The entropy of the total system, H(A, B), is expressed in terms of the entropies of the subsystems, H(A) and H(B). The general form of the composition law of the subsystems A and B is

H(A, B) = f(H(A), H(B))    (2.2)

where f is a symmetric bivariate function, f(x, y) = f(y, x).
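The independence assumption (2.1) and the simplest instance of the composition law (2.2) can be checked numerically. The following sketch is an illustration added to this reconstruction, not code from the paper, and the two small distributions in it are arbitrary: for a product distribution, the Shannon entropy of the total system is exactly the sum of the subsystem entropies, i.e. f(x, y) = x + y.

import numpy as np

def shannon_entropy(p):
    # H(p) = -sum_i p_i ln p_i, skipping zero-probability states.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Illustrative subsystem distributions p_i(A) and p_j(B).
p_A = np.array([0.5, 0.3, 0.2])
p_B = np.array([0.6, 0.4])

# Independence assumption (2.1): P_ij(A, B) = P_i(A) * P_j(B).
p_AB = np.outer(p_A, p_B)

# Simplest composition law f(x, y) = x + y: H(A, B) = H(A) + H(B).
print(shannon_entropy(p_AB.ravel()))                 # joint entropy
print(shannon_entropy(p_A) + shannon_entropy(p_B))   # sum of subsystem entropies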
( A. we notice two important points: For independent variables A and B . ( A) H K. 0 (2. ( p) log 2 N . Volume 1. ( B) (2. Tsalli’s Entropy and normalized Tsalli’s Entropy all converges H ( A. H K. N 1 ( pi ) i 1 For Renyi’s Entropy and Boltzmann-Gibbs Shannon Entro- py . Also. H K.4) to Boltzmann-Gibbs-Shannon Entropy. ( p) reduces to H ( p) Since H ( p ) and H (T ) ( p ) are composite functions and satisfies pseudoadditivity as: and when 1 . B ) H ( NT ) ( A) H ( NT ) ( B ) ( 1) H ( NT ) ( A) H ( NT ) ( B ) (2. Renyi’s Entropy.5) K 1 i 1 1 H .ijser. B ) H ( A) H ( B) ( ) H ( A).10 a) H (T ) ( p) H ( NT ) ( p ) N The above Entropies have a common feature “ Sum plus product ” as: ( pi ) i 1 f ( x.org . ( p) is a composite function which satisfies H (T ) ( A. . B) H K.8) Where ( ) is the frequency of channel .1) . ( p) reduces to H ( R ) ( p) . ( A) H K. Where ( ) is a function of the entropic index. y ) x y ( ) xy 1 1 1 (2. 1 i 1 (R) When 1 and 1 . i 1 pi and Tsalli’s Entropy [4] i 1 . Here.10) N 1 H (T ) ( p ) ( pi ) 1 ( 0) (2.6) In the limiting case.International Journal of Scientific & Engineering Research. B ) H (T ) ( A) H (T ) ( B ) pseudoadditivity as: (1 ) H (T ) ( A) H (T ) ( B ) (2. Issue 3. H ( NT ) ( A. December-2010 3 ISSN 2229-5518 The given function is a composite function. ( B ) The normalized Tsalli’s Entropy is ( . ( ) 0 IJSER © 2010 http://www. B) H ( A) H ( B) (2. (i) As 1 . Renyi’s Entropy and normal- From [8]. H K. Also. of states at a given scale.3) i 1 Where N is the total no.9) According to Boltzmann-Gibbs-Shannon entropy [3] N gives pseudoadditivity as (2.H ( B) ized Tsalli’s Entropy are concave only and for 0 .7) H ( p) pi ln( pi ) (2. Tsalli’s Entropy is always concave.7) H K. H(A . ) H K. The generalized entropies of Kapur of order and type [7] is According to Renyi’s Entropy [1] N 1 N pi H ( R ) ( p) ln ( pi ) ( 0) (2. (ii) For (0.

For the Tsallis entropy, \nu(\alpha) = 1 - \alpha; for the normalized Tsallis entropy, \nu(\alpha) = \alpha - 1; and for Kapur's entropy, \nu(\alpha, \beta) = (1-\alpha)(1-\beta).

In order to establish (2.10a), let X be the channel capacity of the system, so that

H(A, B) = f(H(A), H(B))    (2.11)

X(A) + X(B) = X(A, B)    (2.12)

The additivity of the channel capacity is not obvious in general; this makes the channel highly non-trivial for non-extensive systems with long-range interactions.

The stationary state of the total system may be defined by the maximization of the total entropy with fixed X(A, B), i.e.

\delta f(H(A), H(B)) = 0    (2.13)

Since \delta X(A) = -\delta X(B) from (2.12), equations (2.11) and (2.13) give, in the equilibrium state,

\frac{\partial f(H(A), H(B))}{\partial H(A)} \frac{dH(A)}{dX(A)} = \frac{\partial f(H(A), H(B))}{\partial H(B)} \frac{dH(B)}{dX(B)}    (2.14)

Separation of variables can only be possible if the derivatives of the composite function satisfy

\frac{\partial f(H(A), H(B))}{\partial H(A)} = k(H(A), H(B))\, g(H(A))\, h(H(B))    (2.15)

\frac{\partial f(H(A), H(B))}{\partial H(B)} = k(H(A), H(B))\, h(H(A))\, g(H(B))    (2.16)

where k, g and h are some functions. Also, k is a symmetric function due to (2.2). In general, in most physical systems k is independent of the entropies of the subsystems, and the possibility of k being entropy dependent is limited. Henceforth, k will be taken to be unity, and equations (2.15) and (2.16) become

\frac{\partial f(H(A), H(B))}{\partial H(A)} = g(H(A))\, h(H(B))    (2.17)

\frac{\partial f(H(A), H(B))}{\partial H(B)} = h(H(A))\, g(H(B))    (2.18)

Differentiating (2.17) with respect to H(B) and (2.18) with respect to H(A), we get

\frac{\partial^2 f(H(A), H(B))}{\partial H(A)\, \partial H(B)} = g(H(A)) \frac{dh(H(B))}{dH(B)} = g(H(B)) \frac{dh(H(A))}{dH(A)}    (2.19)

From these equations it follows that

\frac{1}{g(H(A))} \frac{dh(H(A))}{dH(A)} = \frac{1}{g(H(B))} \frac{dh(H(B))}{dH(B)} = \nu    (2.20)

where \nu is a frequency and is known as the separation constant. Putting the values of g(H(A)) and g(H(B)) from (2.20) into (2.17) and (2.18), we get

\frac{\partial f(H(A), H(B))}{\partial H(A)} = \frac{1}{\nu} \frac{dh(H(A))}{dH(A)}\, h(H(B))    (2.21)

\frac{\partial f(H(A), H(B))}{\partial H(B)} = \frac{1}{\nu} \frac{dh(H(B))}{dH(B)}\, h(H(A))    (2.22)

Integrating (2.21) and (2.22), we get

f(H(A), H(B)) = \frac{1}{\nu}\, h(H(A))\, h(H(B)) + c    (2.23)

where c is the constant of integration.

In order to find h, let us suppose that only B is in the completely ordered state, i.e. H(B) = 0. In such a situation,

H(A, B) = f(H(A), H(B)) = H(A)    (2.24)

and by (2.23) we get

H(A) = \frac{1}{\nu}\, h(H(A))\, h(0) + c    (2.25)

Again, without loss of generality, let us suppose that both A and B are in the completely ordered state, i.e. H(A, B) = 0. Then

\frac{1}{\nu}\, [h(0)]^2 + c = 0    (2.26)

In order to get a convergent result in the limit \nu \to 0, we require

h(0) = 1, \qquad \lim_{H \to 0} h(H) = 1    (2.27)

so that c = -\frac{1}{\nu}, and (2.25) gives

h(H) = \nu H + 1    (2.28)

Putting (2.28) and c = -1/\nu in (2.23), we get

f(H(A), H(B)) = H(A) + H(B) + \nu\, H(A)\, H(B)    (2.29)

Eq. (2.29) is similar to eq. (2.10a), and it is the form of the composite function prescribed by the existence of equilibrium. This shows the validity of the composition law of the system for Kapur's entropy, which satisfies pseudoadditivity.
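The end result can also be verified symbolically. The short sketch below is an illustration added for this reconstruction, not part of the paper's derivation; it checks that f(x, y) = x + y + nu*x*y has the separable derivative structure assumed in (2.17) and (2.18) with h(H) = nu*H + 1 and, for this particular illustration, g taken as 1, that it satisfies the boundary condition (2.24), and that ordinary additivity is recovered as nu tends to 0.

import sympy as sp

H_A, H_B, nu = sp.symbols('H_A H_B nu', real=True)

# Candidate composite function, the "sum plus product" form of eqs. (2.10a)/(2.29).
f = H_A + H_B + nu * H_A * H_B

h = lambda s: nu * s + 1       # h(H) = nu*H + 1, eq. (2.28)
g = lambda s: sp.Integer(1)    # g taken as 1 for this illustration

# Separable derivative structure assumed in eqs. (2.17)-(2.18).
print(sp.simplify(sp.diff(f, H_A) - g(H_A) * h(H_B)) == 0)   # True
print(sp.simplify(sp.diff(f, H_B) - h(H_A) * g(H_B)) == 0)   # True

# The mixed second derivative is the constant separation parameter nu, cf. (2.19)-(2.20).
print(sp.diff(f, H_A, H_B))                                   # nu

# Boundary condition (2.24): if B is completely ordered, H(B) = 0 and f = H(A).
print(sp.simplify(f.subs(H_B, 0)))                            # H_A

# Ordinary additivity (2.3) is recovered in the limit nu -> 0.
print(sp.limit(f, nu, 0))                                     # H_A + H_B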

3 CONCLUSION

In the present work, we have derived the most general pseudoadditivity rule for composite entropy based only on the existence of equilibrium. We have shown how the pseudoadditivity of the three generalized entropies can be obtained as the simplest case. Here, composability of entropy has been taken as a basic premise, and this concept puts a stringent constraint on the possible forms of entropies.

However, it is worth noting that composability assumes divisibility of the total system into independent subsystems, and the reliability of this independence also puts stringent constraints on the physical nature of the systems. For example, if the total system contains long-range interactions between its microscopic components, the independence of the subsystems may hardly be reliable in general. Without composability it does not seem possible to define the equilibrium states in the standard manner. In [10], the entropy arising from the idea of quantum groups is not composable. Thus, if an entropy does not satisfy composability, a thorough generalization of the standard formalism may be required.

REFERENCES

[1] A. Renyi, "On measures of entropy and information," Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, pp. 547-561, 1961.
[2] B. Lesche, Journal of Statistical Physics, Vol. 27, p. 419, 1982.
[3] C. E. Shannon, "A Mathematical Theory of Communication," Bell System Technical Journal, 27, pp. 379-423, 1948.
[4] C. Tsallis, "Nonextensive Statistical Mechanics and Thermodynamics: Historical Background and Present Status," in: S. Abe, Y. Okamoto (eds.), Nonextensive Statistical Mechanics and Its Applications, Springer-Verlag, Heidelberg, 2001.
[5] C. Tsallis, "Possible generalization of Boltzmann-Gibbs statistics," Journal of Statistical Physics, 52, pp. 479-487, 1988.
[6] J. N. Kapur, H. K. Kesavan, Entropy Optimization Principles with Applications, Academic Press, 1994.
[7] J. N. Kapur, "Generalized entropy of order α and type β," The Mathematical Seminar, 4, pp. 78-84, 1967.
[8] P. T. Landsberg, V. Vedral, Physics Letters A, 247, p. 211, 1998.
[9] S. Abe, "General pseudoadditivity of composable entropy prescribed by the existence of equilibrium," Physical Review E, 63, 061105, 2001.
[10] S. Abe, Physics Letters A, 224, p. 326, 1997.