Editors: Andrés M. Kowalski, Raúl D. Rossignoli, Evaldo M. F. Curado

Concepts and Recent Advances in Generalized Information Measures and Statistics

ISBN: 978-1-60805-761-0 (Print)
ISBN: 978-1-60805-760-3 (Online)
Year of Publication: 2013
DOI: 10.2174/97816080576031130101

Introduction: Summary of Contents

The goal of this book is to offer an updated overview of generalized information measures and statistics, including the basic concepts as well as some recent relevant applications.

The book begins with a historical introduction describing the fascinating development of the concepts of heat and entropy. Starting from the ideas of ancient Greece, an account of the main historical breakthroughs is provided, which allows the reader to appreciate the fundamental contributions of Nicolas Sadi Carnot, Rudolf Clausius, Ludwig Boltzmann, Josiah Willard Gibbs and others. It ends with the seminal works of Claude Shannon, which led to the foundation of Information Theory, and of Edwin Jaynes, which provided the connection of the latter with Statistical Mechanics.

The second chapter is a basic tutorial on the essentials of information entropy, describing at an accessible level the concepts and quantities used in the rest of this book. The Shannon entropy and its associated measures, such as the conditional entropy, the mutual information (a measure of correlations) and the relative entropy (also known as the Kullback-Leibler divergence, a measure of the discrepancy between two probability distributions), are all presented, together with their main properties and the most important proofs. We also provide the main features of the Fisher Information, which can be expressed in terms of the relative entropy between two slightly displaced distributions, and the associated Cramér-Rao bound. Other topics include the definition of entropy in quantum systems, the fundamental property of concavity and a brief introduction to the maximum entropy approach and its connection with statistical mechanics. The chapter ends with the Shannon-Khinchin axioms leading to the uniqueness theorem for the Shannon entropy, together with an introduction to the concept of generalized entropies.
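
For quick reference, the central quantities just mentioned can be summarized, in the discrete case and in standard notation (the chapter itself fixes the precise conventions), as

    H(X) = -\sum_i p_i \log p_i                                         (Shannon entropy)
    H(X|Y) = H(X,Y) - H(Y)                                              (conditional entropy)
    I(X;Y) = H(X) + H(Y) - H(X,Y)                                       (mutual information)
    D(p\|q) = \sum_i p_i \log (p_i/q_i)                                 (relative entropy / Kullback-Leibler divergence)
    I_F(\theta) = \int dx\, p(x|\theta)\,[\partial_\theta \ln p(x|\theta)]^2,
    \mathrm{Var}(\hat\theta) \ge 1/I_F(\theta)                          (Fisher information and Cramér-Rao bound)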

Chapters 3, 4 and 5 are devoted precisely to the generalized entropy concept. Chapter 3 presents a review by Constantino Tsallis of the famous non-additive entropy Sq which bears his name, together with the associated generalized statistical mechanics and q-distributions. As described there, this generalized framework allows for the possibility of an extensive thermodynamic entropy in strongly correlated systems where the standard additive Boltzmann-Gibbs entropy is non-extensive. The chapter includes a comprehensive list of relevant recent applications of the formalism in the most diverse fields, together with the concomitant references. It also comments on recent results related to the connection of Sq with number theory through the Riemann zeta function. Chapter 4 presents an axiomatic approach for deriving the form of a generalized entropy. After describing in full detail the four Shannon-Khinchin axioms, it considers the situation where only the first three are preserved, together with the requirement of two newly discovered scaling laws which the generalized entropy should fulfill. It is shown that this leads to a general form of entropy depending essentially on two parameters, which define entropic equivalence classes. The connection with the Shannon, Tsallis, Rényi and other entropies is described in detail, together with the associated distribution functions and some related aspects. An appendix contains the technical details and the proofs of four associated theorems.
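
As a brief reminder of the forms involved (standard expressions, quoted here only for orientation; chapters 3 and 4 provide the precise definitions and derivations), the Tsallis entropy, its q → 1 limit and its non-additivity rule for independent subsystems A and B read

    S_q = k \frac{1 - \sum_i p_i^q}{q - 1},  \qquad  S_{q \to 1} = -k \sum_i p_i \ln p_i ,
    S_q(A+B) = S_q(A) + S_q(B) + \frac{1-q}{k}\, S_q(A)\, S_q(B),

while the Rényi entropy, which also appears within the classification of chapter 4, is R_q = (1-q)^{-1} \ln \sum_i p_i^q.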

Chapter 5 discusses the relation between generalized entropies and the concept of majorization. The latter is a powerful and elegant mathematical theory for comparing probability distributions, which leads to a rigorous concept of mixedness and disorder. This chapter first describes the concept of majorization at an accessible level. It then considers its connection with entropy, and shows that by means of generalized entropies it is possible to express the majorization relation in terms of entropic inequalities. It also describes the majorization properties of the probability distributions determined by the maximization of general entropic forms, and the concept of mixing parameters, i.e., parameters whose increase ensures majorization. Finally, the concept of majorization in the quantum case, i.e., for density operators, is also examined. As an application, the problem of quantum entanglement detection is considered, where it is shown that majorization leads to a generalized entropic criterion for separability which is much stronger than the standard entropic criterion. In chapter 6, the notion of distance measures for probability distributions is reviewed. An overview of the most frequently used metrics and distances, such as the Euclidean metric, Wootters' distance, the Fisher metric and the Kullback-Leibler divergence, is given, centering the analysis on the distance known as the Jensen-Shannon divergence, in both its classical and quantum versions. The application of the latter as a measure of quantum entanglement is also discussed. This chapter is related to the next two chapters, which are devoted to Statistical Measures of Complexity, because of the dependence of these measures on distances in probability space.
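
In standard notation (a condensed reminder; chapters 5 and 6 give the full definitions), a distribution p is majorized by q, written p ≺ q, when the partial sums of their decreasingly ordered components satisfy

    \sum_{i=1}^k p_i^{\downarrow} \le \sum_{i=1}^k q_i^{\downarrow}   for all k, with equality for k = N,

while the Jensen-Shannon divergence between two distributions is the symmetric, bounded quantity

    JS(p,q) = H\!\left(\frac{p+q}{2}\right) - \frac{H(p) + H(q)}{2}.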

There is no universally accepted definition of complexity, nor of quantifiers of complexity. An extensive list of relevant contributions can be found in the introduction of chapter 8, as well as in the references of chapter 7. A comparative classification of various complexity measures, by Wackerbauer, Witt, Atmanspacher, Kurths and Scheingraber, can be found in reference [10] of chapter 8. We will here consider just a particular class of complexity measures based on information theory, which are essentially a combination of an entropy with a distance measure in probability space. Such measures vanish when the probability distribution implies either full certainty or complete uncertainty, being maximum at some “intermediate” distribution. This approach is precisely adopted in chapter 7, where Ricardo López-Ruiz, Hector Mancini and Xabier Calbet introduce the well-known measure of complexity known by their surnames (the LMC Statistical Measure of Complexity). Its properties are discussed in full detail and some interesting applications (Gaussian and exponential distributions, and complexity in a two-level laser model) are also provided. In chapter 8 the properties of a Generalized Statistical Complexity Measure are discussed. The authors adopt the functional product form of the LMC Statistical Measure of Complexity, but consider different entropic forms and different definitions of distance between probability distributions, beyond the Shannon Entropy and Euclidean distance used in chapter 7. In particular, the use of the Jensen divergence introduced in chapter 6, together with the Shannon Entropy (the Shannon Jensen Statistical Complexity), is analyzed in depth. Another important aspect considered in this chapter is the methodology for the proper determination of the underlying probability distribution function (PDF) associated with a given dynamical system or time series. We should also mention here the Statistical Complexity of Shiner, Davison and Landsberg (ref. [13] of chapter 8).
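
To illustrate this product structure concretely, the short Python sketch below evaluates an LMC-type complexity as the product of the normalized Shannon entropy and the Euclidean disequilibrium (the squared distance to the uniform distribution). It is only a minimal numerical illustration of the functional form discussed in chapters 7 and 8, not the code or the exact normalization used by the chapters' authors.

    import numpy as np

    def lmc_complexity(p):
        # Illustrative LMC-type complexity C = H_norm * D for a discrete distribution p.
        p = np.asarray(p, dtype=float)
        p = p / p.sum()                                        # ensure normalization
        n = len(p)
        nz = p > 0
        h_norm = -np.sum(p[nz] * np.log(p[nz])) / np.log(n)   # normalized Shannon entropy
        d = np.sum((p - 1.0 / n) ** 2)                         # disequilibrium: distance to uniformity
        return h_norm * d

    # The measure vanishes for full certainty and for complete uncertainty,
    # and is positive for "intermediate" distributions:
    print(lmc_complexity([1.0, 0.0, 0.0, 0.0]))       # 0 (full certainty)
    print(lmc_complexity([0.25, 0.25, 0.25, 0.25]))   # 0 (uniform distribution)
    print(lmc_complexity([0.7, 0.2, 0.05, 0.05]))     # > 0 (intermediate case)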

In chapters 9-15, different applications of generalized information measures are considered. Chapter 9 deals with the Fisher Information, whose basic properties were introduced in chapter 2. The chapter describes its use for the radial probability distributions associated with quantum states, determining the related Cramér-Rao inequalities and the explicit expressions of the Fisher information for both ground and excited states of D-dimensional hydrogenic systems. It then considers its application to some physico-chemical processes, showing that the Fisher information can be a valuable tool for detecting the transition rate and the stationary points of a chemical reaction.
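
In the translation-invariant form commonly employed for such densities (a standard expression, quoted here only for orientation; the conventions of chapter 9 may differ by constant factors), the Fisher information of a probability density ρ reads

    I[\rho] = \int d\mathbf{r}\, \frac{|\nabla \rho(\mathbf{r})|^2}{\rho(\mathbf{r})},

a gradient functional which is large for strongly localized densities and small for spread-out ones.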

Chapters 10 and 11 deal with problems in physics, while chapters 12-15 deal with applications in other fields. In particular, chapters 14 and 15 are devoted to biological applications. In chapter 10, the links between the entanglement concept (see also chapter 5) and the information entropy are analyzed, as represented by different measures such as the Shannon, Rényi (see chapters 2 and 4) and Tsallis (see chapter 3) ones. In chapter 11, the authors review the difference between quantum statistical treatments and semiclassical ones, using a semiclassical Fisher Information measure built up with Husimi distributions. Chapter 12 deals with the use of information theory tools for characterizing pseudo random number generators obtained from chaotic dynamical systems. The authors make use of the conjunction of the Entropy and the Shannon Jensen Statistical Complexity introduced in chapter 8 to evaluate the quality of pseudo random number generators. This is done by quantifying the equiprobability of all their values and the statistical independence between consecutive outputs, by comparing a Shannon Entropy calculated with a Histogram PDF and a Shannon Jensen Statistical Complexity calculated with a Symbolic Bandt–Pompe PDF (see chapter 8) in an Entropy–Statistical Complexity plane. In chapter 13, the authors employ different information measures, such as the Shannon Entropy, the Fisher Information measure (see chapter 2) and the Shannon Jensen Statistical Complexity (introduced in chapter 8), to analyze sedimentary data corresponding to the Holocene and thus characterize changes in the dynamical behavior of ENSO (El Niño/Southern Oscillation) during this period.
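
To make the symbolic Bandt–Pompe construction mentioned above concrete, the following Python sketch extracts the ordinal-pattern PDF of a time series and evaluates its normalized permutation entropy. It is a minimal illustration of the symbolization step (embedding dimension and delay are free parameters), not the implementation used by the chapter's authors.

    import numpy as np
    from itertools import permutations

    def bandt_pompe_pdf(series, dim=4, delay=1):
        # Map each window of `dim` values (spaced by `delay`) to the permutation
        # that sorts it, and return the relative frequencies of the dim! patterns.
        series = np.asarray(series, dtype=float)
        counts = {perm: 0 for perm in permutations(range(dim))}
        n_windows = len(series) - (dim - 1) * delay
        for i in range(n_windows):
            window = series[i : i + dim * delay : delay]
            counts[tuple(np.argsort(window))] += 1     # ordinal pattern of the window
        pdf = np.array(list(counts.values()), dtype=float)
        return pdf / pdf.sum()

    def normalized_entropy(pdf):
        # Shannon entropy of the pattern PDF, normalized so that 1 means equiprobability.
        nz = pdf > 0
        return -np.sum(pdf[nz] * np.log(pdf[nz])) / np.log(len(pdf))

    # For a good pseudo random number generator the ordinal patterns should be
    # nearly equiprobable, so the normalized entropy approaches 1:
    rng = np.random.default_rng(0)
    print(normalized_entropy(bandt_pompe_pdf(rng.random(100000), dim=4)))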

In chapter 14, the authors present an application of wavelet-based information measures to characterize the viscoelasticity of red blood cell membranes. The Relative Energy, the Shannon Entropy and the Shannon Jensen Statistical Complexity calculated with a Wavelet PDF, a technique introduced in chapter 8, together with an Entropy–Complexity plane, are used to analyze a human haematological disease.
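
In that wavelet-based construction (a brief reminder of the standard scheme; chapter 8 fixes the details and conventions), the probability assigned to each resolution level j is its relative wavelet energy,

    p_j = \frac{E_j}{\sum_k E_k},  \qquad  E_j = \sum_t |C_j(t)|^2,

where C_j(t) are the wavelet coefficients at level j, and the entropic and complexity quantifiers are then evaluated on the distribution {p_j}.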

Finally, in chapter 15 the authors apply an information-theoretic approach to analyze the role of spike correlations in the neuronal code. By considering certain brain structures as communication channels, the application of Information Theory becomes feasible, allowing in particular the investigation of correlations through the pertinent mutual information. It is a nice example of the important role played by information-theoretical methods in current problems of Theoretical Neuroscience. The chapter also includes a comprehensive list of references on the subject.

Preface

Since the introduction of the concept of information entropy by Claude Shannon in his famous 1948 article [1], quantifiers based on information theory have played an increasingly fundamental role in several fields. Different generalizations of the Shannon entropy have been developed, among them the Rényi and Tsallis entropies, which have found important applications not only in physics but also in quite distinct areas such as biology, economics, cognitive sciences, etc. In addition, other information measures such as the Fisher information, which predates the Shannon entropy, and the more recent statistical complexities, have also proved to be useful and powerful tools in different scenarios, allowing in particular the analysis of time series and data series independently of their sources.

It is our goal to present in this E-book, at a broadly accessible level, the basic concepts and some of the latest developments in the field of generalized information measures, understood as all those quantities which allow one to obtain and quantify the information contained in a probability distribution. Addressed not only to physicists, but also to researchers in other fields such as biology, medicine, economics, etc., it offers through its chapters an overview of the main measures and techniques, together with some recent relevant applications which illustrate their potential. Its scope ranges from generalized entropies and the majorization-based concept of disorder to complexity measures and metrics in probability space. It includes methods for extracting probability distributions from general data series and applications ranging from quantum entanglement to biology and brain modeling. A comprehensive list of references is also provided.

Reference

[1] Claude E. Shannon, “A Mathematical Theory of Communication”, Bell System Technical Journal 27, 379–423 and 623–656 (1948).

Andrés M. Kowalski (a), Raúl D. Rossignoli (a), Evaldo M. F. Curado (b)

(a) Departamento de Física–IFLP, Universidad Nacional de La Plata and Comisión de Investigaciones Científicas, La Plata, Argentina

(b) Centro Brasileiro de Pesquisas Físicas and National Institute of Science and Technology for Complex Systems, Rio de Janeiro, Brasil
