Reply to "Comment on 'Rényi entropy yields artificial biases not in the data and incorrect updating due to the finite-size data'"

Thomas Oikonomou, G. Baris Bagci

Research output: Contribution to journal › Article

Abstract

We reply to the preceding Comment by Jizba and Korbel [Jizba and Korbel, Phys. Rev. E 100, 026101 (2019), DOI: 10.1103/PhysRevE.100.026101] by first pointing out that the Schur concavity proposed by them falls short of identifying the correct intervals of normalization for the optimum probability distribution, even though normalization is a necessary ingredient in the entropy maximization procedure. Second, their treatment of the subset independence axiom requires a modification of the Lagrange multipliers one begins with, thereby rendering the optimization less trustworthy. We also explicitly demonstrate that the Rényi entropy violates the subset independence axiom and compare it with the Shannon entropy. Third, the composition rule offered by Jizba and Korbel is shown to yield probability distributions even without a need for the entropy maximization procedure, at the expense of creating artificial bias in the data.
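The abstract contrasts the Rényi entropy with the Shannon entropy. As background only (this sketch is not taken from the Reply; the function names and the sample distribution are illustrative choices), the two functionals can be written out as follows, with the Rényi entropy reducing to the Shannon entropy in the limit q → 1:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy S = -sum_i p_i ln p_i (natural log)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # 0 ln 0 is taken as 0
    return float(-np.sum(p * np.log(p)))

def renyi_entropy(p, q):
    """Rényi entropy S_q = ln(sum_i p_i^q) / (1 - q) for q != 1."""
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):
        return shannon_entropy(p)  # q -> 1 limit
    return float(np.log(np.sum(p ** q)) / (1.0 - q))

p = [0.5, 0.25, 0.25]
print(shannon_entropy(p))      # ~= 1.0397
print(renyi_entropy(p, 2.0))   # ~= 0.9808
```

For a uniform distribution both functionals give ln(n) for every q; they differ, as above, only on non-uniform distributions, which is where the disagreement over biases discussed in the Reply arises.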

Original language: English
Article number: 026102
Journal: Physical Review E
Volume: 100
Issue number: 2
DOI: 10.1103/PhysRevE.100.026102
Publication status: Published - Aug 20 2019


ASJC Scopus subject areas

  • Statistical and Nonlinear Physics
  • Statistics and Probability
  • Condensed Matter Physics

Cite this

@article{d30a56e429814ce9b330db00de975ded,
title = "Reply to {"}Comment on 'R{\'e}nyi entropy yields artificial biases not in the data and incorrect updating due to the finite-size data'{"}",
abstract = "We reply to the preceding Comment by Jizba and Korbel [Jizba and Korbel, Phys. Rev. E 100, 026101 (2019), DOI: 10.1103/PhysRevE.100.026101] by first pointing out that the Schur concavity proposed by them falls short of identifying the correct intervals of normalization for the optimum probability distribution, even though normalization is a necessary ingredient in the entropy maximization procedure. Second, their treatment of the subset independence axiom requires a modification of the Lagrange multipliers one begins with, thereby rendering the optimization less trustworthy. We also explicitly demonstrate that the R{\'e}nyi entropy violates the subset independence axiom and compare it with the Shannon entropy. Third, the composition rule offered by Jizba and Korbel is shown to yield probability distributions even without a need for the entropy maximization procedure, at the expense of creating artificial bias in the data.",
author = "Thomas Oikonomou and Bagci, {G. Baris}",
year = "2019",
month = "8",
day = "20",
doi = "10.1103/PhysRevE.100.026102",
language = "English",
volume = "100",
journal = "Physical review. E",
issn = "2470-0045",
publisher = "American Physical Society",
number = "2",
}

TY - JOUR

T1 - Reply to "Comment on 'Rényi entropy yields artificial biases not in the data and incorrect updating due to the finite-size data'"

AU - Oikonomou, Thomas

AU - Bagci, G. Baris

PY - 2019/8/20

Y1 - 2019/8/20

N2 - We reply to the preceding Comment by Jizba and Korbel [Jizba and Korbel, Phys. Rev. E 100, 026101 (2019), DOI: 10.1103/PhysRevE.100.026101] by first pointing out that the Schur concavity proposed by them falls short of identifying the correct intervals of normalization for the optimum probability distribution, even though normalization is a necessary ingredient in the entropy maximization procedure. Second, their treatment of the subset independence axiom requires a modification of the Lagrange multipliers one begins with, thereby rendering the optimization less trustworthy. We also explicitly demonstrate that the Rényi entropy violates the subset independence axiom and compare it with the Shannon entropy. Third, the composition rule offered by Jizba and Korbel is shown to yield probability distributions even without a need for the entropy maximization procedure, at the expense of creating artificial bias in the data.

AB - We reply to the preceding Comment by Jizba and Korbel [Jizba and Korbel, Phys. Rev. E 100, 026101 (2019), DOI: 10.1103/PhysRevE.100.026101] by first pointing out that the Schur concavity proposed by them falls short of identifying the correct intervals of normalization for the optimum probability distribution, even though normalization is a necessary ingredient in the entropy maximization procedure. Second, their treatment of the subset independence axiom requires a modification of the Lagrange multipliers one begins with, thereby rendering the optimization less trustworthy. We also explicitly demonstrate that the Rényi entropy violates the subset independence axiom and compare it with the Shannon entropy. Third, the composition rule offered by Jizba and Korbel is shown to yield probability distributions even without a need for the entropy maximization procedure, at the expense of creating artificial bias in the data.

UR - http://www.scopus.com/inward/record.url?scp=85072132513&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85072132513&partnerID=8YFLogxK

U2 - 10.1103/PhysRevE.100.026102

DO - 10.1103/PhysRevE.100.026102

M3 - Article

VL - 100

JO - Physical review. E

JF - Physical review. E

SN - 2470-0045

IS - 2

M1 - 026102

ER -