CSH Workshop: “The Rényi Entropy in Machine Learning: whither now?”


Jul 24 – Jul 25, 2023

Abstract:

Despite being defined more than 60 years ago, the Rényi family of entropies has been slow to reach the fields of machine learning and artificial intelligence.
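For reference, the Rényi entropy of order \(\alpha\) of a discrete distribution \(p = (p_1, \dots, p_n)\) is

\[
H_\alpha(p) = \frac{1}{1-\alpha} \log \sum_{i=1}^{n} p_i^{\alpha},
\qquad \alpha > 0,\ \alpha \neq 1,
\]

which recovers the Shannon entropy \(-\sum_i p_i \log p_i\) in the limit \(\alpha \to 1\).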

Part of the problem, of course, has been the definition of a proper conditional entropy, for which a number of candidates have appeared over the years, e.g., the version proposed by Jizba and Arimitsu on information-theoretic axiomatic grounds.
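The Jizba–Arimitsu proposal, for instance, can be characterized as the choice that preserves the Shannon-like chain rule,

\[
H_\alpha(X \mid Y) = H_\alpha(X, Y) - H_\alpha(Y),
\]

an additive decomposition of the joint entropy that most other candidate definitions give up.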

Moreover, the controversy raised not long ago by the proposition of the Rényi entropy as a valuable framework for inference, axiomatized in the setting of statistical inference initially formulated by Shore and Johnson, seems to have been resolved favorably.

It is not surprising, then, that the pioneering work of Principe and colleagues, who used the Rényi entropy of order two as a cost function in several signal-processing applications, has been paralleled by the physics and statistics communities, first in completing and clarifying the entropy as a descriptive framework, and then in better understanding its properties in terms of those of the better-known Shannon entropy.
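As a concrete taste of that line of work, the order-two entropy \(H_2(X) = -\log \int p(x)^2\,dx\) admits a simple plug-in estimate from samples via a Gaussian Parzen window, whose pairwise kernel sum is the "information potential" of information-theoretic learning. The following is a minimal sketch, not the implementation used in any of the works above; the bandwidth sigma is a free parameter chosen here only for illustration.

```python
import numpy as np

def renyi2_entropy(samples, sigma=1.0):
    """Plug-in estimate of the order-2 Renyi entropy H_2 = -log E[p(X)]
    from a Gaussian Parzen-window density estimate; the mean of the
    pairwise kernel matrix is the 'information potential'."""
    x = np.asarray(samples, dtype=float)
    if x.ndim == 1:
        x = x[:, None]                      # n scalar samples -> (n, 1)
    n, d = x.shape
    # Squared Euclidean distances between all sample pairs.
    sq = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    # Convolving two width-sigma Gaussians yields one of width sigma*sqrt(2).
    var = 2.0 * sigma ** 2
    kernel = np.exp(-sq / (2.0 * var)) / (2.0 * np.pi * var) ** (d / 2)
    return -np.log(kernel.mean())           # -log(information potential)
```

For standard-normal samples, the estimate approaches the closed-form value \(H_2 = \log(2\sqrt{\pi}) \approx 1.27\) in the large-sample, small-bandwidth limit.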

Further, applications in machine learning include autoencoders and cost functions for inference, as well as explanations of the special place the Rényi entropy holds in artificial intelligence. Indeed, there are proposals for using the Rényi entropy to model the predictive capabilities of the brain. The workshop's main aim is to explore how the overall picture of machine learning (inference, cost functions, models, etc.) would change further in a Rényi information-theoretic learning setting.

Agenda:

Monday 24th of July

9:30 – 10:00 Welcome and Introduction
10:00 – 10:40 Petr Jizba, "Unde venis Rényi entropy"
10:40 – 11:00 Discussion and Coffee
11:00 – 11:40 Francisco J. Valverde-Albacete, "The Rényi Entropies Operate in Positive Semifields"
11:40 – 12:00 Discussion
12:00 – 13:30 Lunch
13:30 – 14:10 Carmen Peláez-Moreno, "Opening the black box of machine learning with entropy triangles"
14:10 – 14:30 Discussion and Coffee
14:30 – 15:10 Andrea Somazzi, "Learn your entropy from informative data: an axiom ensuring the consistent identification of generalized entropies"
15:10 – 15:30 Discussion

Tuesday 25th of July

10:00 – 10:40 Shujian Yu, "The Matrix-based Rényi's Entropy with its Deep Learning Applications"
10:40 – 11:00 Discussion and Coffee
11:00 – 11:40 Jan Korbel, "Thermodynamics of exponential Kolmogorov-Nagumo averages"
11:40 – 12:00 Discussion
12:00 – 13:30 Lunch
13:30 – 14:10 Zlata Tabachová, "Causal inference in time series in terms of Rényi transfer entropy"
14:10 – 14:30 Discussion and Coffee
14:30 – 15:10 Rudolf Hanel, "Equivalence of information production and generalized entropies in complex processes"
15:10 – 16:00 Discussion and Round Table

Organizers:

Jan Korbel (CSH)
Stefan Thurner (CSH)
Carmen Peláez-Moreno (Universidad Carlos III de Madrid)
Francisco J. Valverde-Albacete (Universidad Rey Juan Carlos)

Venue:

Complexity Science Hub Vienna
Josefstaedter Straße 39
1080 Vienna, Austria