dc.contributor.author | Charvin, Hippolyte | |
dc.contributor.author | Catenacci Volpi, Nicola | |
dc.contributor.author | Polani, Daniel | |
dc.date.accessioned | 2024-03-25T13:03:08Z | |
dc.date.available | 2024-03-25T13:03:08Z | |
dc.date.issued | 2023-11-29 | |
dc.identifier.citation | Charvin, H., Catenacci Volpi, N. & Polani, D. 2023, 'Towards Information Theory-Based Discovery of Equivariances', Paper presented at NeurIPS 2023 Workshop on Symmetry and Geometry in Neural Representations, New Orleans, United States, 16/12/23 - 16/12/23, pp. 1-23. | |
dc.identifier.citation | conference | |
dc.identifier.other | ORCID: /0000-0002-3233-5847/work/152250372 | |
dc.identifier.uri | http://hdl.handle.net/2299/27491 | |
dc.description | © 2023 H. Charvin, N. Catenacci Volpi & D. Polani. | |
dc.description.abstract | The presence of symmetries imposes a stringent set of constraints on a system. This constrained structure allows intelligent agents interacting with such a system to drastically improve the efficiency of learning and generalization, through the internalisation of the system’s symmetries into their information-processing. In parallel, principled models of complexity-constrained learning and behaviour make increasing use of information-theoretic methods. Here, we wish to marry these two perspectives and understand whether and in which form the information-theoretic lens can “see” the effect of symmetries of a system. For this purpose, we propose a novel variant of the Information Bottleneck principle, which has served as a productive basis for many principled studies of learning and information-constrained adaptive behaviour. We show (in the discrete case) that our approach formalises a certain duality between symmetry and information parsimony: namely, channel equivariances can be characterised by the optimal mutual information-preserving joint compression of the channel’s input and output. This information-theoretic treatment furthermore suggests a principled notion of “soft” equivariance, whose “coarseness” is measured by the amount of input-output mutual information preserved by the corresponding optimal compression. This new notion offers a bridge between the field of bounded rationality and the study of symmetries in neural representations. The framework may also allow (exact and soft) equivariances to be automatically discovered. | en |
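For reference alongside the abstract, a minimal sketch of the standard discrete Information Bottleneck objective on which the proposed variant builds (the notation is an assumption; the paper's specific variant is not reproduced here): given a joint distribution $p(x, y)$, one seeks a stochastic encoder $p(t \mid x)$ minimising

$$
\mathcal{L}\big[p(t \mid x)\big] \;=\; I(X; T) \;-\; \beta\, I(T; Y), \qquad \beta \geq 0,
$$

where $\beta$ trades compression of the input $X$ against preservation of information about the output $Y$. In the abstract's terms, the "coarseness" of a soft equivariance is measured by how much input-output mutual information the corresponding optimal compression preserves.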
dc.format.extent | 23 | |
dc.format.extent | 362821 | |
dc.language.iso | eng | |
dc.subject | Channel equivariances | |
dc.subject | Information Bottleneck | |
dc.subject | Symmetry Discovery | |
dc.title | Towards Information Theory-Based Discovery of Equivariances | en |
dc.contributor.institution | Department of Computer Science | |
dc.contributor.institution | School of Physics, Engineering & Computer Science | |
dc.contributor.institution | Centre for Future Societies Research | |
dc.contributor.institution | Adaptive Systems | |
dc.contributor.institution | Centre for AI and Robotics Research | |
dc.description.status | Peer reviewed | |
dc.identifier.url | https://openreview.net/forum?id=oD8DD5jQ5I | |
rioxxterms.type | Other | |
herts.preservation.rarelyaccessed | true | |