Approximate Compression of CNF Concepts
Department of Computer Science, KU Leuven, Leuven, Belgium; Leuven.AI, KU Leuven Institute for AI, Leuven, Belgium.
Department of Computer Science, KU Leuven, Leuven, Belgium; Leuven.AI, KU Leuven Institute for AI, Leuven, Belgium.
Department of Computer Science, KU Leuven, Leuven, Belgium; Leuven.AI, KU Leuven Institute for AI, Leuven, Belgium.
Örebro University, School of Science and Technology. Department of Computer Science, KU Leuven, Leuven, Belgium; Leuven.AI, KU Leuven Institute for AI, Leuven, Belgium. (Center for Applied Autonomous Sensor Systems (AASS)). ORCID iD: 0000-0002-6860-6303
2025 (English). In: Discovery Science: 27th International Conference, DS 2024, Pisa, Italy, October 14–16, 2024, Proceedings, Part II / [ed] Dino Pedreschi; Anna Monreale; Riccardo Guidotti; Roberto Pellungrini; Francesca Naretto, Springer, 2025, Vol. 15244, p. 149-164. Conference paper, Published paper (Refereed)
Abstract [en]

We consider a novel concept-learning and merging task, motivated by two use cases. The first is about merging and compressing music playlists, and the second is about federated learning under data privacy constraints. Both settings involve multiple learned concepts that must be merged and compressed into a single interpretable and accurate concept description. Our concept descriptions are logical formulae in CNF, for which merging, i.e. disjoining, multiple CNFs may lead to very large concept descriptions. To keep the concepts interpretable, we compress them relative to a dataset. We propose a new method named CoWC (Compression Of Weighted CNF) that approximates a CNF by exploiting techniques from itemset mining and inverse resolution. CoWC reduces the size of the CNF while also taking the F1-score w.r.t. the dataset into account. Our empirical evaluation shows that CoWC outperforms alternative compression approaches.
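The full text is not deposited in DiVA, but the two technical points in the abstract, that disjoining CNF concepts multiplies their clause counts and that compression quality is judged by F1-score against a dataset, can be made concrete with a small sketch. The Python snippet below is an illustration only, not the CoWC method: the representation of clauses as frozensets of signed literals and of examples as Boolean-valued dicts is an assumption chosen for this example.

# Illustration only (not CoWC): shows the quadratic clause blow-up when two CNF
# concepts are disjoined, and how a CNF concept can be scored with F1 against a
# labelled dataset, as described in the abstract. Representation is assumed:
# a clause is a frozenset of signed literals ("-x" means negated x), a CNF is a
# list of clauses, an example is a dict mapping variables to booleans.

from itertools import product

def disjoin_cnfs(cnf_a, cnf_b):
    # (A1 & A2 & ...) OR (B1 & B2 & ...) distributes into a CNF with
    # len(cnf_a) * len(cnf_b) clauses, one per pair (Ai, Bj) -- the size
    # blow-up that motivates compressing the merged concept.
    return [a | b for a, b in product(cnf_a, cnf_b)]

def satisfies(example, cnf):
    # An example is covered if every clause contains at least one true literal.
    def lit_true(lit):
        if lit.startswith("-"):
            return not example.get(lit[1:], False)
        return example.get(lit, False)
    return all(any(lit_true(l) for l in clause) for clause in cnf)

def f1_score(cnf, positives, negatives):
    # F1 of the concept's predictions w.r.t. positive/negative examples.
    tp = sum(satisfies(e, cnf) for e in positives)
    fp = sum(satisfies(e, cnf) for e in negatives)
    fn = len(positives) - tp
    return 2 * tp / (2 * tp + fp + fn) if tp else 0.0

if __name__ == "__main__":
    # Two toy "playlist" concepts, each already learned as a small CNF.
    cnf_a = [frozenset({"rock"}), frozenset({"-classical"})]
    cnf_b = [frozenset({"jazz"}), frozenset({"instrumental"}), frozenset({"-vocal"})]
    merged = disjoin_cnfs(cnf_a, cnf_b)
    print(len(merged), "clauses after merging")   # 2 * 3 = 6 clauses already

    pos = [{"rock": True}, {"jazz": True, "instrumental": True}]
    neg = [{"classical": True, "vocal": True}]
    print("F1 of merged concept:", f1_score(merged, pos, neg))

Running this on the toy concepts shows the merged CNF already has len(cnf_a) * len(cnf_b) clauses; a compression method such as CoWC would trade some of that size against the F1-score on the dataset.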

Place, publisher, year, edition, pages
Springer, 2025. Vol. 15244, p. 149-164
Series
Lecture Notes in Computer Science (LNCS), ISSN 0302-9743, E-ISSN 1611-3349
Keywords [en]
Concept learning, Formula compression
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:oru:diva-120608
DOI: 10.1007/978-3-031-78980-9_10
ISI: 001447234300010
Scopus ID: 2-s2.0-85219193083
ISBN: 9783031789793 (print)
ISBN: 9783031789809 (electronic)
OAI: oai:DiVA.org:oru-120608
DiVA, id: diva2:1952315
Conference
27th International Conference on Discovery Science, Pisa, Italy, October 14-16, 2024
Funder
Wallenberg AI, Autonomous Systems and Software Program (WASP); Knut and Alice Wallenberg Foundation
Note

D was supported by the EU H2020 ICT48 project “TAILOR” under contract #952215. This research received funding from the Flemish Government under the “Onderzoeksprogramma Artificiële Intelligentie (AI) Vlaanderen” programme. LDR is also supported by the Wallenberg AI, Autonomous Systems and Software Program (WASP) funded by the Knut and Alice Wallenberg Foundation.

Available from: 2025-04-15 Created: 2025-04-15 Last updated: 2025-04-15. Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

De Raedt, Luc
