kLog: A language for logical and relational learning with kernels
Dipartimento di Ingegneria dell'Informazione, Università degli Studi di Firenze, Firenze, Italy; Departement Computerwetenschappen, KU Leuven, Heverlee, Belgium.
Institut für Informatik, Albert-Ludwigs-Universität, Freiburg, Germany.
Institut für Informatik, Albert-Ludwigs-Universität, Freiburg, Germany; Departement Computerwetenschappen, KU Leuven, Heverlee, Belgium. ORCID iD: 0000-0002-6860-6303
Departement Computerwetenschappen, KU Leuven, Heverlee, Belgium.
2014 (English). In: Artificial Intelligence, ISSN 0004-3702, E-ISSN 1872-7921, Vol. 217, p. 117-143. Article in journal (Refereed). Published.
Abstract [en]

We introduce kLog, a novel approach to statistical relational learning. Unlike standard approaches, kLog does not represent a probability distribution directly. It is rather a language to perform kernel-based learning on expressive logical and relational representations. kLog allows users to specify learning problems declaratively. It builds on simple but powerful concepts: learning from interpretations, entity/relationship data modeling, logic programming, and deductive databases. Access by the kernel to the rich representation is mediated by a technique we call graphicalization: the relational representation is first transformed into a graph (in particular, a grounded entity/relationship diagram). Subsequently, a choice of graph kernel defines the feature space. kLog supports mixed numerical and symbolic data, as well as background knowledge in the form of Prolog or Datalog programs as in inductive logic programming systems. The kLog framework can be applied to tackle the same range of tasks that has made statistical relational learning so popular, including classification, regression, multitask learning, and collective classification. We also report on empirical comparisons, showing that kLog can be either more accurate, or much faster at the same level of accuracy, than Tilde and Alchemy. kLog is GPLv3 licensed and is available at http://klog.dinfo.unifi.it along with tutorials.
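
The graphicalization step described in the abstract can be pictured with a small, self-contained example. The Python below is a minimal sketch, not kLog itself (kLog is embedded in Prolog, and its signatures, graph construction, and graph kernels differ): the interpretations, the grounded entity/relationship graph, and the toy neighbourhood-label kernel are all illustrative assumptions, meant only to show how a relational interpretation can be turned into a graph whose kernel then defines a feature space.

    from collections import Counter

    # Toy interpretations in the spirit of learning from interpretations:
    # entity facts plus relationship facts drawn from an E/R model.
    # (Signature names and values are illustrative, not kLog syntax.)
    def small_molecule_1():
        entities = {"a1": ("atom", "c"), "a2": ("atom", "c"), "a3": ("atom", "o")}
        relationships = [("bond", "single", ["a1", "a2"]),
                         ("bond", "double", ["a2", "a3"])]
        return entities, relationships

    def small_molecule_2():
        entities = {"b1": ("atom", "c"), "b2": ("atom", "n")}
        relationships = [("bond", "single", ["b1", "b2"])]
        return entities, relationships

    def graphicalize(entities, relationships):
        """Build a grounded E/R graph: one labelled node per entity, one
        labelled node per relationship tuple, and an edge from each
        relationship node to every entity it mentions."""
        labels = dict(entities)          # node id -> label
        edges = set()
        for i, (name, value, members) in enumerate(relationships):
            rel_node = "r%d" % i
            labels[rel_node] = (name, value)
            for ent in members:
                edges.add((rel_node, ent))
        return labels, edges

    def neighbourhood_features(labels, edges):
        """A crude stand-in for a graph kernel's feature map: count each node
        label together with the sorted labels of its neighbours (roughly one
        round of Weisfeiler-Lehman-style relabelling)."""
        adjacency = {node: [] for node in labels}
        for u, v in edges:
            adjacency[u].append(v)
            adjacency[v].append(u)
        features = Counter()
        for node, label in labels.items():
            context = tuple(sorted(labels[m] for m in adjacency[node]))
            features[(label, context)] += 1
        return features

    def kernel(interpretation_x, interpretation_y):
        """Kernel value = dot product of the two feature multisets."""
        fx = neighbourhood_features(*graphicalize(*interpretation_x))
        fy = neighbourhood_features(*graphicalize(*interpretation_y))
        return sum(fx[key] * fy[key] for key in fx.keys() & fy.keys())

    if __name__ == "__main__":
        print("k(x1, x1) =", kernel(small_molecule_1(), small_molecule_1()))
        print("k(x1, x2) =", kernel(small_molecule_1(), small_molecule_2()))

Running the sketch prints a larger kernel value for an interpretation compared with itself than with the second, smaller interpretation, which is all a feature-space view of graphicalized interpretations needs to provide; in kLog the declarative problem specification, the graphicalization, and the choice of graph kernel are expressed at the language level rather than in host-language code as above.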

Place, publisher, year, edition, pages
Amsterdam: Elsevier, 2014. Vol. 217, p. 117-143
Keywords [en]
Logical and relational learning, Statistical relational learning, Kernel methods, Prolog, Deductive databases
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:oru:diva-86355
DOI: 10.1016/j.artint.2014.08.003
ISI: 000345180400006
Scopus ID: 2-s2.0-84908457312
OAI: oai:DiVA.org:oru-86355
DiVA, id: diva2:1474654
Note

Funding Agencies:
KU Leuven SF/09/014 GOA/08/008
Ministry of Education, Universities and Research (MIUR)
Research Projects of National Relevance (PRIN) 2009LNP494
European Research Council (ERC) 240186

Available from: 2020-10-09. Created: 2020-10-09. Last updated: 2020-12-02. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

De Raedt, Luc
