Towards Abstract Relational Learning in Human Robot Interaction
2020 (English) In: CognitIve RobotiCs for intEraction (CIRCE), 2020. Conference paper, Published paper (Refereed)
Abstract [en]
Humans have a rich representation of the entities in their environment. Entities are described by their attributes, and entities that share attributes are often semantically related. For example, if two books have "Natural Language Processing" as the value of their `title' attribute, we can expect that their `topic' attribute will also be equal, namely, "NLP". Humans tend to generalize such observations and infer sufficient conditions under which the `topic' attribute of any entity is "NLP". If robots are to interact successfully with humans, they need to represent entities, attributes, and generalizations in a similar way. This results in a contextualized cognitive agent that can adapt its understanding, where context provides sufficient conditions for a correct understanding. In this work, we address the problem of how to obtain these representations through human-robot interaction. We integrate visual perception and natural language input to incrementally build a semantic model of the world, and then use inductive reasoning to infer logical rules that capture generic semantic relations that hold in this model. These relations can be used to enrich the human-robot interaction, to populate a knowledge base with inferred facts, or to resolve uncertainty in the robot's sensory inputs.
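To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of an Entity-Attribute-Value world model together with a naive rule-induction step over it, illustrating the kind of generalization described in the abstract; the store layout, the `induce_rules` function, and the example entities are all hypothetical.

```python
# Minimal sketch: an Entity-Attribute-Value (EAV) world model and a naive
# rule-induction pass over it. All names and data here are illustrative
# assumptions, not the system described in the paper.
from collections import defaultdict

# EAV store: entity -> {attribute: value}
world = {
    "book1": {"title": "Natural Language Processing", "topic": "NLP"},
    "book2": {"title": "Natural Language Processing", "topic": "NLP"},
    "book3": {"title": "Robot Kinematics", "topic": "Robotics"},
}

def induce_rules(model):
    """Propose rules of the form (attr_a = v_a) -> (attr_b = v_b)
    that hold for every entity in the model satisfying attr_a = v_a."""
    rules = []
    # Collect candidate conditions (attribute, value) and their supporting entities.
    support = defaultdict(list)
    for entity, attrs in model.items():
        for attr, val in attrs.items():
            support[(attr, val)].append(entity)
    # For each condition, check whether some other attribute takes a single
    # value across all supporting entities; if so, emit a candidate rule.
    for (attr_a, val_a), entities in support.items():
        consequents = defaultdict(set)
        for e in entities:
            for attr_b, val_b in model[e].items():
                if attr_b != attr_a:
                    consequents[attr_b].add(val_b)
        for attr_b, vals in consequents.items():
            if len(vals) == 1:
                rules.append(((attr_a, val_a), (attr_b, next(iter(vals)))))
    return rules

for antecedent, consequent in induce_rules(world):
    print(f"IF {antecedent[0]} = {antecedent[1]!r} THEN {consequent[0]} = {consequent[1]!r}")
```

On the toy data above this would, among other trivial rules, propose IF title = 'Natural Language Processing' THEN topic = 'NLP', which is the example generalization from the abstract; a real system would additionally weigh support counts and handle noisy or missing attribute values.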
Place, publisher, year, edition, pages
2020.
Keywords [en]
Relational learning, Human Robot Interaction, Natural Language Processing, Inductive logic, Entity Attribute Value model, Grounding, Scene Understanding
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:oru:diva-89142
OAI: oai:DiVA.org:oru-89142
DiVA, id: diva2:1524129
Conference
29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN 2020), Virtual Conference, 31 Aug. - 4 Sept., 2020.
Available from: 2021-01-31 Created: 2021-01-31 Last updated: 2021-04-20 Bibliographically approved