Conceptual representations of meaning have long been a central focus of Artificial Intelligence (AI) in pursuit of its fundamental goal of machine understanding, with innumerable efforts made in Knowledge Representation, Speech and Natural Language Processing, and Computer Vision, inter alia. Even today, at the core of Natural Language Understanding lies the task of Semantic Parsing, the objective of which is to convert natural language sentences into machine-readable representations. In this paper, we aim to revive this historical dream of AI by putting forward a novel, all-embracing, fully semantic meaning representation that goes beyond the many existing formalisms. Specifically, we tackle their key limitations by fully abstracting text into meaning and by introducing language-independent concepts and semantic relations, thereby obtaining an interlingual representation. Our proposal aims to overcome the language barrier and to connect not only texts across languages, but also images, videos, speech and sound, and logical formulas, across many fields of AI.
BabelNet Meaning Representation: A Fully Semantic Formalism to Overcome Language Barriers
Roberto Navigli, Rexhina Blloshmi, Abelardo Carlos Martinez Lorenzo
2022, volume 36, pages 12274-12279 (conference proceedings paper)
Research group: Natural Language Processing