CALCULUS

Commonsense and Anticipation enriched Learning of Continuous representations sUpporting Language UnderStanding

CALCULUS focuses on learning effective anticipatory representations of events and their narrative structures, trained on language and visual data. The machine learning methods on which CALCULUS builds belong to the family of latent variable models, relying on Bayesian probabilistic models and neural networks as starting points. CALCULUS targets settings with limited manually annotated training data and especially aims at developing novel machine learning paradigms for natural language understanding. CALCULUS also evaluates the inference potential of the anticipatory representations in situations not seen in the training data, including the inference of spatial, temporal and causal information in metric real-world spaces. The best models for language understanding will be integrated in a demonstrator that translates language into events happening in a 3-D virtual world. See the CALCULUS website for the latest updates.
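
To make the idea of an anticipatory latent representation concrete, the following is a minimal sketch, not the project's actual architecture: an encoder maps an event to a latent code, and a prediction head is trained to anticipate the latent code of the next event. All names (EventEncoder, AnticipationHead, train_step) are hypothetical, the sketch assumes PyTorch, and random vectors stand in for real language or visual event features.

    # Hypothetical sketch of an anticipatory latent representation learner.
    # Not the CALCULUS architecture; it only illustrates the general idea of
    # encoding the current event and learning to anticipate the next one.

    import torch
    import torch.nn as nn

    class EventEncoder(nn.Module):
        """Maps an event feature vector to a latent representation."""
        def __init__(self, input_dim: int, latent_dim: int):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(input_dim, 128), nn.ReLU(),
                nn.Linear(128, latent_dim),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.net(x)

    class AnticipationHead(nn.Module):
        """Predicts the latent representation of the next event."""
        def __init__(self, latent_dim: int):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(latent_dim, 128), nn.ReLU(),
                nn.Linear(128, latent_dim),
            )

        def forward(self, z: torch.Tensor) -> torch.Tensor:
            return self.net(z)

    def train_step(encoder, head, optimizer, current_events, next_events):
        """One training step: anticipate the encoding of the next event."""
        z_now = encoder(current_events)
        with torch.no_grad():  # the target representation is not back-propagated through
            z_next = encoder(next_events)
        loss = nn.functional.mse_loss(head(z_now), z_next)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

    if __name__ == "__main__":
        torch.manual_seed(0)
        encoder, head = EventEncoder(300, 64), AnticipationHead(64)
        optimizer = torch.optim.Adam(
            list(encoder.parameters()) + list(head.parameters()), lr=1e-3)
        # Random stand-ins for (current event, next event) feature pairs.
        current_events, next_events = torch.randn(32, 300), torch.randn(32, 300)
        for step in range(5):
            loss = train_step(encoder, head, optimizer, current_events, next_events)
            print(f"step {step}: loss = {loss:.4f}")

In a real setting the random vectors would be replaced by language and visual event features, and the plain regression objective by a probabilistic (e.g. variational) one, in line with the Bayesian latent variable models mentioned above.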

Period from 2018-09-01 to 2024-09-30
Financed by European Research Council Advanced Grant H2020-ERC-2017-ADG 788506
Supervised by Marie-Francine Moens
Staff Nathan Cornille
Ruben Cartuyvels
Aristotelis Chrysakis
Thierry Deruyttere
Tuur Leeuwenberg
Victor Milewski
Damien Sileo
Graham Spinks
Guillem Collell Talleda
Contact Marie-Francine Moens

More information can be found on the project website http://calculus-project.eu/

Publications

  1. Cornille, Nathan & Moens, Marie-Francine. Improving Language Understanding in Machines through Anticipation. In 3rd Human Brain Project Curriculum Workshop on Cognitive Systems. 2019
  2. Spinks, Graham & Moens, Marie-Francine. Justifying Diagnosis Decisions by Deep Neural Networks. Journal of Biomedical Informatics. 2019
  3. Deruyttere, Thierry & Moens, Marie-Francine. Giving Commands to a Self-driving Car: A Multimodal Reasoner for Visual Grounding. In Proceedings of the AAAI 2020 Reasoning for Complex Question Answering Workshop. 2020
  4. Cornille, Nathan & Moens, Marie-Francine. Improving Representation Learning with Pervasive Internal Regression (PIR). In Proceedings of the CSHL Meeting: From Neuroscience to Artificially Intelligent Systems (NAISys). 2020
  5. Spinks, Graham, Cartuyvels, Ruben & Moens, Marie-Francine. Learning Grammar in Confined Worlds. In Proceedings of the International Workshop on Spoken Dialog System Technology (IWSDS 2020). 2020
  6. Chrysakis, Aristotelis & Moens, Marie-Francine. Online Continual Learning from Imbalanced Data. In Proceedings of the 37th International Conference on Machine Learning (ICML 2020). 2020
  7. Milewski, Victor, Moens, Marie-Francine & Calixto, Iacer. Are Scene Graphs Good Enough to Improve Image Captioning? In Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing. ACL. 2020
  8. Leeuwenberg, Tuur & Moens, Marie-Francine. Towards Extracting Absolute Event Timelines from English Clinical Reports. IEEE/ACM Transactions on Audio, Speech and Language Processing. 2020
  9. Cartuyvels, Ruben, Spinks, Graham & Moens, Marie-Francine. Autoregressive Reasoning over Chains of Facts with Transformers. In Proceedings of the 28th International Conference on Computational Linguistics (COLING 2020) (pp. 6916-6930). ACL. 2020
  10. Spinks, Graham & Moens, Marie-Francine. Structured (De)composable Representations Trained with Neural Networks. In Proceedings of the 9th IAPR TC3 Workshop on Artificial Neural Networks in Pattern Recognition (ANNPR 2020). Springer. 2020
  11. Örnek, Evin Pınar & Moens, Marie-Francine. Verbs On Action: Zero Shot Activity Recognition with Videos. In Proceedings of the CVPR Workshop on Visual Learning with Limited Labels. 2020
  12. Spinks, Graham & Moens, Marie-Francine. Structured (De)composable Representations Trained with Neural Networks. Computers - Special Issue Artificial Neural Networks in Pattern Recognition (invited publication). 2020
  13. Kordjamshidi, Parisa, Pustejovsky, James & Moens, Marie-Francine. Representation Learning and Reasoning on Spatial Language for Downstream NLP Tasks. In Proceedings of EMNLP (tutorial abstracts) (pp. 28-33). ACL. 2020
  14. Pavllo, Dario, Spinks, Graham, Hofmann, Thomas, Moens, Marie-Francine & Lucchi, Aurelien. Convolutional Generation of Textured 3D Meshes. In Proceedings of the Thirty-fourth Conference on Neural Information Processing Systems (NeurIPS) (oral presentation). 2020
  15. Collell, Guillem & Moens, Marie-Francine. How Do Simple Transformations of Text and Image Features Impact Cosine-based Semantic Match? In Proceedings of the 43rd European Conference on Information Retrieval. Springer. 2020
  16. Deruyttere, Thierry, Milewski, Victor & Moens, Marie-Francine. Giving Commands to a Self-driving Car: How to Deal with Uncertain Situations? Engineering Applications of Artificial Intelligence, 103, Art. No. 104257, 1-20. 2021
  17. Li, Ruiqi, Zhao, Xiang & Moens, Marie-Francine. A Brief Overview of Universal Sentence Representation Methods: A Linguistic View. ACM Computing Surveys. 2021
  18. Cartuyvels, Ruben, Spinks, Graham & Moens, Marie-Francine. Discrete and Continuous Representations and Processing in Deep Learning: Looking Forward. AI Open. Elsevier. 2021
  19. Araujo, Vladimir, Villa, Andrés, Moens, Marie-Francine, Soto, Alvaro & Mendoza, Marcelo. Augmenting BERT-style Models with Predictive Coding to Improve Discourse-Level Representations. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP). 2021

