How to make machines capable of deep thinking has long been an open problem in the development of AI, and machine reading comprehension faces this challenge directly.
To address it, Cloudwalk and Professor Zhao Hai's team at Shanghai Jiao Tong University jointly published a paper entitled Semantics-aware BERT for Language Understanding.
To tackle the problem that machine reading comprehension models often fail to understand text semantics correctly, the paper proposes, from the perspective of computational linguistics, using explicit semantic role information to improve the modeling ability of deep language models.
Address of the paper: https://arxiv.org/abs/1909.02209
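To make the core idea concrete, here is a minimal sketch of how explicit semantic-role-label (SRL) information can be fused with contextual token embeddings. The class name, dimensions, and the fusion strategy (embed the SRL tags, concatenate, and project) are illustrative assumptions for exposition, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class SemanticFusion(nn.Module):
    """Illustrative fusion of SRL tag embeddings with contextual embeddings."""

    def __init__(self, hidden_dim=768, num_srl_labels=30, srl_dim=10):
        super().__init__()
        # one embedding vector per SRL tag (e.g. ARG0, ARG1, V, O, ...)
        self.srl_embed = nn.Embedding(num_srl_labels, srl_dim)
        # project the concatenated representation back to the model width
        self.fuse = nn.Linear(hidden_dim + srl_dim, hidden_dim)

    def forward(self, token_states, srl_label_ids):
        # token_states:  (batch, seq_len, hidden_dim) contextual embeddings
        # srl_label_ids: (batch, seq_len) tag ids from an external SRL tagger
        srl_states = self.srl_embed(srl_label_ids)
        fused = torch.cat([token_states, srl_states], dim=-1)
        return torch.relu(self.fuse(fused))

# toy usage: 2 sentences, 8 tokens each
token_states = torch.randn(2, 8, 768)          # stand-in for BERT outputs
srl_label_ids = torch.randint(0, 30, (2, 8))   # stand-in for SRL tagger output
out = SemanticFusion()(token_states, srl_label_ids)
print(out.shape)  # torch.Size([2, 8, 768])
```

The point of the sketch is simply that semantic role tags produced by an external labeler become an additional, explicit input channel for the language model rather than something the model must infer on its own; the paper should be consulted for the actual architecture and training details.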