Liu HC, Dong LF, Zhang XM. Multimodal dependence attention and large-scale data based offline handwritten formula recognition. Journal of Computer Science and Technology, 39(3): 654−670, May 2024. DOI: 10.1007/s11390-022-1987-y.

Multimodal Dependence Attention and Large-Scale Data Based Offline Handwritten Formula Recognition

Offline handwritten formula recognition is a challenging task due to the variety of handwritten symbols and two-dimensional formula structures. Recently, deep neural network recognizers based on the encoder-decoder framework have achieved great improvements on this task. However, one shortcoming of the existing work is its unsatisfactory recognition performance on formulas with long LaTeX strings. Moreover, the lack of sufficient training data also limits the capability of these recognizers. In this paper, we design a multimodal dependence attention (MDA) module that helps the model learn visual and semantic dependencies among symbols in the same formula, improving the recognition of formulas with long LaTeX strings. To alleviate overfitting and further improve recognition performance, we also propose a new dataset, the Handwritten Formula Image Dataset (HFID), which contains 25620 handwritten formula images collected from real life. We conduct extensive experiments to demonstrate the effectiveness of the proposed MDA module and HFID dataset, and achieve state-of-the-art expression accuracies of 63.79% on CROHME 2014 and 65.24% on CROHME 2016.
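
The MDA module is only summarized in the abstract, so the following is a minimal sketch of how a dependence-attention step over previously decoded symbols might look in PyTorch. The class name, tensor shapes, the use of separate visual and semantic attention branches, and the linear fusion are all assumptions made for illustration; they are not the authors' implementation.

import torch
import torch.nn as nn

class MultimodalDependenceAttention(nn.Module):
    """Hypothetical dependence-attention step: the current decoder state
    attends over the visual context vectors and the symbol embeddings of
    previously decoded symbols, then fuses the two modalities."""

    def __init__(self, hidden_dim: int = 256, num_heads: int = 4):
        super().__init__()
        self.visual_attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.semantic_attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.fuse = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, query, past_visual, past_semantic):
        # query:         (batch, 1, hidden)  -- current decoder state
        # past_visual:   (batch, t, hidden)  -- visual contexts of decoded symbols
        # past_semantic: (batch, t, hidden)  -- embeddings of decoded symbols
        visual_dep, _ = self.visual_attn(query, past_visual, past_visual)
        semantic_dep, _ = self.semantic_attn(query, past_semantic, past_semantic)
        return torch.tanh(self.fuse(torch.cat([visual_dep, semantic_dep], dim=-1)))

# Usage with dummy tensors (batch of 2, five previously decoded symbols):
mda = MultimodalDependenceAttention(hidden_dim=256)
q = torch.randn(2, 1, 256)
dep_context = mda(q, torch.randn(2, 5, 256), torch.randn(2, 5, 256))
print(dep_context.shape)  # torch.Size([2, 1, 256])

In an encoder-decoder recognizer, such a dependence context would be combined with the usual attention over encoder features before predicting the next LaTeX token; the exact combination used in the paper is not specified on this page.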
