Publication · Conference object · Preprint · 2019

A Morpho-Syntactically Informed LSTM-CRF Model for Named Entity Recognition

Lilia Simeonova, Kiril Simov, Petya Osenova, Preslav Nakov
Open Access · English
Published: 27 Aug 2019
Abstract
We propose a morphologically informed model for named entity recognition, which is based on the LSTM-CRF architecture and combines word embeddings, Bi-LSTM character embeddings, part-of-speech (POS) tags, and morphological information. While previous work has focused on learning from raw word input, using word and character embeddings only, we show that for morphologically rich languages such as Bulgarian, access to POS information contributes more to the performance gains than detailed morphological information does. Thus, we show that named entity recognition needs only coarse-grained POS tags, but at the same time it can benefit from simultaneously using POS information of different granularity. Our evaluation on a standard dataset shows sizable improvements over the state of the art for Bulgarian NER.
Comment: named entity recognition; Bulgarian NER; morphology; morpho-syntax
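To make the described architecture concrete, here is a minimal PyTorch sketch (not the authors' released code) of a tagger in this model family: a character-level Bi-LSTM embedding, a word embedding, and a POS embedding are concatenated per token and fed to a word-level Bi-LSTM whose outputs score NER tags. All dimensions, names, and the exact input split are illustrative assumptions; a CRF layer (e.g., from the pytorch-crf package) would normally be stacked on the emission scores for structured decoding, and is omitted here to keep the sketch self-contained.

```python
import torch
import torch.nn as nn

class MorphoBiLSTMTagger(nn.Module):
    """Sketch of a Bi-LSTM tagger over concatenated word + char + POS embeddings."""

    def __init__(self, n_words, n_chars, n_pos, n_tags,
                 word_dim=100, char_dim=25, pos_dim=20, hidden=128):
        super().__init__()
        self.word_emb = nn.Embedding(n_words, word_dim)
        self.char_emb = nn.Embedding(n_chars, char_dim)
        self.pos_emb = nn.Embedding(n_pos, pos_dim)
        # Character-level Bi-LSTM: builds one vector per word from its characters.
        self.char_lstm = nn.LSTM(char_dim, char_dim,
                                 bidirectional=True, batch_first=True)
        # Word-level Bi-LSTM over the concatenated word/char/POS representation.
        self.word_lstm = nn.LSTM(word_dim + 2 * char_dim + pos_dim, hidden,
                                 bidirectional=True, batch_first=True)
        self.emissions = nn.Linear(2 * hidden, n_tags)

    def forward(self, words, chars, pos):
        # words, pos: (batch, seq); chars: (batch, seq, max_word_len)
        b, s, c = chars.shape
        _, (h_n, _) = self.char_lstm(self.char_emb(chars.view(b * s, c)))
        # h_n: (2, b*s, char_dim); concatenate forward/backward final states.
        char_repr = torch.cat([h_n[0], h_n[1]], dim=-1).view(b, s, -1)
        x = torch.cat([self.word_emb(words), char_repr, self.pos_emb(pos)], dim=-1)
        out, _ = self.word_lstm(x)
        return self.emissions(out)  # per-token tag scores for a CRF to decode


# Toy usage: 2 sentences of 5 tokens, words padded to 8 characters.
model = MorphoBiLSTMTagger(n_words=1000, n_chars=60, n_pos=17, n_tags=9)
words = torch.randint(0, 1000, (2, 5))
chars = torch.randint(0, 60, (2, 5, 8))
pos = torch.randint(0, 17, (2, 5))
print(model(words, chars, pos).shape)  # torch.Size([2, 5, 9])
```

Concatenating a coarse POS embedding with the lexical representations is the simplest way to expose morpho-syntactic information to the tagger, which matches the paper's finding that the POS channel, even at coarse granularity, carries most of the gain.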
Fields of Science and Technology classification (FOS)
02 engineering and technology, 0202 electrical engineering, electronic engineering, information engineering, 020201 artificial intelligence & image processing
Subjects
free text keywords: Computer Science - Computation and Language, 68T50, I.2.7, Artificial intelligence, Bulgarian, Computer science, Named-entity recognition, Natural language processing
Communities
  • DARIAH EU