This time we analyze a Chinese NER project. Having already worked through the sequence_tagging_master project, I feel this pass should give me a deeper understanding of NER. Well, getting started is always the hard part; the project's address is here.
- First, a look at the project structure.

Honestly it's hard to make sense of at first glance, so let's start with the README.md.
Recurrent neural networks for Chinese named entity recognition in TensorFlow
This repository contains a simple demo for Chinese named entity recognition.
Contributors
- Jingyuan Zhang
- Mingjie Chen
- some data processing codes from glample/tagger
Requirements
Model
The model is a bidirectional LSTM neural network with a CRF layer. Sequences of Chinese characters are projected into sequences of dense vectors, which are concatenated with extra features as the inputs to the recurrent layer; here we employ one-hot vectors representing word-boundary features for illustration. The recurrent layer is a bidirectional LSTM layer; the outputs of the forward and backward passes are concatenated and projected to a score for each tag. A CRF layer is used to overcome the label-bias problem.
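As a rough sketch of that input construction (all sizes here are hypothetical, and the real model does the embedding lookup and concatenation inside the TensorFlow graph), the character embedding plus one-hot boundary feature step looks roughly like this in numpy:

```python
import numpy as np

np.random.seed(0)
vocab_size, embed_dim, num_seg = 5000, 100, 4  # hypothetical sizes; 4 boundary tags (e.g. B/I/E/S)
char_embeddings = np.random.randn(vocab_size, embed_dim)

def build_inputs(char_ids, seg_ids):
    """Look up a dense vector per character and concatenate a one-hot
    word-boundary feature, yielding the recurrent layer's input."""
    chars = char_embeddings[char_ids]        # [seq_len, embed_dim]
    seg_onehot = np.eye(num_seg)[seg_ids]    # [seq_len, num_seg]
    return np.concatenate([chars, seg_onehot], axis=-1)

x = build_inputs([10, 42, 7], [0, 1, 2])     # a 3-character "sentence"
print(x.shape)  # (3, 104): 100-dim embedding + 4-dim boundary feature
```

The resulting `[seq_len, embed_dim + num_seg]` matrix is what gets fed to the bidirectional LSTM.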
Our model is similar to the state-of-the-art Chinese named entity recognition model proposed in Character-Based LSTM-CRF with Radical-Level Features for Chinese Named Entity Recognition.
Basic Usage
Default parameters:
- batch size: 20
- gradient clip: 5
- embedding size: 100
- optimizer: Adam
- dropout rate: 0.5
- learning rate: 0.001
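Collected into a plain dict, the defaults above look like the following (the key names here are illustrative; the repo presumably defines them as command-line flags in main.py, given the `--train=True` invocation below):

```python
# Hypothetical config mirroring the listed defaults.
config = {
    "batch_size": 20,
    "clip": 5,            # gradient clipping threshold
    "char_dim": 100,      # embedding size
    "optimizer": "adam",
    "dropout": 0.5,       # dropout rate
    "lr": 0.001,          # learning rate
}
```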
Word vectors are trained with the gensim implementation of word2vec on the Chinese Wiki corpus, provided by Chuanhai Dong.
Train the model with default parameters:
$ python3 main.py --train=True --clean=True
Online evaluate:
$ python3 main.py
Suggested readings:
- Natural Language Processing (Almost) from Scratch. Proposes a unified neural network architecture for sequence labeling tasks.
- Neural Architectures for Named Entity Recognition; End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF. Combine character-based word representations and word representations to enhance sequence labeling systems.
- Transfer Learning for Sequence Tagging with Hierarchical Recurrent Networks; Multi-task Multi-domain Representation Learning for Sequence Tagging. Transfer learning for sequence tagging.
- Named Entity Recognition for Chinese Social Media with Jointly Trained Embeddings. Proposes a joint training objective for the embeddings that makes use of both (NER) labeled and unlabeled raw text.
- Improving Named Entity Recognition for Chinese Social Media with Word Segmentation Representation Learning; An Empirical Study of Automatic Chinese Word Segmentation for Spoken Language Understanding and Named Entity Recognition. Use word segmentation outputs as additional features for sequence labeling systems.
- Semi-supervised Sequence Tagging with Bidirectional Language Models. State-of-the-art model on the CoNLL03 NER task; adds pre-trained context embeddings from bidirectional language models for sequence labeling.
- Character-Based LSTM-CRF with Radical-Level Features for Chinese Named Entity Recognition. State-of-the-art model on the SIGHAN2006 NER task.
- Named Entity Recognition with Bidirectional LSTM-CNNs. Method to apply lexicon features.
In short: the model trains a bidirectional LSTM, extracts features with word vectors trained on the Chinese Wiki corpus, and finally uses CRF dynamic programming for prediction to overcome the label-bias problem. The README ends with some reference material... well, I don't really grasp the theory yet and I'm not qualified to, so let's just get this model running first!
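The "CRF dynamic programming for prediction" part is Viterbi decoding: given the per-position tag scores emitted by the network and a learned tag-transition matrix, it finds the globally best tag sequence. A minimal numpy sketch (mirroring what TensorFlow's CRF utilities do; the toy scores below are made up):

```python
import numpy as np

def viterbi_decode(score, transitions):
    """Find the highest-scoring tag sequence by dynamic programming.
    score: [seq_len, num_tags] per-position tag scores from the network;
    transitions: [num_tags, num_tags] tag-transition scores."""
    trellis = np.zeros_like(score)
    backpointers = np.zeros_like(score, dtype=np.int32)
    trellis[0] = score[0]
    for t in range(1, score.shape[0]):
        # v[i, j] = best score ending in tag i at t-1, then transitioning to tag j
        v = trellis[t - 1][:, None] + transitions
        trellis[t] = score[t] + np.max(v, axis=0)
        backpointers[t] = np.argmax(v, axis=0)
    # Walk the backpointers from the best final tag
    best_path = [int(np.argmax(trellis[-1]))]
    for bp in reversed(backpointers[1:]):
        best_path.append(int(bp[best_path[-1]]))
    best_path.reverse()
    return best_path, float(np.max(trellis[-1]))

# Toy example: 3 positions, 2 tags, no transition preference
emissions = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 0.0]])
transitions = np.zeros((2, 2))
path, best_score = viterbi_decode(emissions, transitions)
print(path, best_score)  # [0, 1, 0] 3.0
```

With nonzero transition scores the CRF can, for example, penalize an I-tag following an O-tag, which per-position argmax decoding cannot do.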
Under Basic Usage, some default parameters are set: the batch size (isn't 20 a bit small?); gradient clip, a threshold on the gradients used to guard against exploding gradients, set to 5 here; embedding size, the dimensionality of the word vectors; the optimizer, Adam, which needs no introduction; dropout, which counters overfitting and makes the model more robust; and finally the learning rate, 0.001.
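The gradient clip of 5 is typically applied as global-norm clipping before the Adam update: if the combined L2 norm of all gradients exceeds the threshold, every gradient is scaled down proportionally. A small numpy sketch of the rule (the gradient values below are made up for illustration):

```python
import numpy as np

def clip_by_global_norm(grads, clip_norm=5.0):
    """Scale all gradients so their combined L2 norm is at most clip_norm,
    the same rule as TensorFlow's tf.clip_by_global_norm."""
    global_norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    if global_norm > clip_norm:
        scale = clip_norm / global_norm
        grads = [g * scale for g in grads]
    return grads, global_norm

grads = [np.array([3.0, 4.0]), np.array([12.0])]   # global norm = sqrt(9+16+144) = 13
clipped, norm = clip_by_global_norm(grads, clip_norm=5.0)
print(norm)  # 13.0; after clipping, the combined norm is exactly 5.0
```

Unlike per-tensor clipping, this preserves the direction of the overall gradient while bounding its magnitude.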
- I started by running main.py; I made a few small tweaks so it runs directly. Let's see the result.

Still running... OK, having skimmed the usage, let the journey begin!