model: BERT + BiLSTM
loss function: cross-entropy loss
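A minimal sketch of a BERT + BiLSTM token-classification model trained with cross-entropy, assuming PyTorch and the Hugging Face `transformers` API. The class name `BertBiLstmNer`, the tag count, and the LSTM hidden size are illustrative assumptions, not the repository's actual code.

```python
import torch.nn as nn
from transformers import BertModel


class BertBiLstmNer(nn.Module):
    """Illustrative BERT + BiLSTM tagger; names and sizes are assumptions."""

    def __init__(self, bert_path="bert-base-chinese", num_tags=7, lstm_hidden=256):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_path)
        self.bilstm = nn.LSTM(
            input_size=self.bert.config.hidden_size,
            hidden_size=lstm_hidden,
            batch_first=True,
            bidirectional=True,
        )
        self.classifier = nn.Linear(2 * lstm_hidden, num_tags)
        # Cross-entropy over per-token tag logits; padded positions use label -100.
        self.loss_fn = nn.CrossEntropyLoss(ignore_index=-100)

    def forward(self, input_ids, attention_mask, labels=None):
        hidden = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        lstm_out, _ = self.bilstm(hidden)
        logits = self.classifier(lstm_out)  # (batch, seq_len, num_tags)
        if labels is not None:
            loss = self.loss_fn(logits.view(-1, logits.size(-1)), labels.view(-1))
            return loss, logits
        return logits
```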
- Chinese Weibo
- Chinese OntoNotes 4.0
- Chinese Resume
- Chinese MSRA
- git clone https://huggingface.co/bert-base-chinese
- pip install transformers
- conda install -c huggingface transformers
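To confirm that `transformers` is installed and the cloned `bert-base-chinese` weights load correctly, a quick check like the one below can be run (the local path assumes the clone landed in the current directory):

```python
from transformers import BertTokenizer, BertModel

# Load from the local clone; the hub name "bert-base-chinese" also works if you skipped git clone.
tokenizer = BertTokenizer.from_pretrained("./bert-base-chinese")
model = BertModel.from_pretrained("./bert-base-chinese")

tokens = tokenizer("今天天气很好", return_tensors="pt")
outputs = model(**tokens)
print(outputs.last_hidden_state.shape)  # torch.Size([1, seq_len, 768])
```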
- In config/config.py, set bert_data_path, bert_tag_path, bert_model_path, and bert_vocab_path to match your local paths (see the sketch below)
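The four path variables might look roughly like the following; every value here is a placeholder assumption, so substitute your own directory layout.

```python
# config/config.py (illustrative values only; replace with your own paths)
bert_data_path = "./data/"                          # dataset directory (Weibo / Resume / MSRA / OntoNotes)
bert_tag_path = "./data/tags.txt"                   # NER tag/label vocabulary
bert_model_path = "./bert-base-chinese/"            # directory created by the git clone above
bert_vocab_path = "./bert-base-chinese/vocab.txt"   # BERT vocabulary file
```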
- candidate learning rates for fine-tuning: [2e-5, 3e-5, 4e-5, 5e-5] (sweep sketch below)
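A sweep over these learning rates could be set up as sketched below, assuming the `BertBiLstmNer` sketch defined earlier; the training/evaluation loop itself is omitted, so this only shows how each candidate is wired into an optimizer.

```python
from torch.optim import AdamW

# Candidate fine-tuning learning rates; keep the checkpoint with the best dev F1.
learning_rates = [2e-5, 3e-5, 4e-5, 5e-5]

for lr in learning_rates:
    model = BertBiLstmNer()                       # sketch model defined earlier
    optimizer = AdamW(model.parameters(), lr=lr)
    # ... run the usual train / evaluate loop here and record the dev F1 for this lr ...
    print(f"configured optimizer with lr={lr}")
```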
| Dataset | P | R | F1 |
|---|---|---|---|
| Weibo | 0.6866 | 0.6932 | 0.6899 |
| Resume | 0.9214 | 0.9497 | 0.9353 |
| MSRA | 0.9616 | 0.9104 | 0.9353 |
| OntoNotes 4.0 | 0.8344 | 0.7404 | 0.7846 |