---
library_name: transformers
license: mit
language:
- en
metrics:
- accuracy
- perplexity
base_model:
- bert-base-cased
pipeline_tag: fill-mask
---
# BERT base for filling user actions in requirement specifications

This model fills masked tokens ([MASK]) in requirements specifications. During fine-tuning, verbs identified by part-of-speech (POS) tagging were used as a proxy for user actions.
- **Developed by:** Fabian C. Peña, Steffen Herbold
- **Finetuned from:** [bert-base-cased](https://huggingface.co/bert-base-cased)
- **Replication kit:** [https://github.com/aieng-lab/senlp-benchmark](https://github.com/aieng-lab/senlp-benchmark)
- **Language:** English
- **License:** MIT
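## Usage

A minimal sketch of querying the model through the `fill-mask` pipeline. The model ID and the example requirement sentence are assumptions for illustration; replace `MODEL_ID` with this repository's actual identifier.

```python
# Hypothetical model ID -- substitute the actual repository ID of this card.
MODEL_ID = "aieng-lab/bert-base-cased-requirements"

# Example requirement with the user action masked out (illustrative only).
sentence = "The user shall be able to [MASK] a report from the dashboard."


def top_actions(text: str, model_id: str = MODEL_ID, k: int = 5) -> list[str]:
    """Return the k most likely tokens for the masked user action."""
    # Imported lazily so the sketch can be read/parsed without the
    # model weights or the transformers package being present.
    from transformers import pipeline

    fill = pipeline("fill-mask", model=model_id, top_k=k)
    return [pred["token_str"] for pred in fill(text)]
```

Each prediction returned by the pipeline also carries a `score` field if the probabilities are needed rather than just the token strings.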
## Citation

```
@misc{pena2025benchmark,
  author = {Fabian Peña and Steffen Herbold},
  title = {Evaluating Large Language Models on Non-Code Software Engineering Tasks},
  year = {2025}
}
```