---
license: mit
tags:
- chess
- pgn
- tokenizer
pretty_name: PGN Dataset
size_categories:
- 1M<n<10M
source_datasets:
- https://www.kaggle.com/datasets/milesh1/35-million-chess-games
task_categories:
- token-classification
task_ids:
- parsing
dataset_info:
  features:
  - name: PGN
    dtype: string
  splits:
  - name: train
    num_bytes: 1322248438.672783
    num_examples: 3171142
  - name: test
    num_bytes: 73458431.90975918
    num_examples: 176175
  - name: validation
    num_bytes: 73458431.90975918
    num_examples: 176175
  download_size: 984493640
  dataset_size: 1469165302.4923015
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
  - split: validation
    path: data/validation-*
---

# PGN Dataset

**Last Updated**: 2025-01-26

## Description

This is a dataset of chess games in Portable Game Notation (PGN) format. The dataset was created by cleaning and formatting the [milesh1/35-million-chess-games](https://www.kaggle.com/milesh1/35-million-chess-games) dataset from Kaggle.

This version of the pgn-dataset does not include the `[g_start]` and `[g_end]` special tokens that [PGNTokenizer](https://huggingface.co/InterwebAlchemy/PGNTokenizer) uses to denote the start and end of a game. If you need those tokens, use the [pgn-dataset-including-special-tokens](https://huggingface.co/datasets/InterwebAlchemy/pgn-dataset-including-special-tokens) dataset instead.
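
If you want to explore the data, here is a minimal sketch using the 🤗 `datasets` library (the repository id `InterwebAlchemy/pgn-dataset` is assumed from this card; adjust it if the dataset lives under a different name):

```python
from datasets import load_dataset

# Repository id assumed from this card.
dataset = load_dataset("InterwebAlchemy/pgn-dataset")

# The card defines train / test / validation splits with a single "PGN" string feature.
print(dataset)
print(dataset["train"][0]["PGN"])  # raw PGN text for the first game
```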

## Notes

This was the training dataset for [`PGNTokenizer`](https://huggingface.co/InterwebAlchemy/PGNTokenizer). For more information about using `PGNTokenizer`, visit the [GitHub repository](https://github.com/DVDAGames/pgn-tokenizer).
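
As a rough sketch of pairing this dataset with the tokenizer (assuming the PGNTokenizer repository publishes a standard `tokenizer.json` loadable by the `tokenizers` library; check the GitHub repository above for the officially supported loading path):

```python
from tokenizers import Tokenizer

# Assumption: the Hub repo exposes a tokenizer.json compatible with `tokenizers`.
tokenizer = Tokenizer.from_pretrained("InterwebAlchemy/PGNTokenizer")

# Encode a short PGN move sequence into tokens.
encoding = tokenizer.encode("1.e4 e5 2.Nf3 Nc6 3.Bb5 a6")
print(encoding.tokens)
print(encoding.ids)
```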

You can read more about the [research behind the original dataset](https://chess-research-project.readthedocs.io/en/latest/).