---
configs:
- config_name: default
data_files:
- split: train
path:
- v0/documents/*.jsonl.gz
task_categories:
- text-generation
language:
- en
pretty_name: Python Enhancement Proposals
---

# Python Enhancement Proposals

## Description
Python Enhancement Proposals, or PEPs, are design documents that generally provide a technical specification and rationale for new features of the Python programming language. 661 PEPs have been published to date. The majority of PEPs are placed in the public domain, but 5 were published under the "Open Publication License" and are omitted from this dataset. PEPs are long, highly polished, and technical in nature, and they often pair code examples with their prose. PEPs are authored in reStructuredText; we used pandoc to convert them to plain text.
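As a rough sketch of that conversion step, the snippet below shells out to pandoc to render one PEP's reStructuredText source as plain text. The filename and options shown are illustrative assumptions, not the exact command used to build this dataset.

```python
import subprocess

# Illustrative sketch: convert a PEP's reStructuredText source to plain text
# with pandoc. The input/output filenames are assumptions; the dataset's
# actual conversion pipeline may differ.
subprocess.run(
    ["pandoc", "pep-0008.rst", "--from", "rst", "--to", "plain",
     "--output", "pep-0008.txt"],
    check=True,
)
```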
## Dataset Statistics
| Documents | UTF-8 GB |
|-----------|----------|
| 656       | 0.01     |
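For convenience, one way to load the raw documents with the `datasets` library is sketched below. The repository ID is a placeholder and the `text` column name is an assumption; check the dataset's actual Hub path and schema before relying on either.

```python
from datasets import load_dataset

# Placeholder repository ID -- replace with this dataset's actual path on the Hub.
ds = load_dataset("common-pile/python_enhancement_proposals", split="train")

# The "text" column name is an assumption; inspect ds.features to confirm the schema.
print(ds[0]["text"][:200])
```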
## License Issues
While we aim to produce datasets with completely accurate licensing information, license laundering and inaccurate metadata can cause us to assign an incorrect license to some documents (for further discussion of this limitation, please see our paper). If you believe you have found an instance of incorrect licensing in this dataset, please start a discussion on this repository.
## Other Versions
This is the "raw" version of the Python Enhancement Proposals dataset. If you are looking for the filtered version used to train Comma v0.1, you can find it here.
## Citation
If you use this dataset, please cite:
@article{kandpal2025common,
  title={{The Common Pile v0.1: An 8TB Dataset of Public Domain and Openly Licensed Text}},
  author={Nikhil Kandpal and Brian Lester and Colin Raffel and Sebastian Majstorovic and Stella Biderman and Baber Abbasi and Luca Soldaini and Enrico Shippole and A. Feder Cooper and Aviya Skowron and Shayne Longpre and Lintang Sutawika and Alon Albalak and Zhenlin Xu and Guilherme Penedo and Loubna Ben Allal and Elie Bakouch and John David Pressman and Honglu Fan and Dashiell Stander and Guangyu Song and Aaron Gokaslan and John Kirchenbauer and Tom Goldstein and Brian R. Bartoldson and Bhavya Kailkhura and Tyler Murray},
  journal={arXiv preprint},
  year={2025}
}