# Dataset Card for Enwiki Dataset

This is an automatically updating dataset of ~7 million English Wikipedia articles, with templates expanded and the wikitext converted to Markdown. It is intended to provide a bite-sized, LLM-readable version of Wikipedia for applications such as retrieval-augmented generation (RAG).

## Dataset Overview

There are two main versions of the dataset, both loadable as sketched below:

- **`merged-articles`** - Complete Wikipedia dump with all articles merged into a single file.
- **`merged-article-chunked`** - Articles chunked into ~700-word segments with hierarchical Markdown-header breadcrumbs (Gemini embeddings coming soon!).
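
A minimal loading sketch using the Hugging Face `datasets` library, assuming the two versions are exposed as named configurations with a `train` split; the repo ID below is a placeholder, not this dataset's actual ID:

```python
from datasets import load_dataset

# Placeholder repo ID; substitute this dataset's actual Hugging Face ID.
REPO_ID = "your-org/enwiki"

# Full merged articles. streaming=True avoids materializing all ~7M
# articles on disk before iteration begins.
articles = load_dataset(REPO_ID, "merged-articles", split="train", streaming=True)

# Pre-chunked ~700-word segments with header breadcrumbs, suited to RAG.
chunks = load_dataset(REPO_ID, "merged-article-chunked", split="train", streaming=True)

for chunk in chunks:
    print(chunk)  # field names depend on the dataset schema
    break
```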

The dataset was last updated on `2025-08-07`.

Blogpost coming soon!