**Live Challenge Day (May 12)**
1. We will conduct two live sessions on May 12 as part of the Live Challenge Day (calendar invites will be sent shortly):
* Session 1: 7:00–9:00 UTC
* Session 2: 15:00–17:00 UTC
2. Just before your session start time, your team leader will receive an email containing the DataMorgana-generated Question file (500 questions) in JSONL format, where each line contains a Question JSON object – see Question [JSON schema](Question.json.schema) and [example](Question_Example.json)
3. You must generate and submit your Answer file in JSONL format, where each line contains an Answer JSON object – see Answer [JSON schema](Answer.json.schema) and [example](Answer_Example.json) – within 2 hours from your session start time\
3.1 [**JSONL format:**](https://jsonlines.org/) Each line must be a single, complete JSON object, with no line breaks within an object and no multi-line formatting or indentation\
3.2 Here is a simple [script](Create_and_Verify_Answer_File_for_LiveRAG.ipynb) for generating and verifying a valid Answer file\
3.3 It is **highly recommended** to parallelize and use retries when sending requests to Falcon – see a simple [script](Falcon_Ai71_Usage.ipynb) of parallel batch requests with retries using AI71 Falcon\
3.4 [Submission instructions](Submission_Instructions.md)\
3.5 All submissions made by the same team members will be considered together\
3.6 Submissions made outside your designated session time window will be disqualified
4. Please refer to the LiveRAG Challenge [Evaluation Guidelines](Evaluation_Guidelines_for_LiveRAG.md) for important information about the evaluation process
5. You must share your RAG system Git repository with us by end-of-day AoE on May 13 via email to sigir2025-liverag-gen@tii.ae to enable result reproduction
6. Additional information:\
6.1 The automatic evaluation results leaderboard will be published on the HuggingFace Challenge page once the evaluation process is completed\
6.2 The top-performing teams will undergo manual evaluation to determine the final winners, who will be announced on July 17 during the SIGIR 2025 LiveRAG Workshop
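To illustrate the JSONL requirement, here is a minimal sketch of generating and verifying an Answer file. It is not the official script linked above, and the field names (`id`, `answer`) are assumptions for illustration only; the linked Answer JSON schema is authoritative.

```python
import json


def write_answer_file(answers, path="answers.jsonl"):
    """Write each answer dict as a single-line JSON object (JSONL)."""
    with open(path, "w", encoding="utf-8") as f:
        for ans in answers:
            # json.dumps emits one line with no internal line breaks
            f.write(json.dumps(ans, ensure_ascii=False) + "\n")


def verify_answer_file(path="answers.jsonl"):
    """Check that every line parses as one complete JSON object."""
    with open(path, encoding="utf-8") as f:
        for i, line in enumerate(f, 1):
            try:
                obj = json.loads(line)
            except json.JSONDecodeError as e:
                raise ValueError(f"line {i} is not valid JSON: {e}")
            if not isinstance(obj, dict):
                raise ValueError(f"line {i} is not a JSON object")
    return True
```

Validating the schema fields themselves (against the linked Answer JSON schema) would be an additional step on top of this line-level check.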
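The parallelize-with-retries pattern recommended for Falcon requests can be sketched as below. The `answer_question` body is a placeholder, not the actual AI71 Falcon client call; consult the linked notebook for the real API.

```python
import concurrent.futures
import time


def with_retries(fn, attempts=3, backoff=2.0):
    """Wrap fn with exponential-backoff retries; re-raise after the last attempt."""
    def wrapper(*args, **kwargs):
        for attempt in range(attempts):
            try:
                return fn(*args, **kwargs)
            except Exception:
                if attempt == attempts - 1:
                    raise
                time.sleep(backoff * (2 ** attempt))
    return wrapper


def answer_question(question):
    # Placeholder: replace with the actual AI71 Falcon API call
    return {"id": question["id"], "answer": "..."}


def answer_all(questions, max_workers=8):
    """Send questions in parallel; results come back in input order."""
    task = with_retries(answer_question)
    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(task, questions))
```

With a 2-hour window for 500 questions, a thread pool plus retries keeps throughput high while absorbing transient API failures.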
**Dry Test (May 5)**
1. To ensure a smooth Live Challenge Day, we will conduct a Dry Test on May 5 (calendar invites will be sent shortly)
2. The DataMorgana-generated Question file for the Dry Test will contain 50 questions
3. Your answers will not be evaluated, and no leaderboard will be published for the Dry Test
4. Note that instructions 1–4 from the Live Challenge Day still apply