{ "results": { "polish_psc_regex": { "exact_match,score-first": 0.774582560296846, "exact_match_stderr,score-first": 0.012732678170354205, "f1,score-first": 0.9417098445595855, "f1_stderr,score-first": "N/A", "alias": "polish_psc_regex" } }, "group_subtasks": { "polish_psc_regex": [] }, "configs": { "polish_psc_regex": { "task": "polish_psc_regex", "dataset_path": "allegro/klej-psc", "training_split": "train", "test_split": "test", "doc_to_text": "Fragment 1: \"{{extract_text}}\"\nFragment 2: \"{{summary_text}}\"\nPytanie: jaka jest zależność między fragmentami 1 i 2?\nMożliwe odpowiedzi:\nA - wszystkie odpowiedzi poprawne\nB - dotyczą tego samego artykułu\nC - dotyczą różnych artykułów\nD - brak poprawnej odpowiedzi\nPrawidłowa odpowiedź:", "doc_to_target": "{{{0: 'A', 1: 'C', 2: 'B', 3: 'D'}.get(label|int + 1)}}", "description": "", "target_delimiter": " ", "fewshot_delimiter": "\n\n", "num_fewshot": 0, "metric_list": [ { "metric": "exact_match", "aggregation": "mean", "higher_is_better": true }, { "metric": "def f1(predictions, references):\n _prediction = predictions[0]\n _reference = references[0]\n string_label = [\"B\", \"C\"]\n reference = string_label.index(_reference)\n prediction = (\n string_label.index(_prediction)\n if _prediction in string_label\n else 0\n )\n\n return (prediction, reference)\n", "aggregation": "def agg_f1(items):\n predictions, references = zip(*items)\n references, predictions = np.asarray(references), np.asarray(predictions)\n\n return sklearn.metrics.f1_score(references, predictions)\n", "higher_is_better": true } ], "output_type": "generate_until", "generation_kwargs": { "until": [ ".", "," ], "do_sample": false, "temperature": 0.0, "max_gen_toks": 50 }, "repeats": 1, "filter_list": [ { "name": "score-first", "filter": [ { "function": "regex", "regex_pattern": "(\\b[ABCD]\\b)" }, { "function": "take_first" } ] } ], "should_decontaminate": true, "doc_to_decontamination_query": "{{extract_text}} {{summary_text}}" } }, "versions": { "polish_psc_regex": "Yaml" }, "n-shot": { "polish_psc_regex": 0 }, "higher_is_better": { "polish_psc_regex": { "exact_match": true, "f1": true } }, "n-samples": { "polish_psc_regex": { "original": 1078, "effective": 1078 } }, "config": { "model": "local-completions", "model_args": "model=meta-llama/Meta-Llama-3.1-405B-Instruct-FP8,base_url=http://149.156.182.180:9096/c02/v1/,tokenizer_backend=huggingface", "batch_size": 1, "batch_sizes": [], "device": null, "use_cache": "openai_cache_llama405_klej_mc_bele", "limit": null, "bootstrap_iters": 100000, "gen_kwargs": null, "random_seed": 0, "numpy_seed": 1234, "torch_seed": 1234, "fewshot_seed": 1234 }, "git_hash": "2132286", "date": 1723762285.4156775, "pretty_env_info": "PyTorch version: 2.1.2+cu121\nIs debug build: False\nCUDA used to build PyTorch: 12.1\nROCM used to build PyTorch: N/A\n\nOS: Rocky Linux 9.2 (Blue Onyx) (x86_64)\nGCC version: (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4)\nClang version: Could not collect\nCMake version: Could not collect\nLibc version: glibc-2.34\n\nPython version: 3.10.4 (main, Dec 14 2022, 11:01:42) [GCC 11.3.0] (64-bit runtime)\nPython platform: Linux-5.14.0-362.18.1.el9_3.x86_64-x86_64-with-glibc2.34\nIs CUDA available: False\nCUDA runtime version: No CUDA\nCUDA_MODULE_LOADING set to: N/A\nGPU models and configuration: No CUDA\nNvidia driver version: No CUDA\ncuDNN version: No CUDA\nHIP runtime version: N/A\nMIOpen runtime version: N/A\nIs XNNPACK available: True\n\nCPU:\nArchitecture: x86_64\nCPU op-mode(s): 32-bit, 64-bit\nAddress sizes: 48 
bits physical, 48 bits virtual\nByte Order: Little Endian\nCPU(s): 32\nOn-line CPU(s) list: 0-31\nVendor ID: AuthenticAMD\nModel name: AMD EPYC-Rome Processor\nCPU family: 23\nModel: 49\nThread(s) per core: 1\nCore(s) per socket: 1\nSocket(s): 32\nStepping: 0\nBogoMIPS: 4990.62\nFlags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 xsaves clzero xsaveerptr wbnoinvd arat umip rdpid arch_capabilities\nHypervisor vendor: KVM\nVirtualization type: full\nL1d cache: 1 MiB (32 instances)\nL1i cache: 1 MiB (32 instances)\nL2 cache: 16 MiB (32 instances)\nL3 cache: 512 MiB (32 instances)\nNUMA node(s): 1\nNUMA node0 CPU(s): 0-31\nVulnerability Gather data sampling: Not affected\nVulnerability Itlb multihit: Not affected\nVulnerability L1tf: Not affected\nVulnerability Mds: Not affected\nVulnerability Meltdown: Not affected\nVulnerability Mmio stale data: Not affected\nVulnerability Retbleed: Mitigation; untrained return thunk; SMT disabled\nVulnerability Spec rstack overflow: Vulnerable: Safe RET, no microcode\nVulnerability Spec store bypass: Mitigation; Speculative Store Bypass disabled via prctl\nVulnerability Spectre v1: Mitigation; usercopy/swapgs barriers and __user pointer sanitization\nVulnerability Spectre v2: Mitigation; Retpolines, IBPB conditional, STIBP disabled, RSB filling, PBRSB-eIBRS Not affected\nVulnerability Srbds: Not affected\nVulnerability Tsx async abort: Not affected\n\nVersions of relevant libraries:\n[pip3] numpy==1.26.4\n[pip3] torch==2.1.2\n[pip3] triton==2.1.0\n[conda] Could not collect", "transformers_version": "4.43.1", "upper_git_hash": "2132286315025b3abd7a22b7309f7052be200287", "task_hashes": { "polish_psc_regex": "0065cab6bd75fa16d7b0b782973d6452c76ac6f272bfcb4e037049c1d2a420a5" }, "model_source": "local-completions", "model_name": "meta-llama/Meta-Llama-3.1-405B-Instruct-FP8", "model_name_sanitized": "meta-llama__Meta-Llama-3.1-405B-Instruct-FP8", "system_instruction": null, "system_instruction_sha": null, "chat_template": null, "chat_template_sha": null, "start_time": 16944535.384112675, "end_time": 16947510.900810417, "total_evaluation_time_seconds": "2975.516697742045" }
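The doc_to_target Jinja template above shifts the integer dataset label by one before the letter lookup, so label 0 maps to "C" (different articles) and label 1 to "B" (the same article). A minimal sketch of that mapping, assuming klej-psc's binary 0/1 labels (the jinja2 import is added here only for the demo):

```python
from jinja2 import Template

# doc_to_target from the config above; filters bind tighter than "+",
# so this evaluates {0:'A',1:'C',2:'B',3:'D'}.get((label|int) + 1).
target = Template("{{{0: 'A', 1: 'C', 2: 'B', 3: 'D'}.get(label|int + 1)}}")

for label in (0, 1):  # assumption: klej-psc uses binary 0/1 labels
    print(label, "->", target.render(label=label))
# 0 -> C   (fragments come from different articles)
# 1 -> B   (fragments come from the same article)
```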
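The second metric_list entry stores the custom F1 metric as source strings. Assembled into a runnable form (only the numpy/sklearn imports are added), the per-sample function maps the "B"/"C" letters to binary labels, counting any other prediction as class 0, and the aggregator computes a binary F1 over all samples:

```python
import numpy as np
import sklearn.metrics

def f1(predictions, references):
    # Per-sample metric from metric_list above: "B" -> 0, "C" -> 1;
    # any prediction outside {"B", "C"} falls back to class 0.
    _prediction = predictions[0]
    _reference = references[0]
    string_label = ["B", "C"]
    reference = string_label.index(_reference)
    prediction = (
        string_label.index(_prediction)
        if _prediction in string_label
        else 0
    )

    return (prediction, reference)

def agg_f1(items):
    # Aggregation from metric_list above: binary F1 over all samples.
    predictions, references = zip(*items)
    references, predictions = np.asarray(references), np.asarray(predictions)

    return sklearn.metrics.f1_score(references, predictions)

# Example: three samples, one misprediction.
items = [f1([p], [r]) for p, r in [("B", "B"), ("C", "C"), ("B", "C")]]
print(agg_f1(items))  # 0.666...
```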
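The score-first filter chain reduces each raw generation to a single answer letter: a regex pass collects standalone A/B/C/D tokens and take_first keeps only the first one. A minimal re-implementation sketch of that two-step chain (the score_first helper and the "[invalid]" fallback are assumptions for illustration, not the harness's actual filter classes):

```python
import re

# regex_pattern from filter_list above: matches a standalone A, B, C, or D.
ANSWER_RE = re.compile(r"(\b[ABCD]\b)")

def score_first(generation: str, fallback: str = "[invalid]") -> str:
    # "regex" step: find all standalone answer letters in the model output;
    # "take_first" step: keep only the first match.
    matches = ANSWER_RE.findall(generation)
    return matches[0] if matches else fallback

print(score_first("B - dotyczą tego samego artykułu"))  # -> "B"
print(score_first("Prawidłowa odpowiedź: C"))           # -> "C"
```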