| Column | Dtype | Min | Max |
|:--------------|:-----------------------|:--------------------|:--------------------|
| modelId | string (length) | 5 | 139 |
| author | string (length) | 2 | 42 |
| last_modified | timestamp[us, tz=UTC] | 2020-02-15 11:33:14 | 2025-09-12 00:41:42 |
| downloads | int64 | 0 | 223M |
| likes | int64 | 0 | 11.7k |
| library_name | string (555 classes) | | |
| tags | list (length) | 1 | 4.05k |
| pipeline_tag | string (55 classes) | | |
| createdAt | timestamp[us, tz=UTC] | 2022-03-02 23:29:04 | 2025-09-12 00:40:24 |
| card | string (length) | 11 | 1.01M |
gridoneai/Llama-3-8B-Jungso-Instruct-DoRA-3k
gridoneai
2024-06-04T05:08:31Z
5
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "license:cc-by-nc-sa-4.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-06-04T04:33:41Z
---
license: cc-by-nc-sa-4.0
---
tyzhu/find_marker_both_sent_train_400_eval_40_meta-llama_Llama-2-7b-hf_3e-4_lora
tyzhu
2024-06-04T05:07:29Z
0
0
null
[ "generated_from_trainer", "base_model:meta-llama/Llama-2-7b-hf", "base_model:finetune:meta-llama/Llama-2-7b-hf", "license:llama2", "region:us" ]
null
2024-06-03T15:27:32Z
---
license: llama2
base_model: meta-llama/Llama-2-7b-hf
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: find_marker_both_sent_train_400_eval_40_meta-llama_Llama-2-7b-hf_3e-4_lora
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# find_marker_both_sent_train_400_eval_40_meta-llama_Llama-2-7b-hf_3e-4_lora

This model is a fine-tuned version of [meta-llama/Llama-2-7b-hf](https://huggingface.co/meta-llama/Llama-2-7b-hf) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.4635
- Accuracy: 0.7684

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 50.0

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 76 | 1.3324 | 0.6840 |
| 1.3427 | 2.0 | 152 | 1.2941 | 0.6869 |
| 0.9021 | 3.0 | 228 | 1.2098 | 0.6951 |
| 0.4941 | 3.99 | 304 | 1.0273 | 0.7106 |
| 0.4941 | 4.99 | 380 | 0.8586 | 0.7275 |
| 0.2514 | 5.99 | 456 | 0.7044 | 0.7424 |
| 0.1881 | 6.99 | 532 | 0.6187 | 0.7511 |
| 0.1665 | 8.0 | 609 | 0.5968 | 0.7554 |
| 0.1665 | 9.0 | 685 | 0.5775 | 0.7559 |
| 0.1515 | 10.0 | 761 | 0.5874 | 0.7557 |
| 0.1449 | 11.0 | 837 | 0.5756 | 0.7566 |
| 0.1392 | 11.99 | 913 | 0.5477 | 0.7597 |
| 0.1392 | 12.99 | 989 | 0.5625 | 0.7594 |
| 0.1329 | 13.99 | 1065 | 0.5615 | 0.7607 |
| 0.1308 | 14.99 | 1141 | 0.5757 | 0.7590 |
| 0.129 | 16.0 | 1218 | 0.5631 | 0.7603 |
| 0.129 | 17.0 | 1294 | 0.5434 | 0.7617 |
| 0.1283 | 18.0 | 1370 | 0.5661 | 0.7602 |
| 0.1285 | 19.0 | 1446 | 0.5533 | 0.7622 |
| 0.1271 | 19.99 | 1522 | 0.5589 | 0.7611 |
| 0.1271 | 20.99 | 1598 | 0.5553 | 0.7615 |
| 0.1274 | 21.99 | 1674 | 0.5423 | 0.7617 |
| 0.1283 | 22.99 | 1750 | 0.5276 | 0.7627 |
| 0.1312 | 24.0 | 1827 | 0.5273 | 0.7626 |
| 0.1289 | 25.0 | 1903 | 0.5155 | 0.7635 |
| 0.1289 | 26.0 | 1979 | 0.5015 | 0.7656 |
| 0.1261 | 27.0 | 2055 | 0.5148 | 0.7643 |
| 0.1282 | 27.99 | 2131 | 0.4968 | 0.7643 |
| 0.1266 | 28.99 | 2207 | 0.5018 | 0.7652 |
| 0.1266 | 29.99 | 2283 | 0.4969 | 0.7660 |
| 0.1253 | 30.99 | 2359 | 0.4921 | 0.7665 |
| 0.1231 | 32.0 | 2436 | 0.5045 | 0.7652 |
| 0.1244 | 33.0 | 2512 | 0.5048 | 0.7659 |
| 0.1244 | 34.0 | 2588 | 0.5072 | 0.7659 |
| 0.1233 | 35.0 | 2664 | 0.5268 | 0.7653 |
| 0.1251 | 35.99 | 2740 | 0.5202 | 0.7644 |
| 0.1281 | 36.99 | 2816 | 0.5094 | 0.7645 |
| 0.1281 | 37.99 | 2892 | 0.5036 | 0.7657 |
| 0.1266 | 38.99 | 2968 | 0.4802 | 0.7674 |
| 0.1252 | 40.0 | 3045 | 0.4851 | 0.7672 |
| 0.1246 | 41.0 | 3121 | 0.4873 | 0.7680 |
| 0.1246 | 42.0 | 3197 | 0.4734 | 0.7679 |
| 0.1231 | 43.0 | 3273 | 0.4781 | 0.7678 |
| 0.1222 | 43.99 | 3349 | 0.4668 | 0.7682 |
| 0.1235 | 44.99 | 3425 | 0.4828 | 0.7675 |
| 0.1235 | 45.99 | 3501 | 0.4745 | 0.7692 |
| 0.1235 | 46.99 | 3577 | 0.4672 | 0.7687 |
| 0.1215 | 48.0 | 3654 | 0.4720 | 0.7676 |
| 0.1213 | 49.0 | 3730 | 0.4601 | 0.7681 |
| 0.122 | 49.92 | 3800 | 0.4635 | 0.7684 |

### Framework versions

- Transformers 4.34.0
- Pytorch 2.1.0+cu121
- Datasets 2.18.0
- Tokenizers 0.14.1
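The card ships no usage snippet. Below is a minimal loading sketch, assuming the repository stores a PEFT/LoRA adapter (suggested by the `_lora` suffix and the `base_model` metadata); the exact file layout is not confirmed by the card:

```python
# Hypothetical quick-start, assuming PEFT adapter weights in the repo
# and gated access to the meta-llama base model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf", torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(
    base,
    "tyzhu/find_marker_both_sent_train_400_eval_40_meta-llama_Llama-2-7b-hf_3e-4_lora",
)
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
```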
chainup244/google-gemma-7b-1717477313
chainup244
2024-06-04T05:06:36Z
7
0
transformers
[ "transformers", "safetensors", "gemma", "text-generation", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-06-04T05:01:56Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
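Since every usage field in this card is a placeholder, here is a minimal text-generation sketch, assuming the checkpoint loads through the standard `transformers` pipeline (consistent with its `gemma` and `text-generation` tags):

```python
# Hypothetical quick-start; the card itself provides no code.
from transformers import pipeline

generator = pipeline("text-generation", model="chainup244/google-gemma-7b-1717477313")
print(generator("Hello, my name is", max_new_tokens=32)[0]["generated_text"])
```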
martinsinnona/visdecode_vega_3
martinsinnona
2024-06-04T05:06:13Z
49
0
transformers
[ "transformers", "safetensors", "pix2struct", "image-text-to-text", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
image-text-to-text
2024-06-04T04:27:43Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
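As above, the card is an empty template. Here is a minimal inference sketch, assuming the checkpoint follows the standard Pix2Struct image-text-to-text interface (per its `pix2struct` tag); the input image path is hypothetical:

```python
# Hypothetical quick-start; not provided by the card.
from PIL import Image
from transformers import Pix2StructForConditionalGeneration, Pix2StructProcessor

processor = Pix2StructProcessor.from_pretrained("martinsinnona/visdecode_vega_3")
model = Pix2StructForConditionalGeneration.from_pretrained("martinsinnona/visdecode_vega_3")

image = Image.open("chart.png")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(processor.decode(outputs[0], skip_special_tokens=True))
```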
dmavkgo/vilt_finetuned_200
dmavkgo
2024-06-04T05:02:13Z
63
0
transformers
[ "transformers", "safetensors", "vilt", "visual-question-answering", "generated_from_trainer", "dataset:vqa", "base_model:dandelin/vilt-b32-mlm", "base_model:finetune:dandelin/vilt-b32-mlm", "license:apache-2.0", "endpoints_compatible", "region:us" ]
visual-question-answering
2024-06-04T03:32:11Z
---
license: apache-2.0
base_model: dandelin/vilt-b32-mlm
tags:
- generated_from_trainer
datasets:
- vqa
model-index:
- name: vilt_finetuned_200
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# vilt_finetuned_200

This model is a fine-tuned version of [dandelin/vilt-b32-mlm](https://huggingface.co/dandelin/vilt-b32-mlm) on the vqa dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20

### Training results

### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.19.2
- Tokenizers 0.19.1
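The card lists hyperparameters but no usage. A minimal VQA inference sketch follows, assuming the checkpoint keeps the standard ViLT question-answering head (per its `visual-question-answering` tag); the image path and question are placeholders:

```python
# Hypothetical quick-start; not provided by the card.
from PIL import Image
from transformers import ViltForQuestionAnswering, ViltProcessor

processor = ViltProcessor.from_pretrained("dmavkgo/vilt_finetuned_200")
model = ViltForQuestionAnswering.from_pretrained("dmavkgo/vilt_finetuned_200")

image = Image.open("example.jpg")  # placeholder input image
encoding = processor(image, "What is in the picture?", return_tensors="pt")
logits = model(**encoding).logits
print(model.config.id2label[logits.argmax(-1).item()])
```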
FuturisticVibes/Meta-Llama-3-70B-Instruct-abliterated-v3.5-6.0bpw-h8-exl2
FuturisticVibes
2024-06-04T04:58:52Z
5
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "license:llama3", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "6-bit", "exl2", "region:us" ]
text-generation
2024-06-04T04:51:48Z
---
library_name: transformers
license: llama3
---

I have no idea what I’m doing… if this causes the apocalypse someone please let me know.

Meta-Llama-3-70B-Instruct-abliterated-v3.5 6.0bpw h8 EXL2

Includes a [measurement.json](https://huggingface.co/FuturisticVibes/Meta-Llama-3-70B-Instruct-abliterated-v3.5-6.0bpw-h8-exl2/tree/measurement) file for further quantization.

Up next is a new, old, long dead, but never forgotten friend… Assuming I can put enough money into RunPod to rent an H100 for a bit…

Original Model: https://huggingface.co/failspy/Meta-Llama-3-70B-Instruct-abliterated-v3.5

# Original Model Card

# Llama-3-70B-Instruct-abliterated-v3.5 Model Card

[My original Jupyter "cookbook" to replicate the methodology can be found here](https://huggingface.co/failspy/llama-3-70B-Instruct-abliterated/blob/main/ortho_cookbook.ipynb)

[My personal library o' code used](https://github.com/FailSpy/abliterator) (WIP, looking to improve and generalize)

This is [meta-llama/Meta-Llama-3-70B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-70B-Instruct) with orthogonalized bfloat16 safetensor weights, generated with a refined methodology based on the one described in the preview paper/blog post '[Refusal in LLMs is mediated by a single direction](https://www.alignmentforum.org/posts/jGuXSZgv6qfdhMCuJ/refusal-in-llms-is-mediated-by-a-single-direction)', which I encourage you to read to understand more.

## V3.5?

Second try. I felt that the V3 methodology wasn't well applied to 70B, and u/Nexesenex on Reddit kinda confirmed my suspicions. So go blame them. :P

This one has only a single layer modified(!), and that seems to have completely eliminated moralizing disclaimers. I hope you'll find this model better than 70B-V3! This version also fixes the tokenizer.

## Hang on, "abliteration"? Orthogonalization? Ablation? What is this?

TL;DR: This model has had certain weights manipulated to "inhibit" the model's ability to express refusal. It is not in any way _guaranteed_ that it won't refuse you or misunderstand your request; it may still lecture you about ethics/safety, etc. In all other respects it is tuned the same as the original 70B instruct model, just with the strongest refusal directions orthogonalized out.

**TL;TL;DR;DR: It's uncensored in the purest form I can manage -- no new or changed behaviour in any other respect from the original model.**

As for "abliteration": it's just a fun play on words on the original "ablation" term used in the paper to refer to removing features, which I coined to differentiate this model from "uncensored" fine-tunes. Ablate + obliterated = Abliterated.

Anyway, orthogonalization and ablation both refer to the same thing here: the technique by which the refusal feature was "ablated" from the model was orthogonalization.

## A little more on the methodology, and why this is interesting

To me, ablation (or applying the methodology in inverse, "augmentation") seems to be good for inducing or removing very specific features that you'd otherwise have to spend far too many tokens encouraging or discouraging in your system prompt. Instead, you just apply your system prompt in the ablation script against a blank system prompt on the same dataset and orthogonalize for the desired behaviour in the final model weights.

> Why this over fine-tuning?

Ablation is much more surgical in nature while also needing a _lot_ less data than fine-tuning, which I think is its main advantage. Its most valuable aspect is that it keeps as much of the original model's knowledge and training intact while removing its tendency to behave in one very specific undesirable manner (in this case, refusing user requests).

Fine-tuning is still exceptionally useful and the go-to for broad behaviour changes; however, you may be able to get close to your desired behaviour with very few samples using the ablation/augmentation techniques. It may also be a useful step to add to your model-refinement pipeline: orthogonalize -> fine-tune, or vice versa.

I haven't really gotten around to exploring this model stacked with fine-tuning; I encourage others to give it a shot if they've got the capacity.

> Okay, fine, but why V3? There's no V2 70B?

Well, I released a V2 a while back for 8B under Cognitive Computations. It ended up not being worth it to try V2 with 70B; I wanted to refine the model before wasting compute cycles on what might not even be a better model. I am, however, quite pleased with this latest methodology: it seems to have induced fewer hallucinations. So, to show that it's a new fancy methodology even relative to the 8B V2, I decided to do a Microsoft and double up on my version jump because it's *such* an advancement (or so the excuse went; in actuality it was because too many legacy but actively used Microsoft libraries checked for 'Windows 9' in the OS name to detect Windows 95/98).

## Quirkiness awareness notice

This model may come with interesting quirks, as the methodology is so new. I encourage you to play with the model and post any quirks you notice in the community tab, as that'll help us further understand what side effects this orthogonalization has. If you manage to develop further improvements, please share! This is really the most basic way to use ablation, but there are other possibilities that I believe are as yet unexplored.

Additionally, feel free to reach out in any way about this. I'm on the Cognitive Computations Discord, and I'm watching the Community tab -- reach out! I'd love to see this methodology used in other ways, and would gladly support whoever, whenever I can.
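To make "orthogonalized out" concrete, here is a toy sketch of the core projection step (my illustration, not FailSpy's code; it assumes a refusal-direction vector has already been extracted, e.g. from activation differences between harmful and harmless prompts):

```python
# Toy illustration of ablation-by-orthogonalization; not the author's implementation.
import torch

def ablate_direction(W: torch.Tensor, d: torch.Tensor) -> torch.Tensor:
    """Project direction d out of the output space of weight matrix W,
    i.e. return (I - d d^T) W, so W's outputs have no component along d."""
    d = d / d.norm()
    return W - torch.outer(d, d @ W)

W = torch.randn(4096, 4096)  # stand-in for e.g. an attention output projection
d = torch.randn(4096)        # stand-in for an extracted refusal direction
W_ablated = ablate_direction(W, d)
```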
mradermacher/Llama3-13B-lingyang-v1-GGUF
mradermacher
2024-06-04T04:56:59Z
49
0
transformers
[ "transformers", "gguf", "mergekit", "merge", "Llama3", "en", "base_model:wwe180/Llama3-13B-lingyang-v1", "base_model:quantized:wwe180/Llama3-13B-lingyang-v1", "license:other", "endpoints_compatible", "region:us", "conversational" ]
null
2024-06-04T04:10:40Z
---
base_model: wwe180/Llama3-13B-lingyang-v1
language:
- en
library_name: transformers
license:
- other
quantized_by: mradermacher
tags:
- mergekit
- merge
- Llama3
---

## About

<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->

static quants of https://huggingface.co/wwe180/Llama3-13B-lingyang-v1

<!-- provided-files -->

weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.

## Usage

If you are unsure how to use GGUF files, refer to one of [TheBloke's READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for more details, including on how to concatenate multi-part files.

## Provided Quants

(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)

| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Llama3-13B-lingyang-v1-GGUF/resolve/main/Llama3-13B-lingyang-v1.Q2_K.gguf) | Q2_K | 5.2 | |
| [GGUF](https://huggingface.co/mradermacher/Llama3-13B-lingyang-v1-GGUF/resolve/main/Llama3-13B-lingyang-v1.IQ3_XS.gguf) | IQ3_XS | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/Llama3-13B-lingyang-v1-GGUF/resolve/main/Llama3-13B-lingyang-v1.Q3_K_S.gguf) | Q3_K_S | 6.0 | |
| [GGUF](https://huggingface.co/mradermacher/Llama3-13B-lingyang-v1-GGUF/resolve/main/Llama3-13B-lingyang-v1.IQ3_S.gguf) | IQ3_S | 6.0 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Llama3-13B-lingyang-v1-GGUF/resolve/main/Llama3-13B-lingyang-v1.IQ3_M.gguf) | IQ3_M | 6.2 | |
| [GGUF](https://huggingface.co/mradermacher/Llama3-13B-lingyang-v1-GGUF/resolve/main/Llama3-13B-lingyang-v1.Q3_K_M.gguf) | Q3_K_M | 6.6 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Llama3-13B-lingyang-v1-GGUF/resolve/main/Llama3-13B-lingyang-v1.Q3_K_L.gguf) | Q3_K_L | 7.2 | |
| [GGUF](https://huggingface.co/mradermacher/Llama3-13B-lingyang-v1-GGUF/resolve/main/Llama3-13B-lingyang-v1.IQ4_XS.gguf) | IQ4_XS | 7.4 | |
| [GGUF](https://huggingface.co/mradermacher/Llama3-13B-lingyang-v1-GGUF/resolve/main/Llama3-13B-lingyang-v1.Q4_K_S.gguf) | Q4_K_S | 7.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Llama3-13B-lingyang-v1-GGUF/resolve/main/Llama3-13B-lingyang-v1.Q4_K_M.gguf) | Q4_K_M | 8.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Llama3-13B-lingyang-v1-GGUF/resolve/main/Llama3-13B-lingyang-v1.Q5_K_S.gguf) | Q5_K_S | 9.3 | |
| [GGUF](https://huggingface.co/mradermacher/Llama3-13B-lingyang-v1-GGUF/resolve/main/Llama3-13B-lingyang-v1.Q5_K_M.gguf) | Q5_K_M | 9.5 | |
| [GGUF](https://huggingface.co/mradermacher/Llama3-13B-lingyang-v1-GGUF/resolve/main/Llama3-13B-lingyang-v1.Q6_K.gguf) | Q6_K | 11.0 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Llama3-13B-lingyang-v1-GGUF/resolve/main/Llama3-13B-lingyang-v1.Q8_0.gguf) | Q8_0 | 14.2 | fast, best quality |

Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better):

![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)

And here are Artefact2's thoughts on the matter: https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9

## FAQ / Model Request

See https://huggingface.co/mradermacher/model_requests for some answers to questions you might have and/or if you want some other model quantized.

## Thanks

I thank my company, [nethype GmbH](https://www.nethype.de/), for letting me use its servers and providing upgrades to my workstation to enable this work in my free time.

<!-- end -->
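Complementing the linked READMEs, here is a minimal sketch of running one of the provided files, assuming `llama-cpp-python` is installed and the Q4_K_M file from the table above has been downloaded locally:

```python
# Minimal sketch; llama-cpp-python is one common way to run GGUF files locally.
from llama_cpp import Llama

llm = Llama(model_path="Llama3-13B-lingyang-v1.Q4_K_M.gguf", n_ctx=4096)
result = llm("Explain GGUF quantization in one sentence.", max_tokens=64)
print(result["choices"][0]["text"])
```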
Jonathanmfc/Stock-News-Analysis-Distilbert
Jonathanmfc
2024-06-04T04:52:08Z
0
0
null
[ "license:cc-by-nc-4.0", "region:us" ]
null
2024-06-04T04:50:58Z
---
license: cc-by-nc-4.0
---
Jimheaver/T5-text_code_Lora
Jimheaver
2024-06-04T04:52:02Z
0
0
transformers
[ "transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-06-03T12:46:56Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
mssma/ko-solar-10.7b-v0.8
mssma
2024-06-04T04:50:40Z
62
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "ko", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-06-04T04:41:44Z
---
library_name: transformers
license: apache-2.0
language:
- ko
---

# usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

path = "mssma/ko-solar-10.7b-v0.8"
model = AutoModelForCausalLM.from_pretrained(
    path,
    return_dict=True,
    torch_dtype=torch.float16,
    device_map='auto'
)
tokenizer = AutoTokenizer.from_pretrained(path)
```
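A possible follow-up to the snippet above (not in the card): running generation with the loaded model and tokenizer. The prompt is a placeholder:

```python
# Hypothetical continuation of the card's snippet.
inputs = tokenizer("안녕하세요, 자기소개를 해주세요.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```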
vaibhavchavan/flan-t5-small-finetuned-xsum
vaibhavchavan
2024-06-04T04:45:04Z
110
0
transformers
[ "transformers", "tensorboard", "safetensors", "t5", "text2text-generation", "generated_from_trainer", "base_model:google/flan-t5-small", "base_model:finetune:google/flan-t5-small", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text2text-generation
2024-05-30T03:20:29Z
--- license: apache-2.0 base_model: google/flan-t5-small tags: - generated_from_trainer metrics: - rouge model-index: - name: flan-t5-small-finetuned-xsum results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # flan-t5-small-finetuned-xsum This model is a fine-tuned version of [google/flan-t5-small](https://huggingface.co/google/flan-t5-small) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: nan - Rouge1: 3.5714 - Rouge2: 1.2195 - Rougel: 3.5714 - Rougelsum: 3.5714 - Gen Len: 19.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2000 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len | |:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:| | No log | 1.0 | 1 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 2.0 | 2 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 3.0 | 3 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 4.0 | 4 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 5.0 | 5 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 6.0 | 6 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 7.0 | 7 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 8.0 | 8 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 9.0 | 9 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 10.0 | 10 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 11.0 | 11 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 12.0 | 12 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 13.0 | 13 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 14.0 | 14 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 15.0 | 15 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 16.0 | 16 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 17.0 | 17 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 18.0 | 18 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 19.0 | 19 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 20.0 | 20 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 21.0 | 21 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 22.0 | 22 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 23.0 | 23 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 24.0 | 24 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 25.0 | 25 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 26.0 | 26 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 27.0 | 27 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 28.0 | 28 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 29.0 | 29 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 30.0 | 30 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 31.0 | 31 | nan | 
3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 32.0 | 32 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 33.0 | 33 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 34.0 | 34 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 35.0 | 35 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 36.0 | 36 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 37.0 | 37 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 38.0 | 38 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 39.0 | 39 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 40.0 | 40 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 41.0 | 41 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 42.0 | 42 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 43.0 | 43 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 44.0 | 44 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 45.0 | 45 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 46.0 | 46 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 47.0 | 47 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 48.0 | 48 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 49.0 | 49 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 50.0 | 50 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 51.0 | 51 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 52.0 | 52 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 53.0 | 53 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 54.0 | 54 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 55.0 | 55 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 56.0 | 56 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 57.0 | 57 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 58.0 | 58 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 59.0 | 59 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 60.0 | 60 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 61.0 | 61 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 62.0 | 62 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 63.0 | 63 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 64.0 | 64 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 65.0 | 65 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 66.0 | 66 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 67.0 | 67 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 68.0 | 68 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 69.0 | 69 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 70.0 | 70 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 71.0 | 71 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 72.0 | 72 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 73.0 | 73 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 74.0 | 74 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 75.0 | 75 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 76.0 | 76 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 77.0 | 77 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 78.0 | 78 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 79.0 | 79 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 80.0 | 80 | nan | 3.5714 | 1.2195 | 3.5714 | 
3.5714 | 19.0 | | No log | 81.0 | 81 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 82.0 | 82 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 83.0 | 83 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 84.0 | 84 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 85.0 | 85 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 86.0 | 86 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 87.0 | 87 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 88.0 | 88 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 89.0 | 89 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 90.0 | 90 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 91.0 | 91 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 92.0 | 92 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 93.0 | 93 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 94.0 | 94 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 95.0 | 95 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 96.0 | 96 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 97.0 | 97 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 98.0 | 98 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 99.0 | 99 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 100.0 | 100 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 101.0 | 101 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 102.0 | 102 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 103.0 | 103 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 104.0 | 104 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 105.0 | 105 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 106.0 | 106 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 107.0 | 107 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 108.0 | 108 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 109.0 | 109 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 110.0 | 110 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 111.0 | 111 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 112.0 | 112 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 113.0 | 113 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 114.0 | 114 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 115.0 | 115 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 116.0 | 116 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 117.0 | 117 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 118.0 | 118 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 119.0 | 119 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 120.0 | 120 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 121.0 | 121 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 122.0 | 122 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 123.0 | 123 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 124.0 | 124 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 125.0 | 125 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 126.0 | 126 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 127.0 | 127 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 128.0 | 128 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 129.0 | 129 | 
nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 130.0 | 130 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 131.0 | 131 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 132.0 | 132 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 133.0 | 133 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 134.0 | 134 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 135.0 | 135 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 136.0 | 136 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 137.0 | 137 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 138.0 | 138 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 139.0 | 139 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 140.0 | 140 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 141.0 | 141 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 142.0 | 142 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 143.0 | 143 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 144.0 | 144 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 145.0 | 145 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 146.0 | 146 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 147.0 | 147 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 148.0 | 148 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 149.0 | 149 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 150.0 | 150 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 151.0 | 151 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 152.0 | 152 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 153.0 | 153 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 154.0 | 154 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 155.0 | 155 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 156.0 | 156 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 157.0 | 157 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 158.0 | 158 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 159.0 | 159 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 160.0 | 160 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 161.0 | 161 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 162.0 | 162 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 163.0 | 163 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 164.0 | 164 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 165.0 | 165 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 166.0 | 166 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 167.0 | 167 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 168.0 | 168 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 169.0 | 169 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 170.0 | 170 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 171.0 | 171 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 172.0 | 172 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 173.0 | 173 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 174.0 | 174 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 175.0 | 175 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 176.0 | 176 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 177.0 | 177 | 
nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 178.0 | 178 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 179.0 | 179 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 180.0 | 180 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 181.0 | 181 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 182.0 | 182 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 183.0 | 183 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 184.0 | 184 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 185.0 | 185 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 186.0 | 186 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 187.0 | 187 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 188.0 | 188 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 189.0 | 189 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 190.0 | 190 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 191.0 | 191 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 192.0 | 192 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 193.0 | 193 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 194.0 | 194 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 195.0 | 195 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 196.0 | 196 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 197.0 | 197 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 198.0 | 198 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 199.0 | 199 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 200.0 | 200 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 201.0 | 201 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 202.0 | 202 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 203.0 | 203 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 204.0 | 204 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 205.0 | 205 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 206.0 | 206 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 207.0 | 207 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 208.0 | 208 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 209.0 | 209 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 210.0 | 210 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 211.0 | 211 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 212.0 | 212 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 213.0 | 213 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 214.0 | 214 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 215.0 | 215 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 216.0 | 216 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 217.0 | 217 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 218.0 | 218 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 219.0 | 219 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 220.0 | 220 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 221.0 | 221 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 222.0 | 222 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 223.0 | 223 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 224.0 | 224 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | No log | 225.0 | 225 | 
nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| No log | 226.0 | 226 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| No log | 227.0–498.0 | 227–498 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| No log | 499.0 | 499 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 500.0 | 500 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 501.0–1255.0 | 501–1255 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1256.0 | 1256 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 |
1257.0 | 1257 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1258.0 | 1258 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1259.0 | 1259 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1260.0 | 1260 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1261.0 | 1261 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1262.0 | 1262 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1263.0 | 1263 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1264.0 | 1264 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1265.0 | 1265 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1266.0 | 1266 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1267.0 | 1267 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1268.0 | 1268 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1269.0 | 1269 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1270.0 | 1270 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1271.0 | 1271 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1272.0 | 1272 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1273.0 | 1273 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1274.0 | 1274 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1275.0 | 1275 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1276.0 | 1276 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1277.0 | 1277 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1278.0 | 1278 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1279.0 | 1279 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1280.0 | 1280 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1281.0 | 1281 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1282.0 | 1282 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1283.0 | 1283 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1284.0 | 1284 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1285.0 | 1285 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1286.0 | 1286 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1287.0 | 1287 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1288.0 | 1288 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1289.0 | 1289 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1290.0 | 1290 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1291.0 | 1291 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1292.0 | 1292 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1293.0 | 1293 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1294.0 | 1294 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1295.0 | 1295 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1296.0 | 1296 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1297.0 | 1297 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1298.0 | 1298 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1299.0 | 1299 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1300.0 | 1300 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1301.0 | 1301 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1302.0 | 1302 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1303.0 | 1303 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1304.0 | 1304 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1305.0 | 1305 | nan | 3.5714 | 1.2195 | 3.5714 | 
3.5714 | 19.0 | | 0.0 | 1306.0 | 1306 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1307.0 | 1307 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1308.0 | 1308 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1309.0 | 1309 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1310.0 | 1310 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1311.0 | 1311 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1312.0 | 1312 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1313.0 | 1313 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1314.0 | 1314 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1315.0 | 1315 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1316.0 | 1316 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1317.0 | 1317 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1318.0 | 1318 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1319.0 | 1319 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1320.0 | 1320 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1321.0 | 1321 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1322.0 | 1322 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1323.0 | 1323 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1324.0 | 1324 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1325.0 | 1325 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1326.0 | 1326 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1327.0 | 1327 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1328.0 | 1328 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1329.0 | 1329 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1330.0 | 1330 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1331.0 | 1331 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1332.0 | 1332 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1333.0 | 1333 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1334.0 | 1334 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1335.0 | 1335 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1336.0 | 1336 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1337.0 | 1337 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1338.0 | 1338 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1339.0 | 1339 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1340.0 | 1340 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1341.0 | 1341 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1342.0 | 1342 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1343.0 | 1343 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1344.0 | 1344 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1345.0 | 1345 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1346.0 | 1346 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1347.0 | 1347 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1348.0 | 1348 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1349.0 | 1349 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1350.0 | 1350 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1351.0 | 1351 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1352.0 | 1352 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1353.0 | 1353 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1354.0 | 1354 | nan | 
3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1355.0 | 1355 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1356.0 | 1356 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1357.0 | 1357 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1358.0 | 1358 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1359.0 | 1359 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1360.0 | 1360 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1361.0 | 1361 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1362.0 | 1362 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1363.0 | 1363 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1364.0 | 1364 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1365.0 | 1365 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1366.0 | 1366 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1367.0 | 1367 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1368.0 | 1368 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1369.0 | 1369 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1370.0 | 1370 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1371.0 | 1371 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1372.0 | 1372 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1373.0 | 1373 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1374.0 | 1374 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1375.0 | 1375 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1376.0 | 1376 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1377.0 | 1377 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1378.0 | 1378 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1379.0 | 1379 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1380.0 | 1380 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1381.0 | 1381 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1382.0 | 1382 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1383.0 | 1383 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1384.0 | 1384 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1385.0 | 1385 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1386.0 | 1386 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1387.0 | 1387 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1388.0 | 1388 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1389.0 | 1389 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1390.0 | 1390 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1391.0 | 1391 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1392.0 | 1392 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1393.0 | 1393 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1394.0 | 1394 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1395.0 | 1395 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1396.0 | 1396 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1397.0 | 1397 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1398.0 | 1398 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1399.0 | 1399 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1400.0 | 1400 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1401.0 | 1401 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1402.0 | 1402 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 
1403.0 | 1403 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1404.0 | 1404 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1405.0 | 1405 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1406.0 | 1406 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1407.0 | 1407 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1408.0 | 1408 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1409.0 | 1409 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1410.0 | 1410 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1411.0 | 1411 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1412.0 | 1412 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1413.0 | 1413 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1414.0 | 1414 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1415.0 | 1415 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1416.0 | 1416 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1417.0 | 1417 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1418.0 | 1418 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1419.0 | 1419 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1420.0 | 1420 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1421.0 | 1421 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1422.0 | 1422 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1423.0 | 1423 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1424.0 | 1424 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1425.0 | 1425 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1426.0 | 1426 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1427.0 | 1427 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1428.0 | 1428 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1429.0 | 1429 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1430.0 | 1430 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1431.0 | 1431 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1432.0 | 1432 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1433.0 | 1433 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1434.0 | 1434 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1435.0 | 1435 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1436.0 | 1436 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1437.0 | 1437 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1438.0 | 1438 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1439.0 | 1439 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1440.0 | 1440 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1441.0 | 1441 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1442.0 | 1442 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1443.0 | 1443 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1444.0 | 1444 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1445.0 | 1445 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1446.0 | 1446 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1447.0 | 1447 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1448.0 | 1448 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1449.0 | 1449 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1450.0 | 1450 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1451.0 | 1451 | nan | 3.5714 | 1.2195 | 3.5714 | 
3.5714 | 19.0 | | 0.0 | 1452.0 | 1452 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1453.0 | 1453 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1454.0 | 1454 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1455.0 | 1455 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1456.0 | 1456 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1457.0 | 1457 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1458.0 | 1458 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1459.0 | 1459 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1460.0 | 1460 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1461.0 | 1461 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1462.0 | 1462 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1463.0 | 1463 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1464.0 | 1464 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1465.0 | 1465 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1466.0 | 1466 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1467.0 | 1467 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1468.0 | 1468 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1469.0 | 1469 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1470.0 | 1470 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1471.0 | 1471 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1472.0 | 1472 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1473.0 | 1473 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1474.0 | 1474 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1475.0 | 1475 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1476.0 | 1476 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1477.0 | 1477 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1478.0 | 1478 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1479.0 | 1479 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1480.0 | 1480 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1481.0 | 1481 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1482.0 | 1482 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1483.0 | 1483 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1484.0 | 1484 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1485.0 | 1485 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1486.0 | 1486 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1487.0 | 1487 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1488.0 | 1488 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1489.0 | 1489 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1490.0 | 1490 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1491.0 | 1491 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1492.0 | 1492 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1493.0 | 1493 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1494.0 | 1494 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1495.0 | 1495 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1496.0 | 1496 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1497.0 | 1497 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1498.0 | 1498 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1499.0 | 1499 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1500.0 | 1500 | nan | 
3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1501.0 | 1501 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1502.0 | 1502 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1503.0 | 1503 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1504.0 | 1504 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1505.0 | 1505 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1506.0 | 1506 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1507.0 | 1507 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1508.0 | 1508 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1509.0 | 1509 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1510.0 | 1510 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1511.0 | 1511 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1512.0 | 1512 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1513.0 | 1513 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1514.0 | 1514 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1515.0 | 1515 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1516.0 | 1516 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1517.0 | 1517 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1518.0 | 1518 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1519.0 | 1519 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1520.0 | 1520 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1521.0 | 1521 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1522.0 | 1522 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1523.0 | 1523 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1524.0 | 1524 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1525.0 | 1525 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1526.0 | 1526 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1527.0 | 1527 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1528.0 | 1528 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1529.0 | 1529 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1530.0 | 1530 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1531.0 | 1531 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1532.0 | 1532 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1533.0 | 1533 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1534.0 | 1534 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1535.0 | 1535 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1536.0 | 1536 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1537.0 | 1537 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1538.0 | 1538 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1539.0 | 1539 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1540.0 | 1540 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1541.0 | 1541 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1542.0 | 1542 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1543.0 | 1543 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1544.0 | 1544 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1545.0 | 1545 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1546.0 | 1546 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1547.0 | 1547 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1548.0 | 1548 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 
1549.0 | 1549 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1550.0 | 1550 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1551.0 | 1551 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1552.0 | 1552 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1553.0 | 1553 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1554.0 | 1554 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1555.0 | 1555 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1556.0 | 1556 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1557.0 | 1557 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1558.0 | 1558 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1559.0 | 1559 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1560.0 | 1560 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1561.0 | 1561 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1562.0 | 1562 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1563.0 | 1563 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1564.0 | 1564 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1565.0 | 1565 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1566.0 | 1566 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1567.0 | 1567 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1568.0 | 1568 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1569.0 | 1569 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1570.0 | 1570 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1571.0 | 1571 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1572.0 | 1572 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1573.0 | 1573 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1574.0 | 1574 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1575.0 | 1575 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1576.0 | 1576 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1577.0 | 1577 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1578.0 | 1578 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1579.0 | 1579 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1580.0 | 1580 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1581.0 | 1581 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1582.0 | 1582 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1583.0 | 1583 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1584.0 | 1584 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1585.0 | 1585 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1586.0 | 1586 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1587.0 | 1587 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1588.0 | 1588 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1589.0 | 1589 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1590.0 | 1590 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1591.0 | 1591 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1592.0 | 1592 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1593.0 | 1593 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1594.0 | 1594 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1595.0 | 1595 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1596.0 | 1596 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1597.0 | 1597 | nan | 3.5714 | 1.2195 | 3.5714 | 
3.5714 | 19.0 | | 0.0 | 1598.0 | 1598 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1599.0 | 1599 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1600.0 | 1600 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1601.0 | 1601 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1602.0 | 1602 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1603.0 | 1603 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1604.0 | 1604 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1605.0 | 1605 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1606.0 | 1606 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1607.0 | 1607 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1608.0 | 1608 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1609.0 | 1609 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1610.0 | 1610 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1611.0 | 1611 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1612.0 | 1612 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1613.0 | 1613 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1614.0 | 1614 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1615.0 | 1615 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1616.0 | 1616 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1617.0 | 1617 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1618.0 | 1618 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1619.0 | 1619 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1620.0 | 1620 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1621.0 | 1621 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1622.0 | 1622 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1623.0 | 1623 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1624.0 | 1624 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1625.0 | 1625 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1626.0 | 1626 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1627.0 | 1627 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1628.0 | 1628 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1629.0 | 1629 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1630.0 | 1630 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1631.0 | 1631 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1632.0 | 1632 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1633.0 | 1633 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1634.0 | 1634 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1635.0 | 1635 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1636.0 | 1636 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1637.0 | 1637 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1638.0 | 1638 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1639.0 | 1639 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1640.0 | 1640 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1641.0 | 1641 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1642.0 | 1642 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1643.0 | 1643 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1644.0 | 1644 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1645.0 | 1645 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1646.0 | 1646 | nan | 
3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1647.0 | 1647 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1648.0 | 1648 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1649.0 | 1649 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1650.0 | 1650 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1651.0 | 1651 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1652.0 | 1652 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1653.0 | 1653 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1654.0 | 1654 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1655.0 | 1655 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1656.0 | 1656 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1657.0 | 1657 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1658.0 | 1658 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1659.0 | 1659 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1660.0 | 1660 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1661.0 | 1661 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1662.0 | 1662 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1663.0 | 1663 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1664.0 | 1664 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1665.0 | 1665 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1666.0 | 1666 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1667.0 | 1667 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1668.0 | 1668 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1669.0 | 1669 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1670.0 | 1670 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1671.0 | 1671 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1672.0 | 1672 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1673.0 | 1673 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1674.0 | 1674 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1675.0 | 1675 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1676.0 | 1676 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1677.0 | 1677 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1678.0 | 1678 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1679.0 | 1679 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1680.0 | 1680 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1681.0 | 1681 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1682.0 | 1682 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1683.0 | 1683 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1684.0 | 1684 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1685.0 | 1685 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1686.0 | 1686 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1687.0 | 1687 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1688.0 | 1688 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1689.0 | 1689 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1690.0 | 1690 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1691.0 | 1691 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1692.0 | 1692 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1693.0 | 1693 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1694.0 | 1694 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 
1695.0 | 1695 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1696.0 | 1696 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1697.0 | 1697 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1698.0 | 1698 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1699.0 | 1699 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1700.0 | 1700 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1701.0 | 1701 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1702.0 | 1702 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1703.0 | 1703 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1704.0 | 1704 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1705.0 | 1705 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1706.0 | 1706 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1707.0 | 1707 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1708.0 | 1708 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1709.0 | 1709 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1710.0 | 1710 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1711.0 | 1711 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1712.0 | 1712 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1713.0 | 1713 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1714.0 | 1714 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1715.0 | 1715 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1716.0 | 1716 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1717.0 | 1717 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1718.0 | 1718 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1719.0 | 1719 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1720.0 | 1720 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1721.0 | 1721 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1722.0 | 1722 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1723.0 | 1723 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1724.0 | 1724 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1725.0 | 1725 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1726.0 | 1726 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1727.0 | 1727 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1728.0 | 1728 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1729.0 | 1729 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1730.0 | 1730 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1731.0 | 1731 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1732.0 | 1732 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1733.0 | 1733 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1734.0 | 1734 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1735.0 | 1735 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1736.0 | 1736 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1737.0 | 1737 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1738.0 | 1738 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1739.0 | 1739 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1740.0 | 1740 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1741.0 | 1741 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1742.0 | 1742 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1743.0 | 1743 | nan | 3.5714 | 1.2195 | 3.5714 | 
3.5714 | 19.0 | | 0.0 | 1744.0 | 1744 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1745.0 | 1745 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1746.0 | 1746 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1747.0 | 1747 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1748.0 | 1748 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1749.0 | 1749 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1750.0 | 1750 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1751.0 | 1751 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1752.0 | 1752 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1753.0 | 1753 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1754.0 | 1754 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1755.0 | 1755 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1756.0 | 1756 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1757.0 | 1757 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1758.0 | 1758 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1759.0 | 1759 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1760.0 | 1760 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1761.0 | 1761 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1762.0 | 1762 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1763.0 | 1763 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1764.0 | 1764 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1765.0 | 1765 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1766.0 | 1766 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1767.0 | 1767 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1768.0 | 1768 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1769.0 | 1769 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1770.0 | 1770 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1771.0 | 1771 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1772.0 | 1772 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1773.0 | 1773 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1774.0 | 1774 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1775.0 | 1775 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1776.0 | 1776 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1777.0 | 1777 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1778.0 | 1778 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1779.0 | 1779 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1780.0 | 1780 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1781.0 | 1781 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1782.0 | 1782 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1783.0 | 1783 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1784.0 | 1784 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1785.0 | 1785 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1786.0 | 1786 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1787.0 | 1787 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1788.0 | 1788 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1789.0 | 1789 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1790.0 | 1790 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1791.0 | 1791 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1792.0 | 1792 | nan | 
3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1793.0 | 1793 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1794.0 | 1794 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1795.0 | 1795 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1796.0 | 1796 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1797.0 | 1797 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1798.0 | 1798 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1799.0 | 1799 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1800.0 | 1800 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1801.0 | 1801 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1802.0 | 1802 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1803.0 | 1803 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1804.0 | 1804 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1805.0 | 1805 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1806.0 | 1806 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1807.0 | 1807 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1808.0 | 1808 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1809.0 | 1809 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1810.0 | 1810 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1811.0 | 1811 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1812.0 | 1812 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1813.0 | 1813 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1814.0 | 1814 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1815.0 | 1815 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1816.0 | 1816 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1817.0 | 1817 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1818.0 | 1818 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1819.0 | 1819 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1820.0 | 1820 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1821.0 | 1821 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1822.0 | 1822 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1823.0 | 1823 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1824.0 | 1824 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1825.0 | 1825 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1826.0 | 1826 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1827.0 | 1827 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1828.0 | 1828 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1829.0 | 1829 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1830.0 | 1830 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1831.0 | 1831 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1832.0 | 1832 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1833.0 | 1833 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1834.0 | 1834 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1835.0 | 1835 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1836.0 | 1836 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1837.0 | 1837 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1838.0 | 1838 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1839.0 | 1839 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1840.0 | 1840 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 
1841.0 | 1841 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1842.0 | 1842 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1843.0 | 1843 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1844.0 | 1844 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1845.0 | 1845 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1846.0 | 1846 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1847.0 | 1847 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1848.0 | 1848 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1849.0 | 1849 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1850.0 | 1850 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1851.0 | 1851 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1852.0 | 1852 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1853.0 | 1853 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1854.0 | 1854 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1855.0 | 1855 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1856.0 | 1856 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1857.0 | 1857 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1858.0 | 1858 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1859.0 | 1859 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1860.0 | 1860 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1861.0 | 1861 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1862.0 | 1862 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1863.0 | 1863 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1864.0 | 1864 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1865.0 | 1865 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1866.0 | 1866 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1867.0 | 1867 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1868.0 | 1868 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1869.0 | 1869 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1870.0 | 1870 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1871.0 | 1871 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1872.0 | 1872 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1873.0 | 1873 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1874.0 | 1874 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1875.0 | 1875 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1876.0 | 1876 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1877.0 | 1877 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1878.0 | 1878 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1879.0 | 1879 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1880.0 | 1880 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1881.0 | 1881 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1882.0 | 1882 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1883.0 | 1883 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1884.0 | 1884 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1885.0 | 1885 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1886.0 | 1886 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1887.0 | 1887 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1888.0 | 1888 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1889.0 | 1889 | nan | 3.5714 | 1.2195 | 3.5714 | 
3.5714 | 19.0 | | 0.0 | 1890.0 | 1890 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1891.0 | 1891 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1892.0 | 1892 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1893.0 | 1893 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1894.0 | 1894 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1895.0 | 1895 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1896.0 | 1896 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1897.0 | 1897 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1898.0 | 1898 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1899.0 | 1899 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1900.0 | 1900 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1901.0 | 1901 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1902.0 | 1902 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1903.0 | 1903 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1904.0 | 1904 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1905.0 | 1905 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1906.0 | 1906 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1907.0 | 1907 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1908.0 | 1908 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1909.0 | 1909 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1910.0 | 1910 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1911.0 | 1911 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1912.0 | 1912 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1913.0 | 1913 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1914.0 | 1914 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1915.0 | 1915 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1916.0 | 1916 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1917.0 | 1917 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1918.0 | 1918 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1919.0 | 1919 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1920.0 | 1920 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1921.0 | 1921 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1922.0 | 1922 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1923.0 | 1923 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1924.0 | 1924 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1925.0 | 1925 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1926.0 | 1926 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1927.0 | 1927 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1928.0 | 1928 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1929.0 | 1929 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1930.0 | 1930 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1931.0 | 1931 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1932.0 | 1932 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1933.0 | 1933 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1934.0 | 1934 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1935.0 | 1935 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1936.0 | 1936 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1937.0 | 1937 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 | | 0.0 | 1938.0 | 1938 | nan | 
3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1939.0 | 1939 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1940.0 | 1940 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1941.0 | 1941 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1942.0 | 1942 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1943.0 | 1943 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1944.0 | 1944 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1945.0 | 1945 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1946.0 | 1946 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1947.0 | 1947 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1948.0 | 1948 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1949.0 | 1949 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1950.0 | 1950 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1951.0 | 1951 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1952.0 | 1952 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1953.0 | 1953 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1954.0 | 1954 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1955.0 | 1955 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1956.0 | 1956 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1957.0 | 1957 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1958.0 | 1958 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1959.0 | 1959 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1960.0 | 1960 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1961.0 | 1961 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1962.0 | 1962 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1963.0 | 1963 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1964.0 | 1964 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1965.0 | 1965 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1966.0 | 1966 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1967.0 | 1967 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1968.0 | 1968 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1969.0 | 1969 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1970.0 | 1970 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1971.0 | 1971 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1972.0 | 1972 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1973.0 | 1973 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1974.0 | 1974 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1975.0 | 1975 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1976.0 | 1976 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1977.0 | 1977 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1978.0 | 1978 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1979.0 | 1979 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1980.0 | 1980 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1981.0 | 1981 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1982.0 | 1982 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1983.0 | 1983 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1984.0 | 1984 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1985.0 | 1985 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1986.0 | 1986 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1987.0 | 1987 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1988.0 | 1988 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1989.0 | 1989 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1990.0 | 1990 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1991.0 | 1991 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1992.0 | 1992 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1993.0 | 1993 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1994.0 | 1994 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1995.0 | 1995 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1996.0 | 1996 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1997.0 | 1997 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1998.0 | 1998 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 1999.0 | 1999 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |
| 0.0 | 2000.0 | 2000 | nan | 3.5714 | 1.2195 | 3.5714 | 3.5714 | 19.0 |

### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.19.2
- Tokenizers 0.19.1
mssma/ko-solar-10.7b-v0.7
mssma
2024-06-04T04:41:12Z
61
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "ko", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-06-04T04:30:55Z
---
library_name: transformers
license: apache-2.0
language:
- ko
---

# usage

```
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

path = "mssma/ko-solar-10.7b-v0.7"
model = AutoModelForCausalLM.from_pretrained(
    path,
    return_dict=True,
    torch_dtype=torch.float16,
    device_map='auto'
)
tokenizer = AutoTokenizer.from_pretrained(path)
```
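A quick generation call can follow the loading code above. This is a minimal sketch assuming the `model` and `tokenizer` from the card; the Korean prompt and sampling settings are illustrative placeholders, not from the original card:

```python
# Hypothetical follow-up to the card's loading code; the prompt and the
# generation settings are illustrative, not from the card.
prompt = "안녕하세요, 자기소개를 해주세요."  # placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```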
Ariffiq99/CRAB_COPA_KUCI_Bert_Base_Uncased_finetuned
Ariffiq99
2024-06-04T04:38:09Z
105
0
transformers
[ "transformers", "tensorboard", "safetensors", "bert", "multiple-choice", "generated_from_trainer", "base_model:Ariffiq99/COPA_KUCI_Bert_Base_Uncased_Finetuned", "base_model:finetune:Ariffiq99/COPA_KUCI_Bert_Base_Uncased_Finetuned", "license:apache-2.0", "endpoints_compatible", "region:us" ]
multiple-choice
2024-06-04T03:39:36Z
---
license: apache-2.0
base_model: Ariffiq99/COPA_KUCI_Bert_Base_Uncased_Finetuned
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: CRAB_COPA_KUCI_Bert_Base_Uncased_finetuned
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# CRAB_COPA_KUCI_Bert_Base_Uncased_finetuned

This model is a fine-tuned version of [Ariffiq99/COPA_KUCI_Bert_Base_Uncased_Finetuned](https://huggingface.co/Ariffiq99/COPA_KUCI_Bert_Base_Uncased_Finetuned) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7686
- F1: 0.7694

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 8

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 1.0747 | 1.0 | 2880 | 0.9424 | 0.7014 |
| 0.9502 | 2.0 | 5760 | 0.8660 | 0.7167 |
| 0.8039 | 3.0 | 8640 | 0.7995 | 0.7278 |
| 0.7633 | 4.0 | 11520 | 0.8053 | 0.7333 |
| 0.7705 | 5.0 | 14400 | 0.8241 | 0.75 |
| 0.8075 | 6.0 | 17280 | 0.7628 | 0.7667 |
| 0.6885 | 7.0 | 20160 | 0.7813 | 0.7708 |
| 0.6746 | 8.0 | 23040 | 0.7686 | 0.7694 |

### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.19.2
- Tokenizers 0.19.1
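The card ships no inference snippet, so here is a minimal sketch of querying a BERT multiple-choice head with 🤗 Transformers; the premise/choice pair is a made-up COPA-style example, not from the card:

```python
# Minimal multiple-choice inference sketch; the example texts are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForMultipleChoice

model_id = "Ariffiq99/CRAB_COPA_KUCI_Bert_Base_Uncased_finetuned"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMultipleChoice.from_pretrained(model_id)

premise = "The man broke his toe."  # hypothetical COPA-style premise
choices = ["He got a hole in his sock.", "He dropped a hammer on his foot."]

# Pair the premise with each candidate answer; the model scores every pair.
enc = tokenizer([premise] * len(choices), choices, return_tensors="pt", padding=True)
enc = {k: v.unsqueeze(0) for k, v in enc.items()}  # shape: (1, num_choices, seq_len)

with torch.no_grad():
    logits = model(**enc).logits  # shape: (1, num_choices)
print(choices[logits.argmax(-1).item()])
```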
mradermacher/Machroom-3B-model_stock-GGUF
mradermacher
2024-06-04T04:36:07Z
6
0
transformers
[ "transformers", "gguf", "mergekit", "merge", "en", "license:apache-2.0", "endpoints_compatible", "region:us", "conversational" ]
null
2024-06-04T04:25:44Z
---
base_model: DreadPoor/Machroom-3B-model_stock
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags:
- mergekit
- merge
---

## About

<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->

static quants of https://huggingface.co/DreadPoor/Machroom-3B-model_stock

<!-- provided-files -->

weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.

## Usage

If you are unsure how to use GGUF files, refer to one of [TheBloke's READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for more details, including on how to concatenate multi-part files.

## Provided Quants

(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)

| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Machroom-3B-model_stock-GGUF/resolve/main/Machroom-3B-model_stock.Q2_K.gguf) | Q2_K | 1.2 | |
| [GGUF](https://huggingface.co/mradermacher/Machroom-3B-model_stock-GGUF/resolve/main/Machroom-3B-model_stock.IQ3_XS.gguf) | IQ3_XS | 1.3 | |
| [GGUF](https://huggingface.co/mradermacher/Machroom-3B-model_stock-GGUF/resolve/main/Machroom-3B-model_stock.IQ3_S.gguf) | IQ3_S | 1.4 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Machroom-3B-model_stock-GGUF/resolve/main/Machroom-3B-model_stock.Q3_K_S.gguf) | Q3_K_S | 1.4 | |
| [GGUF](https://huggingface.co/mradermacher/Machroom-3B-model_stock-GGUF/resolve/main/Machroom-3B-model_stock.IQ3_M.gguf) | IQ3_M | 1.4 | |
| [GGUF](https://huggingface.co/mradermacher/Machroom-3B-model_stock-GGUF/resolve/main/Machroom-3B-model_stock.Q3_K_M.gguf) | Q3_K_M | 1.5 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Machroom-3B-model_stock-GGUF/resolve/main/Machroom-3B-model_stock.Q3_K_L.gguf) | Q3_K_L | 1.6 | |
| [GGUF](https://huggingface.co/mradermacher/Machroom-3B-model_stock-GGUF/resolve/main/Machroom-3B-model_stock.IQ4_XS.gguf) | IQ4_XS | 1.6 | |
| [GGUF](https://huggingface.co/mradermacher/Machroom-3B-model_stock-GGUF/resolve/main/Machroom-3B-model_stock.Q4_K_S.gguf) | Q4_K_S | 1.7 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Machroom-3B-model_stock-GGUF/resolve/main/Machroom-3B-model_stock.Q4_K_M.gguf) | Q4_K_M | 1.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Machroom-3B-model_stock-GGUF/resolve/main/Machroom-3B-model_stock.Q5_K_S.gguf) | Q5_K_S | 2.0 | |
| [GGUF](https://huggingface.co/mradermacher/Machroom-3B-model_stock-GGUF/resolve/main/Machroom-3B-model_stock.Q5_K_M.gguf) | Q5_K_M | 2.1 | |
| [GGUF](https://huggingface.co/mradermacher/Machroom-3B-model_stock-GGUF/resolve/main/Machroom-3B-model_stock.Q6_K.gguf) | Q6_K | 2.4 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Machroom-3B-model_stock-GGUF/resolve/main/Machroom-3B-model_stock.Q8_0.gguf) | Q8_0 | 3.1 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/Machroom-3B-model_stock-GGUF/resolve/main/Machroom-3B-model_stock.f16.gguf) | f16 | 5.7 | 16 bpw, overkill |

Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better):

![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9

## FAQ / Model Request

See https://huggingface.co/mradermacher/model_requests for some answers to questions you might have and/or if you want some other model quantized.

## Thanks

I thank my company, [nethype GmbH](https://www.nethype.de/), for letting me use its servers and providing upgrades to my workstation to enable this work in my free time.

<!-- end -->
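For readers who want a scriptable alternative to GUI apps, a minimal llama-cpp-python sketch could look like the following; the local filename, context size, and sampling settings are assumptions, and any of the quants in the table above can be substituted:

```python
# Minimal sketch, assuming llama-cpp-python is installed and the Q4_K_M file
# has been downloaded locally; the path and settings are illustrative.
from llama_cpp import Llama

llm = Llama(
    model_path="Machroom-3B-model_stock.Q4_K_M.gguf",  # local path to a quant
    n_ctx=4096,        # context length; adjust to your memory budget
    n_gpu_layers=-1,   # offload all layers to GPU if one is available
)
out = llm("Write a haiku about quantization.", max_tokens=64)
print(out["choices"][0]["text"])
```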
Zoyd/nyunai_nyun-llama3-62B-5_0bpw_exl2
Zoyd
2024-06-04T04:34:37Z
5
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "license:llama3", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "5-bit", "exl2", "region:us" ]
text-generation
2024-06-03T22:38:53Z
---
license: llama3
---

**Exllamav2** quant (**exl2** / **5.0 bpw**) made with ExLlamaV2 v0.1.3

Other EXL2 quants:

| **Quant** | **Model Size** | **lm_head** |
| ----- | ---------- | ------- |
|<center>**[2.2](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-2_2bpw_exl2)**</center> | <center>18625 MB</center> | <center>6</center> |
|<center>**[2.5](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-2_5bpw_exl2)**</center> | <center>20645 MB</center> | <center>6</center> |
|<center>**[3.0](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-3_0bpw_exl2)**</center> | <center>24211 MB</center> | <center>6</center> |
|<center>**[3.5](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-3_5bpw_exl2)**</center> | <center>27784 MB</center> | <center>6</center> |
|<center>**[3.75](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-3_75bpw_exl2)**</center> | <center>29572 MB</center> | <center>6</center> |
|<center>**[4.0](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-4_0bpw_exl2)**</center> | <center>31359 MB</center> | <center>6</center> |
|<center>**[4.25](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-4_25bpw_exl2)**</center> | <center>33139 MB</center> | <center>6</center> |
|<center>**[5.0](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-5_0bpw_exl2)**</center> | <center>38500 MB</center> | <center>6</center> |
|<center>**[6.0](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-6_0bpw_exl2)**</center> | <center>45805 MB</center> | <center>8</center> |
|<center>**[6.5](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-6_5bpw_exl2)**</center> | <center>49410 MB</center> | <center>8</center> |
|<center>**[8.0](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-8_0bpw_exl2)**</center> | <center>54655 MB</center> | <center>8</center> |
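The card lists no loading code; a sketch along the lines of the ExLlamaV2 examples might look like this. The local path, sampling settings, and multi-GPU assumptions are mine, not the uploader's, so treat this as a starting point rather than the canonical recipe:

```python
# Minimal sketch, assuming the exllamav2 package is installed and the quant
# directory has been downloaded locally; paths and settings are illustrative.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "nyunai_nyun-llama3-62B-5_0bpw_exl2"  # local model directory
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # split the weights across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7

print(generator.generate_simple("The capital of France is", settings, 50))
```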
hdve/google-gemma-2b-1717475491
hdve
2024-06-04T04:33:55Z
141
0
transformers
[ "transformers", "safetensors", "gemma", "text-generation", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-06-04T04:31:33Z
---
library_name: transformers
tags: []
---

# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]
impuneetg/gpt2-wikitext2
impuneetg
2024-06-04T04:31:23Z
138
0
transformers
[ "transformers", "tensorboard", "safetensors", "gpt2", "text-generation", "generated_from_trainer", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-06-01T23:35:00Z
---
tags:
- generated_from_trainer
model-index:
- name: gpt2-wikitext2
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# gpt2-wikitext2

This model was trained from scratch on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1674

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 100
- num_epochs: 5
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 1.4222 | 1.0 | 2916 | 0.7671 |
| 0.7233 | 2.0 | 5832 | 0.5135 |
| 0.4245 | 3.0 | 8748 | 0.2865 |
| 0.2039 | 4.0 | 11664 | 0.1831 |
| 0.1082 | 5.0 | 14580 | 0.1674 |

### Framework versions

- Transformers 4.41.2
- Pytorch 2.2.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
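For readers who want to reproduce a comparable setup, the listed hyperparameters map onto 🤗 `TrainingArguments` roughly as follows. This is a sketch, not the author's actual script; the output path and any argument not listed on the card are assumptions:

```python
# Rough mapping of the card's hyperparameters onto TrainingArguments;
# output_dir and anything not listed on the card are assumptions.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="gpt2-wikitext2",      # assumed
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,    # total train batch size 128
    seed=42,
    lr_scheduler_type="cosine",
    warmup_steps=100,
    num_train_epochs=5,
    fp16=True,                        # "Native AMP" mixed precision
)
```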
nyunai/nyun-c1-llama3-62B
nyunai
2024-06-04T04:25:09Z
7
7
transformers
[ "transformers", "safetensors", "llama", "text-generation", "license:llama3", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-05-31T05:09:07Z
---
license: llama3
---

# 🔹 Key Highlights:

- 12% Fewer Parameters: nyun-llama3-62B comprises approximately 12% fewer parameters than the popular Llama-3-70B.
- Intact Performance: Despite having fewer parameters, our model performs on par with, and occasionally outperforms, the Llama-3-70B.
- No Fine-Tuning Required: This model undergoes no fine-tuning, showcasing the raw potential of our optimization techniques.

## Pipeline and Collaboration

For insights into the pipeline and the list of methods used to optimize these models, check out our PruneGPT repository (https://github.com/nyunAI/PruneGPT). We invite companies and organizations interested in joining forces with us to release more such open-source variants to reach out at contact@nyunai.com.

### Model Performance

| Dataset | Nyun-Llama3-62B | Meta-Llama3-70B | Meta-Llama2-70B | MBZUAI K2-65B |
| --- | --- | --- | --- | --- |
| MMLU (5-shot) | 78.9 | 79.5 | 69.7 | 67.9 |
| Winogrande (5-shot) | 83.3 | 83.1 | 81.8 | 77.0 |
| BoolQ (0-shot) | 85.3 | 79.0 | 73.1 | 83.0 |
| Hellaswag (10-shot) | 85.8 | 88.0 | 86.9 | 85.5 |
| Arc Challenge (25-shot) | 65.9 | 68.8 | 67.2 | 64.8 |
| GSM8K (5-shot) | 70.9 | 76.9 | 52.6 | 50.2 |
| Average | 78.4 | 79.2 | 71.9 | 71.4 |

- **Developed by:** [Nyun AI](https://nyunai.com/)
- **Repository:** [Github](https://github.com/nyunAI/PruneGPT)
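Since the card gives no loading snippet, the pruned checkpoint should load like any other 🤗 Transformers causal LM; the dtype and device mapping below are my assumptions for a 62B-parameter model, not instructions from the authors:

```python
# Minimal loading sketch; torch_dtype and device_map are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nyunai/nyun-c1-llama3-62B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # shard the 62B weights across available GPUs
)
```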
MubarakB/T7KGvt4x8LnHYdJN9MQ0
MubarakB
2024-06-04T04:21:09Z
0
0
peft
[ "peft", "safetensors", "arxiv:1910.09700", "base_model:NousResearch/Llama-2-7b-chat-hf", "base_model:adapter:NousResearch/Llama-2-7b-chat-hf", "region:us" ]
null
2024-06-04T04:21:05Z
---
library_name: peft
base_model: NousResearch/Llama-2-7b-chat-hf
---

# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]

### Framework versions

- PEFT 0.11.1
SEHYONG/Llama-3-Open-Ko-8B-Instruct-kookmin7
SEHYONG
2024-06-04T04:11:42Z
7
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "text-generation-inference", "unsloth", "trl", "conversational", "en", "base_model:SEHYONG/Llama-3-Open-Ko-8B-Instruct-kookmin6", "base_model:finetune:SEHYONG/Llama-3-Open-Ko-8B-Instruct-kookmin6", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-generation
2024-06-04T04:05:37Z
---
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
base_model: SEHYONG/Llama-3-Open-Ko-8B-Instruct-kookmin6
---

# Uploaded model

- **Developed by:** SEHYONG
- **License:** apache-2.0
- **Finetuned from model:** SEHYONG/Llama-3-Open-Ko-8B-Instruct-kookmin6

This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.

[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
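Loading a checkpoint like this through Unsloth itself might look like the sketch below; the sequence length and 4-bit loading are my assumptions, not settings stated on the card:

```python
# Minimal sketch using Unsloth's FastLanguageModel; max_seq_length and
# load_in_4bit are assumptions, not stated on the card.
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="SEHYONG/Llama-3-Open-Ko-8B-Instruct-kookmin7",
    max_seq_length=4096,
    load_in_4bit=True,
)
FastLanguageModel.for_inference(model)  # switch to fast inference mode
```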
Rudra360/Emoji_Suggester
Rudra360
2024-06-04T04:09:27Z
0
0
spacy
[ "spacy", "en", "region:us" ]
null
2024-06-03T14:17:44Z
---
language:
- en
library_name: spacy
---

# Emoji Suggester

Emoji Suggester is a tool designed to recommend relevant emojis based on incoming messages from social media apps, enhancing expressiveness and engagement in your conversations. The suggestions are powered by a model trained on a dataset of Twitter messages.

## Table of Contents

- [Installation](#installation)
- [Usage](#usage)
- [Contributing](#contributing)
- [License](#license)
- [Contact](#contact)

## Installation

To install Emoji Suggester, follow these steps:

1. Clone the repository:

```bash
git clone https://huggingface.co/Rudra360/Emoji_Suggester
```

or

```bash
git clone git@huggingface.co:Rudra360/Emoji_Suggester.git
```

## Usage

1. Change into the project directory:

```bash
cd Emoji_Suggester
```

2. Then run the following script:

```python
from util import predict

message = "I'm so happy today!"
suggested_emojis = predict(message)
print(suggested_emojis)
```
hdve/Qwen-Qwen1.5-7B-1717473930
hdve
2024-06-04T04:08:43Z
7
0
transformers
[ "transformers", "safetensors", "qwen2", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-06-04T04:06:13Z
---
library_name: transformers
tags: []
---

# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]
andikazf15/IndoBERT-QA-product-pred
andikazf15
2024-06-04T04:08:06Z
36
0
transformers
[ "transformers", "tensorboard", "safetensors", "bert", "question-answering", "generated_from_trainer", "base_model:rizquuula/mBERT-IndoSQuADv2_1691852742-16-2e-06-0.01-5", "base_model:finetune:rizquuula/mBERT-IndoSQuADv2_1691852742-16-2e-06-0.01-5", "license:apache-2.0", "endpoints_compatible", "region:us" ]
question-answering
2024-06-03T03:02:51Z
---
license: apache-2.0
base_model: rizquuula/mBERT-IndoSQuADv2_1691852742-16-2e-06-0.01-5
tags:
- generated_from_trainer
model-index:
- name: IndoBERT-QA-product-pred
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# IndoBERT-QA-product-pred

This model is a fine-tuned version of [rizquuula/mBERT-IndoSQuADv2_1691852742-16-2e-06-0.01-5](https://huggingface.co/rizquuula/mBERT-IndoSQuADv2_1691852742-16-2e-06-0.01-5) on the None dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP

### Training results

### Framework versions

- Transformers 4.41.1
- Pytorch 2.3.0+cu121
- Datasets 2.19.2
- Tokenizers 0.19.1
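A minimal way to try the checkpoint is the question-answering pipeline; the Indonesian question/context pair below is an illustrative placeholder, not an example from the card:

```python
# Minimal QA inference sketch; the question and context are assumptions.
from transformers import pipeline

qa = pipeline("question-answering", model="andikazf15/IndoBERT-QA-product-pred")
result = qa(
    question="Berapa kapasitas baterai produk ini?",
    context="Produk ini memiliki baterai 5000 mAh dan layar 6,5 inci.",
)
print(result["answer"], result["score"])
```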
chainup244/Qwen-Qwen1.5-7B-1717473432
chainup244
2024-06-04T04:04:37Z
7
0
transformers
[ "transformers", "safetensors", "qwen2", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-06-04T03:57:17Z
---
library_name: transformers
tags: []
---

# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]
cgus/AlchemistCoder-DS-6.7B-exl2
cgus
2024-06-04T03:59:59Z
5
0
transformers
[ "transformers", "llama", "text-generation", "code generation", "conversational", "arxiv:2405.19265", "base_model:internlm/AlchemistCoder-DS-6.7B", "base_model:quantized:internlm/AlchemistCoder-DS-6.7B", "license:apache-2.0", "autotrain_compatible", "4-bit", "exl2", "region:us" ]
text-generation
2024-06-03T23:59:24Z
---
license: apache-2.0
base_model: internlm/AlchemistCoder-DS-6.7B
inference: false
tags:
- code generation
---

# AlchemistCoder-DS-6.7B-exl2

Original model: [AlchemistCoder-DS-6.7B](https://huggingface.co/internlm/AlchemistCoder-DS-6.7B)
Model creator: [InternLM](https://huggingface.co/internlm)

## Quants

[4bpw h6 (main)](https://huggingface.co/cgus/AlchemistCoder-DS-6.7B-exl2/tree/main)
[4.25bpw h6](https://huggingface.co/cgus/AlchemistCoder-DS-6.7B-exl2/tree/4.25bpw-h6)
[4.65bpw h6](https://huggingface.co/cgus/AlchemistCoder-DS-6.7B-exl2/tree/4.65bpw-h6)
[5bpw h6](https://huggingface.co/cgus/AlchemistCoder-DS-6.7B-exl2/tree/5bpw-h6)
[6bpw h6](https://huggingface.co/cgus/AlchemistCoder-DS-6.7B-exl2/tree/6bpw-h6)
[8bpw h8](https://huggingface.co/cgus/AlchemistCoder-DS-6.7B-exl2/tree/8bpw-h8)

## Quantization notes

Made with Exllamav2 0.1.3 with the default dataset.

## How to run

This model is meant to be used with the Exllamav2 loader, which requires the model to be fully loaded into GPU VRAM. It primarily requires an Nvidia RTX card on Windows/Linux or an AMD card on Linux. If you want to use this model but your system doesn't meet these requirements, you should look for GGUF versions of the model.

It can be used with apps like:
[Text Generation Webui](https://github.com/oobabooga/text-generation-webui)
[KoboldAI](https://github.com/henk717/KoboldAI)
[ExUI](https://github.com/turboderp/exui)
[lollms-webui](https://github.com/ParisNeo/lollms-webui)

# Original model card

# AlchemistCoder: Harmonizing and Eliciting Code Capability by Hindsight Tuning on Multi-source Data

[[🤗 HuggingFace](https://huggingface.co/internlm/AlchemistCoder-DS-6.7B)] [[📃 Paper](https://arxiv.org/abs/2405.19265)] [[🌐 Project Page](https://internlm.github.io/AlchemistCoder/)]

## ✨ Highlights

> **Abstract:** *Open-source Large Language Models (LLMs) and their specialized variants, particularly Code LLMs, have recently delivered impressive performance. However, previous Code LLMs are typically fine-tuned on single-source data with limited quality and diversity, which may insufficiently elicit the potential of pre-trained Code LLMs. In this paper, we present AlchemistCoder, a series of Code LLMs with enhanced code generation and generalization capabilities fine-tuned on multi-source data. To achieve this, we pioneer to unveil inherent conflicts among the various styles and qualities in multi-source code corpora and introduce data-specific prompts with hindsight relabeling, termed AlchemistPrompts, to harmonize different data sources and instruction-response pairs. Additionally, we propose incorporating the data construction process into the fine-tuning data as code comprehension tasks, including instruction evolution, data filtering, and code review. Extensive experiments demonstrate that AlchemistCoder holds a clear lead among all models of the same size (6.7B/7B) and rivals or even surpasses larger models (15B/33B/70B), showcasing the efficacy of our method in refining instruction-following capabilities and advancing the boundaries of code intelligence.*

- **AlchemistPrompts**: Designed as data-specific prompts for harmonizing inherent conflicts in multi-source data and mitigating the instruction/response misalignment at a fine-grained level.
- **Code Comprehension Tasks**: Sourced from the process of data construction, consisting of instruction evolution, data filtering, and code review.
- **Harmonized Multi-source Data**: Instruction tuned on 200M tokens, including 6 types of high-quality data.
- **Superior Model Performance**: Surpassing all the open-source models of the same size (6.7/7B), and rivaling or even beating larger models (15B/33B/70B/ChatGPT) on 6 code benchmarks.
- **Advanced generic capabilities**: Demonstrated by the significant improvements on MMLU, BBH, and GSM8K.

## 🚀 Quick Start

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("internlm/AlchemistCoder-DS-6.7B", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("internlm/AlchemistCoder-DS-6.7B", trust_remote_code=True, torch_dtype=torch.bfloat16).cuda()
model = model.eval()

input_text = "Implement the Dijkstra algorithm in Python"
inputs = tokenizer(input_text, return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, max_length=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The above code will generate the following result:

```python
import heapq

def dijkstra(graph, start, end):
    """
    Find the shortest path between two nodes in a graph using Dijkstra's algorithm.

    Args:
        graph (dict): A dictionary representing the graph, where the keys are nodes and the values
            are dictionaries containing the neighboring nodes and their edge weights.
        start: The starting node.
        end: The target node.

    Returns:
        list: The shortest path from the start node to the target node as a list of nodes.

    Raises:
        ValueError: If either the start or end node is not present in the graph.
    """
    # Check if start and end nodes are in the graph
    if start not in graph:
        raise ValueError("Start node is not present in the graph.")
    if end not in graph:
        raise ValueError("End node is not present in the graph.")

    # Initialize the distance dictionary with infinite distances for all nodes
    distances = {node: float('inf') for node in graph}

    # Set the distance of the start node to 0
    distances[start] = 0

    # Initialize the heap with the start node
    heap = [(0, start)]

    # Initialize the previous dictionary to keep track of the path
    previous = {}

    while heap:
        # Pop the node with the smallest distance from the heap
        current_distance, current_node = heapq.heappop(heap)

        # If the current node is the end node, we have found the shortest path
        if current_node == end:
            # Reconstruct the path by following the previous nodes
            path = []
            while current_node in previous:
                path.append(current_node)
                current_node = previous[current_node]
            path.append(start)
            # Reverse the path to get the correct order
            return path[::-1]

        # Iterate over the neighboring nodes and update their distances
        for neighbor, weight in graph[current_node].items():
            new_distance = current_distance + weight
            # If a shorter path is found, update the distance and previous node
            if new_distance < distances[neighbor]:
                distances[neighbor] = new_distance
                previous[neighbor] = current_node
                heapq.heappush(heap, (new_distance, neighbor))

    # If there is no path between the start and end nodes, return an empty list
    return []
```

> The `dijkstra` function takes three arguments: `graph`, `start`, and `end`. The `graph` argument is a dictionary representing the graph, where the keys are nodes and the values are dictionaries containing the neighboring nodes and their edge weights. The `start` argument is the starting node, and the `end` argument is the target node.

> The function first checks if the start and end nodes are present in the graph. If either node is not present, a `ValueError` is raised.

> The function then initializes a `distances` dictionary with infinite distances for all nodes. It sets the distance of the start node to 0. It also initializes a heap with the start node and a `previous` dictionary to keep track of the path.

> The algorithm then iterates over the nodes in the heap. For each node, it checks if it is the end node. If it is, the function reconstructs the path by following the previous nodes and returns the shortest path as a list of nodes in the correct order.

> If the current node is not the end node, the algorithm iterates over its neighboring nodes and updates their distances if a shorter path is found. It also updates the `previous` dictionary to keep track of the path.

> If there is no path between the start and end nodes, the function returns an empty list.

> Note that this implementation assumes that the graph is a directed graph, and it uses a heap data structure to efficiently select the node with the smallest distance at each step.

## 🧪 Evaluation and Fine-tune

Please refer to [**AlchemistCoder**](https://github.com/InternLM/AlchemistCoder) and [**InternLM**](https://github.com/InternLM/InternLM/tree/main).

## 😃 Acknowledgments

*AlchemistCoder* is built with [**InternLM**](https://github.com/InternLM) and [**OpenCompass**](https://github.com/open-compass). Thanks for their awesome work!

## 📧 Contact

If you have any questions, please create an issue on this repository or contact us at:

- sugger@tongji.edu.cn
- zhangwenwei@pjlab.org.cn

## 🌟 Citation

If you find our work useful, please consider citing:

```bibtex
@misc{song2024alchemistcoder,
      title={AlchemistCoder: Harmonizing and Eliciting Code Capability by Hindsight Tuning on Multi-source Data},
      author={Zifan Song and Yudong Wang and Wenwei Zhang and Kuikun Liu and Chengqi Lyu and Demin Song and Qipeng Guo and Hang Yan and Dahua Lin and Kai Chen and Cairong Zhao},
      year={2024},
      eprint={2405.19265},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
deewuok/sentiment-lora
deewuok
2024-06-04T03:44:18Z
0
0
transformers
[ "transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-06-04T03:43:16Z
---
library_name: transformers
tags: []
---

# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]
cgihlstorf/llama2-13b32_1_0.0003_sequential
cgihlstorf
2024-06-04T03:41:18Z
0
0
peft
[ "peft", "arxiv:1910.09700", "base_model:meta-llama/Llama-2-13b-hf", "base_model:adapter:meta-llama/Llama-2-13b-hf", "region:us" ]
null
2024-06-04T03:40:19Z
---
library_name: peft
base_model: meta-llama/Llama-2-13b-hf
---

# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]

### Framework versions

- PEFT 0.10.0
Zoyd/nyunai_nyun-llama3-62B-2_5bpw_exl2
Zoyd
2024-06-04T03:41:15Z
5
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "license:llama3", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "exl2", "region:us" ]
text-generation
2024-06-03T14:47:50Z
--- license: llama3 --- **Exllamav2** quant (**exl2** / **2.5 bpw**) made with ExLlamaV2 v0.1.3 Other EXL2 quants: | **Quant** | **Model Size** | **lm_head** | | ----- | ---------- | ------- | |<center>**[2.2](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-2_2bpw_exl2)**</center> | <center>18625 MB</center> | <center>6</center> | |<center>**[2.5](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-2_5bpw_exl2)**</center> | <center>20645 MB</center> | <center>6</center> | |<center>**[3.0](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-3_0bpw_exl2)**</center> | <center>24211 MB</center> | <center>6</center> | |<center>**[3.5](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-3_5bpw_exl2)**</center> | <center>27784 MB</center> | <center>6</center> | |<center>**[3.75](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-3_75bpw_exl2)**</center> | <center>29572 MB</center> | <center>6</center> | |<center>**[4.0](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-4_0bpw_exl2)**</center> | <center>31359 MB</center> | <center>6</center> | |<center>**[4.25](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-4_25bpw_exl2)**</center> | <center>33139 MB</center> | <center>6</center> | |<center>**[5.0](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-5_0bpw_exl2)**</center> | <center>38500 MB</center> | <center>6</center> | |<center>**[6.0](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-6_0bpw_exl2)**</center> | <center>45805 MB</center> | <center>8</center> | |<center>**[6.5](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-6_5bpw_exl2)**</center> | <center>49410 MB</center> | <center>8</center> | |<center>**[8.0](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-8_0bpw_exl2)**</center> | <center>54655 MB</center> | <center>8</center> |
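A minimal generation sketch for one of these quants, assuming exllamav2's example-style Python API as of v0.1.x; the local directory, sampler settings, and prompt below are placeholders, not part of this card:

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Point at a local download of the quantized repo (placeholder path).
config = ExLlamaV2Config()
config.model_dir = "nyunai_nyun-llama3-62B-2_5bpw_exl2"
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)   # allocate cache lazily while layers load
model.load_autosplit(cache)                # split the model across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
print(generator.generate_simple("The key idea of quantization is", settings, num_tokens=64))
```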
srbdtwentyfour/mystery-llama-3-8b-v2
srbdtwentyfour
2024-06-04T03:39:26Z
0
0
transformers
[ "transformers", "safetensors", "text-generation-inference", "unsloth", "llama", "trl", "en", "base_model:unsloth/llama-3-8b-Instruct-bnb-4bit", "base_model:finetune:unsloth/llama-3-8b-Instruct-bnb-4bit", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2024-06-03T08:18:31Z
--- language: - en license: apache-2.0 tags: - text-generation-inference - transformers - unsloth - llama - trl base_model: unsloth/llama-3-8b-Instruct-bnb-4bit --- # Uploaded model - **Developed by:** srbdtwentyfour - **License:** apache-2.0 - **Finetuned from model :** unsloth/llama-3-8b-Instruct-bnb-4bit This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
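A minimal inference sketch, assuming the standard 🤗 Transformers API and that the checkpoint ships a Llama-3 chat template; the prompt is a placeholder:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "srbdtwentyfour/mystery-llama-3-8b-v2"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo, torch_dtype=torch.bfloat16, device_map="auto")

# Instruct checkpoints ship a chat template; apply it rather than passing raw strings.
messages = [{"role": "user", "content": "Summarize what a LoRA adapter is in one sentence."}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
output = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```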
hienbm/llama-3-8b-bnb-4bit_mtast
hienbm
2024-06-04T03:34:23Z
0
0
transformers
[ "transformers", "safetensors", "text-generation-inference", "unsloth", "llama", "trl", "en", "base_model:unsloth/llama-3-8b-bnb-4bit", "base_model:finetune:unsloth/llama-3-8b-bnb-4bit", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2024-05-28T05:26:48Z
--- language: - en license: apache-2.0 tags: - text-generation-inference - transformers - unsloth - llama - trl base_model: unsloth/llama-3-8b-bnb-4bit --- # Uploaded model - **Developed by:** hienbm - **License:** apache-2.0 - **Finetuned from model :** unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
quinnb/llama_train
quinnb
2024-06-04T03:31:55Z
79
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "trl", "sft", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "4-bit", "bitsandbytes", "region:us" ]
text-generation
2024-06-04T03:27:42Z
--- library_name: transformers tags: - trl - sft --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
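Given the `4-bit`/`bitsandbytes` tags above, a minimal loading sketch, assuming the standard Transformers + bitsandbytes API; the NF4 settings are illustrative defaults, not taken from this card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Quantize on load; compute dtype and quant type are illustrative choices.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
tokenizer = AutoTokenizer.from_pretrained("quinnb/llama_train")
model = AutoModelForCausalLM.from_pretrained(
    "quinnb/llama_train",
    quantization_config=bnb_config,
    device_map="auto",
)
```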
Ariffiq99/CRAB_COPA_KUCI_xlm_roberta_base_finetuned
Ariffiq99
2024-06-04T03:25:02Z
6
0
transformers
[ "transformers", "tensorboard", "safetensors", "xlm-roberta", "multiple-choice", "generated_from_trainer", "base_model:Ariffiq99/COPA_KUCI_xlm_roberta_base_finetuned", "base_model:finetune:Ariffiq99/COPA_KUCI_xlm_roberta_base_finetuned", "license:mit", "endpoints_compatible", "region:us" ]
multiple-choice
2024-06-04T02:57:25Z
--- license: mit base_model: Ariffiq99/COPA_KUCI_xlm_roberta_base_finetuned tags: - generated_from_trainer metrics: - f1 model-index: - name: CRAB_COPA_KUCI_xlm_roberta_base_finetuned results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # CRAB_COPA_KUCI_xlm_roberta_base_finetuned This model is a fine-tuned version of [Ariffiq99/COPA_KUCI_xlm_roberta_base_finetuned](https://huggingface.co/Ariffiq99/COPA_KUCI_xlm_roberta_base_finetuned) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 1.1600 - F1: 0.7417 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 1 - eval_batch_size: 1 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 8 ### Training results | Training Loss | Epoch | Step | Validation Loss | F1 | |:-------------:|:-----:|:-----:|:---------------:|:------:| | 1.2245 | 1.0 | 2880 | 0.9044 | 0.6875 | | 1.1396 | 2.0 | 5760 | 1.0192 | 0.7042 | | 1.039 | 3.0 | 8640 | 1.1395 | 0.7222 | | 0.8411 | 4.0 | 11520 | 1.1650 | 0.7389 | | 0.7471 | 5.0 | 14400 | 1.1235 | 0.7361 | | 0.9344 | 6.0 | 17280 | 1.1646 | 0.7375 | | 0.7564 | 7.0 | 20160 | 1.0863 | 0.7417 | | 0.7116 | 8.0 | 23040 | 1.1600 | 0.7417 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.3.0+cu121 - Datasets 2.19.2 - Tokenizers 0.19.1
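A minimal inference sketch for the multiple-choice head, assuming the standard Transformers API; the COPA-style premise and choices are invented for illustration:

```python
import torch
from transformers import AutoTokenizer, AutoModelForMultipleChoice

repo = "Ariffiq99/CRAB_COPA_KUCI_xlm_roberta_base_finetuned"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForMultipleChoice.from_pretrained(repo)

premise = "The man broke his toe."  # illustrative COPA-style item
choices = ["He got a hole in his sock.", "He dropped a hammer on his foot."]

# Encode (premise, choice) pairs, then add a batch dimension: (1, num_choices, seq_len).
enc = tokenizer([premise] * len(choices), choices, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(**{k: v.unsqueeze(0) for k, v in enc.items()}).logits
print(choices[logits.argmax(-1).item()])
```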
Zoyd/nyunai_nyun-llama3-62B-6_0bpw_exl2
Zoyd
2024-06-04T03:19:34Z
5
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "license:llama3", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "6-bit", "exl2", "region:us" ]
text-generation
2024-06-03T23:58:04Z
--- license: llama3 --- **Exllamav2** quant (**exl2** / **6.0 bpw**) made with ExLlamaV2 v0.1.3 Other EXL2 quants: | **Quant** | **Model Size** | **lm_head** | | ----- | ---------- | ------- | |<center>**[2.2](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-2_2bpw_exl2)**</center> | <center>18625 MB</center> | <center>6</center> | |<center>**[2.5](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-2_5bpw_exl2)**</center> | <center>20645 MB</center> | <center>6</center> | |<center>**[3.0](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-3_0bpw_exl2)**</center> | <center>24211 MB</center> | <center>6</center> | |<center>**[3.5](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-3_5bpw_exl2)**</center> | <center>27784 MB</center> | <center>6</center> | |<center>**[3.75](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-3_75bpw_exl2)**</center> | <center>29572 MB</center> | <center>6</center> | |<center>**[4.0](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-4_0bpw_exl2)**</center> | <center>31359 MB</center> | <center>6</center> | |<center>**[4.25](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-4_25bpw_exl2)**</center> | <center>33139 MB</center> | <center>6</center> | |<center>**[5.0](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-5_0bpw_exl2)**</center> | <center>38500 MB</center> | <center>6</center> | |<center>**[6.0](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-6_0bpw_exl2)**</center> | <center>45805 MB</center> | <center>8</center> | |<center>**[6.5](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-6_5bpw_exl2)**</center> | <center>49410 MB</center> | <center>8</center> | |<center>**[8.0](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-8_0bpw_exl2)**</center> | <center>54655 MB</center> | <center>8</center> |
k707peepee/llama-3-8b-bnb-4bit
k707peepee
2024-06-04T03:15:55Z
4
0
transformers
[ "transformers", "gguf", "llama", "text-generation-inference", "unsloth", "en", "base_model:unsloth/llama-3-8b-bnb-4bit", "base_model:quantized:unsloth/llama-3-8b-bnb-4bit", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2024-06-04T03:07:55Z
--- language: - en license: apache-2.0 tags: - text-generation-inference - transformers - unsloth - llama - gguf base_model: unsloth/llama-3-8b-bnb-4bit --- # Uploaded model - **Developed by:** k707peepee - **License:** apache-2.0 - **Finetuned from model :** unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
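Since the repo carries a `gguf` tag, a minimal loading sketch, assuming llama-cpp-python's `Llama.from_pretrained` helper; the filename glob and context size are placeholders:

```python
from llama_cpp import Llama

# Pick the actual .gguf file from the repo; "*.gguf" is a placeholder glob.
llm = Llama.from_pretrained(
    repo_id="k707peepee/llama-3-8b-bnb-4bit",
    filename="*.gguf",
    n_ctx=4096,
)
print(llm("Q: What is quantization? A:", max_tokens=64)["choices"][0]["text"])
```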
fullnonstop/random_mask_brushnet_ckpt_sdxl_v0
fullnonstop
2024-06-04T03:12:10Z
0
0
null
[ "license:apache-2.0", "region:us" ]
null
2024-05-31T13:01:15Z
--- license: apache-2.0 ---
HuggingFaceFW/ablation-exp-dedup-global_minhash-350BT
HuggingFaceFW
2024-06-04T03:10:19Z
5
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-06-03T23:35:11Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
Zoyd/nyunai_nyun-llama3-62B-4_25bpw_exl2
Zoyd
2024-06-04T03:08:17Z
5
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "license:llama3", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "exl2", "region:us" ]
text-generation
2024-06-03T21:15:12Z
--- license: llama3 --- **Exllamav2** quant (**exl2** / **4.25 bpw**) made with ExLlamaV2 v0.1.3 Other EXL2 quants: | **Quant** | **Model Size** | **lm_head** | | ----- | ---------- | ------- | |<center>**[2.2](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-2_2bpw_exl2)**</center> | <center>18625 MB</center> | <center>6</center> | |<center>**[2.5](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-2_5bpw_exl2)**</center> | <center>20645 MB</center> | <center>6</center> | |<center>**[3.0](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-3_0bpw_exl2)**</center> | <center>24211 MB</center> | <center>6</center> | |<center>**[3.5](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-3_5bpw_exl2)**</center> | <center>27784 MB</center> | <center>6</center> | |<center>**[3.75](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-3_75bpw_exl2)**</center> | <center>29572 MB</center> | <center>6</center> | |<center>**[4.0](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-4_0bpw_exl2)**</center> | <center>31359 MB</center> | <center>6</center> | |<center>**[4.25](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-4_25bpw_exl2)**</center> | <center>33139 MB</center> | <center>6</center> | |<center>**[5.0](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-5_0bpw_exl2)**</center> | <center>38500 MB</center> | <center>6</center> | |<center>**[6.0](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-6_0bpw_exl2)**</center> | <center>45805 MB</center> | <center>8</center> | |<center>**[6.5](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-6_5bpw_exl2)**</center> | <center>49410 MB</center> | <center>8</center> | |<center>**[8.0](https://huggingface.co/Zoyd/nyunai_nyun-llama3-62B-8_0bpw_exl2)**</center> | <center>54655 MB</center> | <center>8</center> |
Abhinay45/outputs
Abhinay45
2024-06-04T03:08:04Z
0
0
peft
[ "peft", "tensorboard", "safetensors", "trl", "sft", "unsloth", "generated_from_trainer", "dataset:yahma/alpaca-cleaned", "base_model:unsloth/llama-3-8b-bnb-4bit", "base_model:adapter:unsloth/llama-3-8b-bnb-4bit", "license:llama2", "region:us" ]
null
2024-06-04T03:05:36Z
--- license: llama2 library_name: peft tags: - trl - sft - unsloth - generated_from_trainer base_model: unsloth/llama-3-8b-bnb-4bit datasets: - yahma/alpaca-cleaned model-index: - name: Alpaca + Llama-3 8b Unsloth 2x faster finetuning. results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Alpaca + Llama-3 8b Unsloth 2x faster finetuning. This model is a fine-tuned version of [unsloth/llama-3-8b-bnb-4bit](https://huggingface.co/unsloth/llama-3-8b-bnb-4bit) on the alpaca dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 2 - eval_batch_size: 8 - seed: 3407 - gradient_accumulation_steps: 4 - total_train_batch_size: 8 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 5 - training_steps: 60 - mixed_precision_training: Native AMP ### Training results ### Framework versions - PEFT 0.11.1 - Transformers 4.41.1 - Pytorch 2.3.0+cu121 - Datasets 2.19.2 - Tokenizers 0.19.1
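A minimal sketch for attaching this LoRA adapter, assuming PEFT's `AutoPeftModelForCausalLM` (which reads the base model from the adapter config); dtype and device settings are illustrative:

```python
import torch
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer

# Loads the base model recorded in the adapter config, then attaches the LoRA weights.
model = AutoPeftModelForCausalLM.from_pretrained(
    "Abhinay45/outputs",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("Abhinay45/outputs")
```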
ryota39/Tora-7B-v0.2
ryota39
2024-06-04T02:59:17Z
9
1
transformers
[ "transformers", "safetensors", "mistral", "text-generation", "license:cc-by-nc-4.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-05-06T05:44:27Z
---
license: cc-by-nc-4.0
---

## License

Released under a non-commercial license.

## Chat Vector

```
Tora-7B-v0.2 = NTQAI/chatntq-ja-7b-v1.0 + (NousResearch/Hermes-2-Pro-Mistral-7B - mistralai/Mistral-7B-v0.1)
```

## Implementation

The model was built with the code below, adapted from the implementation by @jovyan.

```python
import torch
from transformers import AutoModelForCausalLM

def build_chat_vector_model(
    base_model_name,
    inst_model_name,
    target_model_name,
    skip_layers,
):
    base_model = AutoModelForCausalLM.from_pretrained(
        base_model_name,
        torch_dtype=torch.bfloat16,
        device_map="cpu",
    )
    inst_model = AutoModelForCausalLM.from_pretrained(
        inst_model_name,
        torch_dtype=torch.bfloat16,
        device_map="cpu",
    )
    target_model = AutoModelForCausalLM.from_pretrained(
        target_model_name,
        torch_dtype=torch.bfloat16,
        device_map="cuda",
    )

    # English base model
    for k, v in base_model.state_dict().items():
        print(k, v.shape)

    # Japanese continually pretrained model
    for k, v in target_model.state_dict().items():
        print(k, v.shape)

    for k, v in target_model.state_dict().items():
        # skip the embedding/lm_head layers (passed in as skip_layers) and all layernorms
        if (k in skip_layers) or ("layernorm" in k):
            continue
        chat_vector = inst_model.state_dict()[k] - base_model.state_dict()[k]
        new_v = v + chat_vector.to(v.device)
        v.copy_(new_v)

    target_model.save_pretrained("./chat_model")
    return

if __name__ == '__main__':
    base_model_name = "mistralai/Mistral-7B-v0.1"
    inst_model_name = "NousResearch/Hermes-2-Pro-Mistral-7B"
    target_model_name = "NTQAI/chatntq-ja-7b-v1.0"
    skip_layers = ["model.embed_tokens.weight", "lm_head.weight"]

    build_chat_vector_model(
        base_model_name=base_model_name,
        inst_model_name=inst_model_name,
        target_model_name=target_model_name,
        skip_layers=skip_layers
    )
```

## Benchmark (Japanese MT bench)

|model|category|score|ver|
|:---|:---|:---|:---|
|Tora-7B-v0.2|Writing|3.8|single-turn|
|Tora-7B-v0.2|Roleplay|7.1|single-turn|
|Tora-7B-v0.2|Reasoning|6.3|single-turn|
|Tora-7B-v0.2|Math|3.0|single-turn|
|Tora-7B-v0.2|Coding|2.2|single-turn|
|Tora-7B-v0.2|Extraction|6.6|single-turn|
|Tora-7B-v0.2|STEM|7.2|single-turn|
|Tora-7B-v0.2|Humanities|8.2|single-turn|

![image/png](https://cdn-uploads.huggingface.co/production/uploads/651e3f30ca333f3c8df692b8/_CBS90NRrYUMXzsFC1LIV.png)

## Acknowledgments

Many thanks to @jovyan for writing the Chat Vector article.

## Reference

[Turning a Japanese LLM into a chat model with Chat Vector](https://qiita.com/jovyan/items/ee6affa5ee5bdaada6b4)
turnipseason/latext5
turnipseason
2024-06-04T02:58:53Z
108
0
transformers
[ "transformers", "safetensors", "mt5", "text2text-generation", "math", "normalization", "ru", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text2text-generation
2024-05-26T02:36:06Z
---
license: mit
language:
- ru
library_name: transformers
pipeline_tag: text2text-generation
tags:
- math
- normalization
---

### Description:

This is a model for normalizing Russian-language texts containing mathematical entities into LaTeX, based on the [cointegrated/rut5-small](https://huggingface.co/cointegrated/rut5-small) paraphraser. The model was created by finetuning the paraphraser on a translated & augmented "[Mathematics Stack Exchange API Q&A Data](https://zenodo.org/records/1414384)" dataset.

### Usage example:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
from IPython.display import display, Latex

model_dir = "turnipseason/latext5"
model = AutoModelForSeq2SeqLM.from_pretrained(model_dir)
tokenizer = AutoTokenizer.from_pretrained(model_dir)
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model.to(device)

def get_latex(text):
    inputs = tokenizer(text, return_tensors='pt').to(device)
    with torch.no_grad():
        hypotheses = model.generate(
            **inputs,
            do_sample=True,
            num_return_sequences=1,
            repetition_penalty=1.2,
            max_length=len(text),
            num_beams=10,
            early_stopping=True
        )
    for h in hypotheses:
        display(Latex(tokenizer.decode(h, skip_special_tokens=True)))

# Input (Russian, spelled-out math): "lowercase lambda squared minus three equals ten y cubed;
# also, sine of x equals the integral of the exponential up to three y prime"
text = '''лямбда прописная квадрат минус три равно десять игрек куб
При этом шинус икс равен интеграл от экспоненты до трёх игрек штрих'''
get_latex(text)
```
yzhuang/Mistral-7B-Instruct-v0.1_fictional_arc_challenge_English_v1
yzhuang
2024-06-04T02:58:32Z
6
0
transformers
[ "transformers", "tensorboard", "safetensors", "mistral", "text-generation", "trl", "sft", "generated_from_trainer", "conversational", "dataset:generator", "base_model:mistralai/Mistral-7B-Instruct-v0.1", "base_model:finetune:mistralai/Mistral-7B-Instruct-v0.1", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-06-04T00:48:16Z
--- license: apache-2.0 base_model: mistralai/Mistral-7B-Instruct-v0.1 tags: - trl - sft - generated_from_trainer datasets: - generator model-index: - name: Mistral-7B-Instruct-v0.1_fictional_arc_challenge_English_v1 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Mistral-7B-Instruct-v0.1_fictional_arc_challenge_English_v1 This model is a fine-tuned version of [mistralai/Mistral-7B-Instruct-v0.1](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1) on the generator dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 1 - eval_batch_size: 2 - seed: 42 - gradient_accumulation_steps: 16 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 72 ### Training results ### Framework versions - Transformers 4.39.3 - Pytorch 2.2.2 - Datasets 2.18.0 - Tokenizers 0.15.2
John6666/pony-pencil-sdxl
John6666
2024-06-04T02:57:30Z
19
1
diffusers
[ "diffusers", "safetensors", "text-to-image", "stable-diffusion", "stable-diffusion-xl", "anime", "license:other", "autotrain_compatible", "endpoints_compatible", "diffusers:StableDiffusionXLPipeline", "region:us" ]
text-to-image
2024-05-24T12:12:06Z
--- license: other license_name: faipl-1.0-sd license_link: https://freedevproject.org/faipl-1.0-sd/ tags: - text-to-image - stable-diffusion - stable-diffusion-xl - anime --- Original model is [here](https://huggingface.co/bluepen5805/pony_pencil-XL).
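A minimal text-to-image sketch, assuming the standard Diffusers SDXL pipeline; the prompt and step count are illustrative:

```python
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "John6666/pony-pencil-sdxl",
    torch_dtype=torch.float16,
).to("cuda")

image = pipe("1girl, pencil sketch style, looking at viewer", num_inference_steps=28).images[0]
image.save("pony_pencil_sample.png")
```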
GuiTap/xlm-roberta-base-finetuned-ner-lenerBr
GuiTap
2024-06-04T02:57:09Z
3
0
transformers
[ "transformers", "tensorboard", "safetensors", "xlm-roberta", "token-classification", "generated_from_trainer", "dataset:lener_br", "base_model:FacebookAI/xlm-roberta-base", "base_model:finetune:FacebookAI/xlm-roberta-base", "license:mit", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
token-classification
2024-06-03T00:48:45Z
--- license: mit base_model: FacebookAI/xlm-roberta-base tags: - generated_from_trainer datasets: - lener_br metrics: - precision - recall - f1 - accuracy model-index: - name: xlm-roberta-base-finetuned-ner-lenerBr results: - task: name: Token Classification type: token-classification dataset: name: lener_br type: lener_br config: lener_br split: validation args: lener_br metrics: - name: Precision type: precision value: 0.7397260273972602 - name: Recall type: recall value: 0.9211682605324373 - name: F1 type: f1 value: 0.8205364337515828 - name: Accuracy type: accuracy value: 0.970340819101409 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # xlm-roberta-base-finetuned-ner-lenerBr This model is a fine-tuned version of [FacebookAI/xlm-roberta-base](https://huggingface.co/FacebookAI/xlm-roberta-base) on the lener_br dataset. It achieves the following results on the evaluation set: - Loss: 0.1294 - Precision: 0.7397 - Recall: 0.9212 - F1: 0.8205 - Accuracy: 0.9703 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:| | No log | 1.0 | 245 | 0.1569 | 0.7358 | 0.7788 | 0.7567 | 0.9534 | | No log | 2.0 | 490 | 0.1310 | 0.6909 | 0.8927 | 0.7790 | 0.9632 | | 0.1674 | 3.0 | 735 | 0.1148 | 0.7174 | 0.9119 | 0.8030 | 0.9677 | | 0.1674 | 4.0 | 980 | 0.1550 | 0.7209 | 0.8979 | 0.7997 | 0.9658 | | 0.0276 | 5.0 | 1225 | 0.1441 | 0.7183 | 0.9173 | 0.8057 | 0.9682 | | 0.0276 | 6.0 | 1470 | 0.1482 | 0.7326 | 0.8752 | 0.7976 | 0.9665 | | 0.0154 | 7.0 | 1715 | 0.1209 | 0.7418 | 0.9284 | 0.8247 | 0.9710 | | 0.0154 | 8.0 | 1960 | 0.1266 | 0.7375 | 0.9243 | 0.8204 | 0.9708 | | 0.0096 | 9.0 | 2205 | 0.1394 | 0.7356 | 0.9147 | 0.8154 | 0.9690 | | 0.0096 | 10.0 | 2450 | 0.1294 | 0.7397 | 0.9212 | 0.8205 | 0.9703 | ### Framework versions - Transformers 4.41.1 - Pytorch 2.1.2 - Datasets 2.19.1 - Tokenizers 0.19.1
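A minimal inference sketch, assuming the standard Transformers pipeline API; the Portuguese example sentence is invented (LeNER-Br is a Brazilian legal-domain NER corpus):

```python
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="GuiTap/xlm-roberta-base-finetuned-ner-lenerBr",
    aggregation_strategy="simple",  # merge word pieces back into whole entities
)
print(ner("A decisão foi proferida pelo Supremo Tribunal Federal."))
```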
ehottl/distilbert-base-uncased-distilled-clinc
ehottl
2024-06-04T02:56:15Z
111
0
transformers
[ "transformers", "safetensors", "distilbert", "text-classification", "generated_from_trainer", "base_model:distilbert/distilbert-base-uncased", "base_model:finetune:distilbert/distilbert-base-uncased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2024-06-04T02:54:19Z
--- license: apache-2.0 base_model: distilbert-base-uncased tags: - generated_from_trainer metrics: - accuracy model-index: - name: distilbert-base-uncased-distilled-clinc results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-distilled-clinc This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.2792 - Accuracy: 0.9439 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 48 - eval_batch_size: 48 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 9 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 2.2695 | 1.0 | 318 | 1.6200 | 0.7197 | | 1.264 | 2.0 | 636 | 0.8322 | 0.8616 | | 0.6826 | 3.0 | 954 | 0.4907 | 0.9077 | | 0.4228 | 4.0 | 1272 | 0.3628 | 0.9326 | | 0.3128 | 5.0 | 1590 | 0.3137 | 0.9413 | | 0.2644 | 6.0 | 1908 | 0.2946 | 0.9439 | | 0.2424 | 7.0 | 2226 | 0.2846 | 0.9439 | | 0.2299 | 8.0 | 2544 | 0.2806 | 0.9439 | | 0.2253 | 9.0 | 2862 | 0.2792 | 0.9439 | ### Framework versions - Transformers 4.39.3 - Pytorch 2.1.2.post303 - Datasets 2.19.1 - Tokenizers 0.15.2
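A minimal inference sketch, assuming the standard Transformers pipeline API; the query is an invented CLINC-style intent example:

```python
from transformers import pipeline

clf = pipeline("text-classification", model="ehottl/distilbert-base-uncased-distilled-clinc")
print(clf("how would i get to the airport from my hotel"))  # returns the predicted intent label
```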
Zery/MV-LLaVA-7B
Zery
2024-06-04T02:55:57Z
21
3
transformers
[ "transformers", "pytorch", "share4v", "text-generation", "image-text-to-text", "en", "dataset:Zery/BS-Objaverse", "dataset:Lin-Chen/ShareGPT4V", "arxiv:2406.00093", "license:apache-2.0", "autotrain_compatible", "region:us" ]
image-text-to-text
2024-05-13T07:18:35Z
--- inference: false pipeline_tag: image-text-to-text license: apache-2.0 datasets: - Zery/BS-Objaverse - Lin-Chen/ShareGPT4V language: - en --- <br> <br> # MV-LLaVA-7B Model Card ## Model details **Model type:** MV-LLaVA-7B is an open-source chatbot for 3D multi-view images trained by fine-tuning the CLIP vision tower and LLaMA/Vicuna on GPT4-Vision-assisted [BS-Objaverse](https://huggingface.co/datasets/Zery/BS-Objaverse) data and [ShareGPT4V](https://huggingface.co/datasets/Lin-Chen/ShareGPT4V) data. **Model date:** MV-LLaVA-7B was trained in April 2024. **Paper or resources for more information:** [[Project](https://sunzey.github.io/Bootstrap3D/)] [[Paper](https://huggingface.co/papers/2406.00093)] [[Code](https://github.com/SunzeY/Bootstrap3D)] ## Usage You can use this model directly, as shown in our [[repository](https://github.com/SunzeY/Bootstrap3D/tree/main/MV_LLaVA)]. ## License Llama 2 is licensed under the LLAMA 2 Community License, Copyright (c) Meta Platforms, Inc. All Rights Reserved. ## Intended use **Primary intended uses:** The primary use of MV-LLaVA-7B is research on large multimodal models and chatbots for 3D content. **Primary intended users:** The primary intended users of the model are researchers and hobbyists in computer vision, natural language processing, machine learning, and artificial intelligence. ## Training dataset - 1.2M ShareGPT4V-PT data - 30K GPT4-Vision-generated multi-view image-text pairs - LLaVA instruction-tuning data
apwic/nerui-lora-r8-4
apwic
2024-06-04T02:39:58Z
0
0
null
[ "tensorboard", "generated_from_trainer", "id", "base_model:indolem/indobert-base-uncased", "base_model:finetune:indolem/indobert-base-uncased", "license:mit", "region:us" ]
null
2024-05-28T14:35:15Z
--- language: - id license: mit base_model: indolem/indobert-base-uncased tags: - generated_from_trainer model-index: - name: nerui-lora-r8-4 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # nerui-lora-r8-4 This model is a fine-tuned version of [indolem/indobert-base-uncased](https://huggingface.co/indolem/indobert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0437 - Location Precision: 0.8739 - Location Recall: 0.9417 - Location F1: 0.9065 - Location Number: 103 - Organization Precision: 0.9152 - Organization Recall: 0.8830 - Organization F1: 0.8988 - Organization Number: 171 - Person Precision: 0.9695 - Person Recall: 0.9695 - Person F1: 0.9695 - Person Number: 131 - Overall Precision: 0.9214 - Overall Recall: 0.9259 - Overall F1: 0.9236 - Overall Accuracy: 0.9870 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 100.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy | |:-------------:|:-----:|:----:|:---------------:|:------------------:|:---------------:|:-----------:|:---------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:----------------:|:-------------:|:---------:|:-------------:|:-----------------:|:--------------:|:----------:|:----------------:| | 1.1566 | 1.0 | 96 | 0.6952 | 0.0 | 0.0 | 0.0 | 103 | 0.0 | 0.0 | 0.0 | 171 | 0.0 | 0.0 | 0.0 | 131 | 0.0 | 0.0 | 0.0 | 0.8373 | | 0.6676 | 2.0 | 192 | 0.5653 | 0.0 | 0.0 | 0.0 | 103 | 0.0 | 0.0 | 0.0 | 171 | 0.0 | 0.0 | 0.0 | 131 | 0.0 | 0.0 | 0.0 | 0.8376 | | 0.5559 | 3.0 | 288 | 0.4487 | 0.0 | 0.0 | 0.0 | 103 | 0.375 | 0.0351 | 0.0642 | 171 | 0.2188 | 0.0534 | 0.0859 | 131 | 0.26 | 0.0321 | 0.0571 | 0.8456 | | 0.4455 | 4.0 | 384 | 0.3389 | 0.2083 | 0.0485 | 0.0787 | 103 | 0.3509 | 0.2339 | 0.2807 | 171 | 0.3816 | 0.4427 | 0.4099 | 131 | 0.3552 | 0.2543 | 0.2964 | 0.8818 | | 0.3416 | 5.0 | 480 | 0.2583 | 0.3971 | 0.2621 | 0.3158 | 103 | 0.4923 | 0.5614 | 0.5246 | 171 | 0.5176 | 0.6718 | 0.5847 | 131 | 0.4873 | 0.5210 | 0.5036 | 0.9207 | | 0.2637 | 6.0 | 576 | 0.2006 | 0.6316 | 0.5825 | 0.6061 | 103 | 0.6490 | 0.7895 | 0.7124 | 171 | 0.7124 | 0.8321 | 0.7676 | 131 | 0.6667 | 0.7506 | 0.7062 | 0.9489 | | 0.2115 | 7.0 | 672 | 0.1649 | 0.7273 | 0.6990 | 0.7129 | 103 | 0.6946 | 0.8246 | 0.7540 | 171 | 0.8542 | 0.9389 | 0.8945 | 131 | 0.7534 | 0.8296 | 0.7897 | 0.9586 | | 0.1785 | 8.0 | 768 | 0.1343 | 0.8316 | 0.7670 | 0.7980 | 103 | 0.7461 | 0.8421 | 0.7912 | 171 | 0.9 | 0.9618 | 0.9299 | 131 | 0.8154 | 0.8617 | 0.8379 | 0.9652 | | 0.1541 | 9.0 | 864 | 0.1175 | 0.8384 | 0.8058 | 0.8218 | 103 | 0.7737 | 0.8596 | 0.8144 | 171 | 0.8936 | 0.9618 | 0.9265 | 131 | 0.8279 | 0.8790 | 0.8527 | 
0.9682 | | 0.1387 | 10.0 | 960 | 0.1095 | 0.8235 | 0.8155 | 0.8195 | 103 | 0.7853 | 0.8772 | 0.8287 | 171 | 0.8944 | 0.9695 | 0.9304 | 131 | 0.8299 | 0.8914 | 0.8595 | 0.9696 | | 0.1275 | 11.0 | 1056 | 0.0995 | 0.85 | 0.8252 | 0.8374 | 103 | 0.7937 | 0.8772 | 0.8333 | 171 | 0.9 | 0.9618 | 0.9299 | 131 | 0.8415 | 0.8914 | 0.8657 | 0.9710 | | 0.1212 | 12.0 | 1152 | 0.0935 | 0.8641 | 0.8641 | 0.8641 | 103 | 0.7917 | 0.8889 | 0.8375 | 171 | 0.9 | 0.9618 | 0.9299 | 131 | 0.8437 | 0.9062 | 0.8738 | 0.9724 | | 0.1164 | 13.0 | 1248 | 0.0875 | 0.8627 | 0.8544 | 0.8585 | 103 | 0.8010 | 0.8947 | 0.8453 | 171 | 0.9 | 0.9618 | 0.9299 | 131 | 0.8476 | 0.9062 | 0.8759 | 0.9724 | | 0.1105 | 14.0 | 1344 | 0.0820 | 0.8922 | 0.8835 | 0.8878 | 103 | 0.8466 | 0.8713 | 0.8588 | 171 | 0.9265 | 0.9618 | 0.9438 | 131 | 0.8841 | 0.9037 | 0.8938 | 0.9768 | | 0.1063 | 15.0 | 1440 | 0.0793 | 0.9175 | 0.8641 | 0.89 | 103 | 0.7908 | 0.9064 | 0.8447 | 171 | 0.9065 | 0.9618 | 0.9333 | 131 | 0.8565 | 0.9136 | 0.8841 | 0.9751 | | 0.1018 | 16.0 | 1536 | 0.0783 | 0.8762 | 0.8932 | 0.8846 | 103 | 0.8010 | 0.9181 | 0.8556 | 171 | 0.9065 | 0.9618 | 0.9333 | 131 | 0.8523 | 0.9259 | 0.8876 | 0.9749 | | 0.0986 | 17.0 | 1632 | 0.0725 | 0.9109 | 0.8932 | 0.9020 | 103 | 0.8280 | 0.9006 | 0.8627 | 171 | 0.9407 | 0.9695 | 0.9549 | 131 | 0.8839 | 0.9210 | 0.9021 | 0.9779 | | 0.093 | 18.0 | 1728 | 0.0693 | 0.9010 | 0.8835 | 0.8922 | 103 | 0.8432 | 0.9123 | 0.8764 | 171 | 0.9333 | 0.9618 | 0.9474 | 131 | 0.8860 | 0.9210 | 0.9031 | 0.9779 | | 0.0897 | 19.0 | 1824 | 0.0699 | 0.8762 | 0.8932 | 0.8846 | 103 | 0.8470 | 0.9064 | 0.8757 | 171 | 0.9270 | 0.9695 | 0.9478 | 131 | 0.88 | 0.9235 | 0.9012 | 0.9782 | | 0.0876 | 20.0 | 1920 | 0.0679 | 0.8846 | 0.8932 | 0.8889 | 103 | 0.8201 | 0.9064 | 0.8611 | 171 | 0.9065 | 0.9618 | 0.9333 | 131 | 0.8634 | 0.9210 | 0.8913 | 0.9765 | | 0.0846 | 21.0 | 2016 | 0.0654 | 0.8679 | 0.8932 | 0.8804 | 103 | 0.8378 | 0.9064 | 0.8708 | 171 | 0.9197 | 0.9618 | 0.9403 | 131 | 0.8715 | 0.9210 | 0.8956 | 0.9785 | | 0.0843 | 22.0 | 2112 | 0.0664 | 0.8932 | 0.8932 | 0.8932 | 103 | 0.8325 | 0.9298 | 0.8785 | 171 | 0.9197 | 0.9618 | 0.9403 | 131 | 0.8747 | 0.9309 | 0.9019 | 0.9787 | | 0.0823 | 23.0 | 2208 | 0.0611 | 0.8679 | 0.8932 | 0.8804 | 103 | 0.8492 | 0.8889 | 0.8686 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.8897 | 0.9160 | 0.9027 | 0.9801 | | 0.0808 | 24.0 | 2304 | 0.0627 | 0.8505 | 0.8835 | 0.8667 | 103 | 0.8415 | 0.9006 | 0.8701 | 171 | 0.9549 | 0.9695 | 0.9621 | 131 | 0.8794 | 0.9185 | 0.8986 | 0.9798 | | 0.0809 | 25.0 | 2400 | 0.0598 | 0.875 | 0.8835 | 0.8792 | 103 | 0.8424 | 0.9064 | 0.8732 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.8836 | 0.9185 | 0.9007 | 0.9807 | | 0.078 | 26.0 | 2496 | 0.0581 | 0.9010 | 0.8835 | 0.8922 | 103 | 0.8441 | 0.9181 | 0.8796 | 171 | 0.9403 | 0.9618 | 0.9509 | 131 | 0.8884 | 0.9235 | 0.9056 | 0.9818 | | 0.0774 | 27.0 | 2592 | 0.0582 | 0.92 | 0.8932 | 0.9064 | 103 | 0.8503 | 0.9298 | 0.8883 | 171 | 0.9403 | 0.9618 | 0.9509 | 131 | 0.8955 | 0.9309 | 0.9128 | 0.9812 | | 0.0732 | 28.0 | 2688 | 0.0623 | 0.9020 | 0.8932 | 0.8976 | 103 | 0.8659 | 0.9064 | 0.8857 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.9010 | 0.9210 | 0.9109 | 0.9815 | | 0.0746 | 29.0 | 2784 | 0.0553 | 0.9109 | 0.8932 | 0.9020 | 103 | 0.8827 | 0.9240 | 0.9029 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9150 | 0.9309 | 0.9229 | 0.9829 | | 0.0695 | 30.0 | 2880 | 0.0536 | 0.9109 | 0.8932 | 0.9020 | 103 | 0.8827 | 0.9240 | 0.9029 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.9104 | 0.9284 | 0.9193 | 0.9832 | | 0.0691 | 31.0 | 
2976 | 0.0533 | 0.8762 | 0.8932 | 0.8846 | 103 | 0.8807 | 0.9064 | 0.8934 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.9010 | 0.9210 | 0.9109 | 0.9826 | | 0.0665 | 32.0 | 3072 | 0.0518 | 0.8835 | 0.8835 | 0.8835 | 103 | 0.8652 | 0.9006 | 0.8825 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.8961 | 0.9160 | 0.9060 | 0.9823 | | 0.0649 | 33.0 | 3168 | 0.0527 | 0.8288 | 0.8932 | 0.8598 | 103 | 0.9018 | 0.8596 | 0.8802 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.8968 | 0.9012 | 0.8990 | 0.9809 | | 0.0645 | 34.0 | 3264 | 0.0506 | 0.8835 | 0.8835 | 0.8835 | 103 | 0.8966 | 0.9123 | 0.9043 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.9098 | 0.9210 | 0.9153 | 0.9843 | | 0.063 | 35.0 | 3360 | 0.0515 | 0.8505 | 0.8835 | 0.8667 | 103 | 0.8889 | 0.8889 | 0.8889 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.8978 | 0.9111 | 0.9044 | 0.9826 | | 0.0637 | 36.0 | 3456 | 0.0508 | 0.8505 | 0.8835 | 0.8667 | 103 | 0.8830 | 0.8830 | 0.8830 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.8954 | 0.9086 | 0.9020 | 0.9818 | | 0.0614 | 37.0 | 3552 | 0.0495 | 0.9010 | 0.8835 | 0.8922 | 103 | 0.8729 | 0.9240 | 0.8977 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.9036 | 0.9259 | 0.9146 | 0.9829 | | 0.0599 | 38.0 | 3648 | 0.0495 | 0.8585 | 0.8835 | 0.8708 | 103 | 0.8982 | 0.8772 | 0.8876 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.9039 | 0.9062 | 0.9051 | 0.9820 | | 0.06 | 39.0 | 3744 | 0.0495 | 0.8519 | 0.8932 | 0.8720 | 103 | 0.8728 | 0.8830 | 0.8779 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.8981 | 0.9136 | 0.9058 | 0.9820 | | 0.0576 | 40.0 | 3840 | 0.0480 | 0.8667 | 0.8835 | 0.8750 | 103 | 0.8994 | 0.8889 | 0.8941 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9113 | 0.9136 | 0.9125 | 0.9837 | | 0.0597 | 41.0 | 3936 | 0.0485 | 0.8679 | 0.8932 | 0.8804 | 103 | 0.875 | 0.9006 | 0.8876 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9010 | 0.9210 | 0.9109 | 0.9829 | | 0.0581 | 42.0 | 4032 | 0.0473 | 0.8598 | 0.8932 | 0.8762 | 103 | 0.8736 | 0.8889 | 0.8812 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.8983 | 0.9160 | 0.9071 | 0.9829 | | 0.0597 | 43.0 | 4128 | 0.0479 | 0.8679 | 0.8932 | 0.8804 | 103 | 0.8736 | 0.8889 | 0.8812 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9027 | 0.9160 | 0.9093 | 0.9826 | | 0.0568 | 44.0 | 4224 | 0.0481 | 0.8519 | 0.8932 | 0.8720 | 103 | 0.8982 | 0.8772 | 0.8876 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9066 | 0.9111 | 0.9089 | 0.9826 | | 0.0561 | 45.0 | 4320 | 0.0470 | 0.8519 | 0.8932 | 0.8720 | 103 | 0.8876 | 0.8772 | 0.8824 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9044 | 0.9111 | 0.9077 | 0.9834 | | 0.0552 | 46.0 | 4416 | 0.0478 | 0.8519 | 0.8932 | 0.8720 | 103 | 0.9036 | 0.8772 | 0.8902 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9111 | 0.9111 | 0.9111 | 0.9837 | | 0.0562 | 47.0 | 4512 | 0.0461 | 0.8762 | 0.8932 | 0.8846 | 103 | 0.8644 | 0.8947 | 0.8793 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9007 | 0.9185 | 0.9095 | 0.9840 | | 0.0533 | 48.0 | 4608 | 0.0474 | 0.8545 | 0.9126 | 0.8826 | 103 | 0.9085 | 0.8713 | 0.8896 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9136 | 0.9136 | 0.9136 | 0.9837 | | 0.0522 | 49.0 | 4704 | 0.0461 | 0.8468 | 0.9126 | 0.8785 | 103 | 0.8772 | 0.8772 | 0.8772 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.8983 | 0.9160 | 0.9071 | 0.9843 | | 0.052 | 50.0 | 4800 | 0.0464 | 0.8559 | 0.9223 | 0.8879 | 103 | 0.8793 | 0.8947 | 0.8870 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9014 | 0.9259 | 0.9135 | 0.9840 | | 0.054 | 51.0 | 4896 | 0.0467 | 0.8571 | 0.9320 | 0.8930 | 103 | 0.9030 | 0.8713 | 0.8869 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9118 | 0.9185 | 0.9151 | 0.9854 | | 0.0525 | 52.0 | 4992 | 
0.0460 | 0.8829 | 0.9515 | 0.9159 | 103 | 0.8671 | 0.8772 | 0.8721 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9036 | 0.9259 | 0.9146 | 0.9840 | | 0.0501 | 53.0 | 5088 | 0.0466 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.8922 | 0.8713 | 0.8817 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9120 | 0.9210 | 0.9165 | 0.9848 | | 0.0498 | 54.0 | 5184 | 0.0449 | 0.8716 | 0.9223 | 0.8962 | 103 | 0.8779 | 0.8830 | 0.8805 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9053 | 0.9210 | 0.9131 | 0.9840 | | 0.0504 | 55.0 | 5280 | 0.0456 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.8982 | 0.8772 | 0.8876 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9144 | 0.9235 | 0.9189 | 0.9854 | | 0.0486 | 56.0 | 5376 | 0.0453 | 0.8727 | 0.9320 | 0.9014 | 103 | 0.8736 | 0.8889 | 0.8812 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9036 | 0.9259 | 0.9146 | 0.9845 | | 0.0497 | 57.0 | 5472 | 0.0457 | 0.8509 | 0.9417 | 0.8940 | 103 | 0.8982 | 0.8772 | 0.8876 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9078 | 0.9235 | 0.9155 | 0.9845 | | 0.0487 | 58.0 | 5568 | 0.0460 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9036 | 0.8772 | 0.8902 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9167 | 0.9235 | 0.9200 | 0.9854 | | 0.0473 | 59.0 | 5664 | 0.0456 | 0.8559 | 0.9223 | 0.8879 | 103 | 0.8922 | 0.8713 | 0.8817 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9071 | 0.9160 | 0.9115 | 0.9851 | | 0.0463 | 60.0 | 5760 | 0.0454 | 0.8559 | 0.9223 | 0.8879 | 103 | 0.8970 | 0.8655 | 0.8810 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9091 | 0.9136 | 0.9113 | 0.9854 | | 0.0486 | 61.0 | 5856 | 0.0456 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.8970 | 0.8655 | 0.8810 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9140 | 0.9185 | 0.9163 | 0.9856 | | 0.0484 | 62.0 | 5952 | 0.0465 | 0.8571 | 0.9320 | 0.8930 | 103 | 0.8970 | 0.8655 | 0.8810 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9093 | 0.9160 | 0.9127 | 0.9851 | | 0.0461 | 63.0 | 6048 | 0.0451 | 0.875 | 0.9515 | 0.9116 | 103 | 0.8988 | 0.8830 | 0.8909 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9148 | 0.9284 | 0.9216 | 0.9856 | | 0.0455 | 64.0 | 6144 | 0.0451 | 0.8727 | 0.9320 | 0.9014 | 103 | 0.8976 | 0.8713 | 0.8843 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9140 | 0.9185 | 0.9163 | 0.9854 | | 0.0472 | 65.0 | 6240 | 0.0453 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9030 | 0.8713 | 0.8869 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9165 | 0.9210 | 0.9187 | 0.9859 | | 0.0453 | 66.0 | 6336 | 0.0451 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9146 | 0.8772 | 0.8955 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9212 | 0.9235 | 0.9223 | 0.9865 | | 0.045 | 67.0 | 6432 | 0.0450 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9085 | 0.8713 | 0.8896 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9187 | 0.9210 | 0.9199 | 0.9856 | | 0.0466 | 68.0 | 6528 | 0.0440 | 0.8909 | 0.9515 | 0.9202 | 103 | 0.9036 | 0.8772 | 0.8902 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9214 | 0.9259 | 0.9236 | 0.9867 | | 0.046 | 69.0 | 6624 | 0.0446 | 0.8649 | 0.9320 | 0.8972 | 103 | 0.9085 | 0.8713 | 0.8896 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9163 | 0.9185 | 0.9174 | 0.9854 | | 0.0436 | 70.0 | 6720 | 0.0440 | 0.8649 | 0.9320 | 0.8972 | 103 | 0.9091 | 0.8772 | 0.8929 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9165 | 0.9210 | 0.9187 | 0.9862 | | 0.0445 | 71.0 | 6816 | 0.0446 | 0.8673 | 0.9515 | 0.9074 | 103 | 0.8988 | 0.8830 | 0.8909 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9126 | 0.9284 | 0.9204 | 0.9862 | | 0.0437 | 72.0 | 6912 | 0.0459 | 0.8727 | 0.9320 | 0.9014 | 103 | 0.9024 | 0.8655 | 0.8836 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9160 | 0.9160 | 0.9160 | 0.9854 | | 0.0434 | 73.0 | 7008 | 0.0444 | 
0.875 | 0.9515 | 0.9116 | 103 | 0.9091 | 0.8772 | 0.8929 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9191 | 0.9259 | 0.9225 | 0.9862 | | 0.0441 | 74.0 | 7104 | 0.0445 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9091 | 0.8772 | 0.8929 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9189 | 0.9235 | 0.9212 | 0.9862 | | 0.0439 | 75.0 | 7200 | 0.0446 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9091 | 0.8772 | 0.8929 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9189 | 0.9235 | 0.9212 | 0.9862 | | 0.042 | 76.0 | 7296 | 0.0447 | 0.8661 | 0.9417 | 0.9023 | 103 | 0.8982 | 0.8772 | 0.8876 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9122 | 0.9235 | 0.9178 | 0.9859 | | 0.0428 | 77.0 | 7392 | 0.0449 | 0.8649 | 0.9320 | 0.8972 | 103 | 0.9085 | 0.8713 | 0.8896 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9163 | 0.9185 | 0.9174 | 0.9865 | | 0.0435 | 78.0 | 7488 | 0.0444 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9091 | 0.8772 | 0.8929 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9189 | 0.9235 | 0.9212 | 0.9867 | | 0.0416 | 79.0 | 7584 | 0.0439 | 0.8661 | 0.9417 | 0.9023 | 103 | 0.9102 | 0.8889 | 0.8994 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9171 | 0.9284 | 0.9227 | 0.9862 | | 0.0414 | 80.0 | 7680 | 0.0436 | 0.8727 | 0.9320 | 0.9014 | 103 | 0.9096 | 0.8830 | 0.8961 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9189 | 0.9235 | 0.9212 | 0.9867 | | 0.043 | 81.0 | 7776 | 0.0437 | 0.8727 | 0.9320 | 0.9014 | 103 | 0.9152 | 0.8830 | 0.8988 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9212 | 0.9235 | 0.9223 | 0.9870 | | 0.0433 | 82.0 | 7872 | 0.0434 | 0.8818 | 0.9417 | 0.9108 | 103 | 0.9157 | 0.8889 | 0.9021 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9238 | 0.9284 | 0.9261 | 0.9873 | | 0.0428 | 83.0 | 7968 | 0.0439 | 0.8661 | 0.9417 | 0.9023 | 103 | 0.9212 | 0.8889 | 0.9048 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9216 | 0.9284 | 0.9250 | 0.9867 | | 0.0418 | 84.0 | 8064 | 0.0435 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9157 | 0.8889 | 0.9021 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9216 | 0.9284 | 0.9250 | 0.9867 | | 0.0416 | 85.0 | 8160 | 0.0435 | 0.8727 | 0.9320 | 0.9014 | 103 | 0.9152 | 0.8830 | 0.8988 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9212 | 0.9235 | 0.9223 | 0.9870 | | 0.0413 | 86.0 | 8256 | 0.0439 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9212 | 0.8889 | 0.9048 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9238 | 0.9284 | 0.9261 | 0.9873 | | 0.0423 | 87.0 | 8352 | 0.0440 | 0.8727 | 0.9320 | 0.9014 | 103 | 0.9085 | 0.8713 | 0.8896 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9185 | 0.9185 | 0.9185 | 0.9865 | | 0.0409 | 88.0 | 8448 | 0.0439 | 0.8818 | 0.9417 | 0.9108 | 103 | 0.9146 | 0.8772 | 0.8955 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9235 | 0.9235 | 0.9235 | 0.9870 | | 0.0419 | 89.0 | 8544 | 0.0437 | 0.8661 | 0.9417 | 0.9023 | 103 | 0.9212 | 0.8889 | 0.9048 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9216 | 0.9284 | 0.9250 | 0.9870 | | 0.0424 | 90.0 | 8640 | 0.0438 | 0.8661 | 0.9417 | 0.9023 | 103 | 0.9152 | 0.8830 | 0.8988 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9191 | 0.9259 | 0.9225 | 0.9867 | | 0.0419 | 91.0 | 8736 | 0.0439 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9146 | 0.8772 | 0.8955 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9212 | 0.9235 | 0.9223 | 0.9867 | | 0.0427 | 92.0 | 8832 | 0.0443 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9146 | 0.8772 | 0.8955 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9212 | 0.9235 | 0.9223 | 0.9867 | | 0.0397 | 93.0 | 8928 | 0.0438 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9146 | 0.8772 | 0.8955 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9212 | 0.9235 | 0.9223 | 0.9867 | | 0.0414 | 94.0 | 9024 | 0.0437 | 0.8739 | 
0.9417 | 0.9065 | 103 | 0.9152 | 0.8830 | 0.8988 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9214 | 0.9259 | 0.9236 | 0.9870 | | 0.0401 | 95.0 | 9120 | 0.0438 | 0.8818 | 0.9417 | 0.9108 | 103 | 0.9152 | 0.8830 | 0.8988 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9236 | 0.9259 | 0.9248 | 0.9873 | | 0.0415 | 96.0 | 9216 | 0.0439 | 0.8727 | 0.9320 | 0.9014 | 103 | 0.9146 | 0.8772 | 0.8955 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9210 | 0.9210 | 0.9210 | 0.9867 | | 0.0404 | 97.0 | 9312 | 0.0437 | 0.8818 | 0.9417 | 0.9108 | 103 | 0.9152 | 0.8830 | 0.8988 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9236 | 0.9259 | 0.9248 | 0.9873 | | 0.0418 | 98.0 | 9408 | 0.0438 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9152 | 0.8830 | 0.8988 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9214 | 0.9259 | 0.9236 | 0.9870 | | 0.0388 | 99.0 | 9504 | 0.0437 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9152 | 0.8830 | 0.8988 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9214 | 0.9259 | 0.9236 | 0.9870 | | 0.0397 | 100.0 | 9600 | 0.0437 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9152 | 0.8830 | 0.8988 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9214 | 0.9259 | 0.9236 | 0.9870 | ### Framework versions - Transformers 4.39.3 - Pytorch 2.3.0+cu121 - Datasets 2.19.1 - Tokenizers 0.15.2
ahmedesmail16/Paper_compared-beit-base
ahmedesmail16
2024-06-04T02:36:12Z
211
0
transformers
[ "transformers", "tensorboard", "safetensors", "beit", "image-classification", "generated_from_trainer", "base_model:microsoft/beit-base-patch16-224-pt22k-ft22k", "base_model:finetune:microsoft/beit-base-patch16-224-pt22k-ft22k", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
image-classification
2024-06-04T00:17:01Z
--- license: apache-2.0 base_model: microsoft/beit-base-patch16-224-pt22k-ft22k tags: - generated_from_trainer metrics: - accuracy model-index: - name: Paper_compared-beit-base results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Paper_compared-beit-base This model is a fine-tuned version of [microsoft/beit-base-patch16-224-pt22k-ft22k](https://huggingface.co/microsoft/beit-base-patch16-224-pt22k-ft22k) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.5363 - Accuracy: 0.8409 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:------:|:----:|:---------------:|:--------:| | 1.6803 | 0.9492 | 14 | 0.9171 | 0.7156 | | 0.8219 | 1.9661 | 29 | 0.5230 | 0.8330 | | 0.2323 | 2.9831 | 44 | 0.5110 | 0.8047 | | 0.1112 | 4.0 | 59 | 0.4968 | 0.8138 | | 0.0387 | 4.9492 | 73 | 0.5502 | 0.8093 | | 0.0232 | 5.9661 | 88 | 0.5506 | 0.8296 | | 0.0096 | 6.9831 | 103 | 0.5341 | 0.8431 | | 0.0068 | 8.0 | 118 | 0.6003 | 0.8149 | | 0.0046 | 8.9492 | 132 | 0.5298 | 0.8409 | | 0.0051 | 9.4915 | 140 | 0.5363 | 0.8409 | ### Framework versions - Transformers 4.41.1 - Pytorch 2.1.2 - Datasets 2.19.2 - Tokenizers 0.19.1
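The card above stops short of showing inference. A minimal sketch of querying this checkpoint, assuming the repo is public and using the standard `transformers` pipeline API; `example.jpg` is a placeholder path, and the label names depend on the (undocumented) training dataset:

```python
from transformers import pipeline

# Load the fine-tuned BEiT classifier straight from the Hub
classifier = pipeline("image-classification", model="ahmedesmail16/Paper_compared-beit-base")

# "example.jpg" is a placeholder; pass any local image path or URL
for pred in classifier("example.jpg"):
    print(f"{pred['label']}: {pred['score']:.4f}")
```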
ehottl/distilbert-base-uncased-finetuned-clinc
ehottl
2024-06-04T02:36:06Z
113
0
transformers
[ "transformers", "safetensors", "distilbert", "text-classification", "generated_from_trainer", "base_model:distilbert/distilbert-base-uncased", "base_model:finetune:distilbert/distilbert-base-uncased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2024-06-04T02:24:15Z
--- license: apache-2.0 base_model: distilbert-base-uncased tags: - generated_from_trainer metrics: - accuracy model-index: - name: distilbert-base-uncased-finetuned-clinc results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-clinc This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.8020 - Accuracy: 0.9158 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 48 - eval_batch_size: 48 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 4.3069 | 1.0 | 318 | 3.3020 | 0.7177 | | 2.6569 | 2.0 | 636 | 1.9007 | 0.8468 | | 1.5836 | 3.0 | 954 | 1.1867 | 0.8881 | | 1.0474 | 4.0 | 1272 | 0.8876 | 0.9116 | | 0.8287 | 5.0 | 1590 | 0.8020 | 0.9158 | ### Framework versions - Transformers 4.39.3 - Pytorch 2.1.2.post303 - Datasets 2.19.1 - Tokenizers 0.15.2
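No usage snippet is included in the card; a minimal inference sketch, assuming the checkpoint is public and that the label ids were mapped to CLINC intent names during fine-tuning (the exact mapping is not documented above):

```python
from transformers import pipeline

# Load the fine-tuned intent classifier from the Hub
classifier = pipeline("text-classification", model="ehottl/distilbert-base-uncased-finetuned-clinc")

# Any short user utterance works; the predicted label is a CLINC intent
print(classifier("How do I transfer money to my savings account?"))
```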
flammenai/Mahou-1.3a-mistral-7B
flammenai
2024-06-04T02:35:32Z
8
1
transformers
[ "transformers", "safetensors", "mistral", "text-generation", "conversational", "dataset:flammenai/MahouMix-v1", "base_model:nbeerbower/Mahou-1.3-M1-mistral-7B", "base_model:finetune:nbeerbower/Mahou-1.3-M1-mistral-7B", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-06-02T02:47:17Z
--- library_name: transformers license: apache-2.0 base_model: - nbeerbower/Mahou-1.3-M1-mistral-7B datasets: - flammenai/MahouMix-v1 --- ![image/png](https://huggingface.co/flammenai/Mahou-1.0-mistral-7B/resolve/main/mahou1.png) # Mahou-1.3a-mistral-7B Mahou is designed to provide short messages in a conversational context. It is capable of casual conversation and character roleplay. ### Chat Format This model has been trained to use the ChatML format. Note the additional tokens in [tokenizer_config.json](tokenizer_config.json). ``` <|im_start|>system {{system}}<|im_end|> <|im_start|>{{char}} {{message}}<|im_end|> <|im_start|>{{user}} {{message}}<|im_end|> ``` ### Roleplay Format - Speech without quotes. - Actions in `*asterisks*` ``` *leans against wall coolly* so like, i just casted a super strong spell at magician academy today, not gonna lie, felt badass. ``` ### SillyTavern Settings 1. Use ChatML for the Context Template. 2. Enable Instruct Mode. 3. Use the [Mahou preset](https://huggingface.co/datasets/flammenai/Mahou-ST-ChatML-Instruct/raw/main/Mahou.json). 4. *Recommended:* Additional stopping strings: `["\n", "<|", "</"]` ### Method DPO-finetuned for 6 epochs using an A100 on Google Colab. [Fine-tune a Mistral-7b model with Direct Preference Optimization](https://towardsdatascience.com/fine-tune-a-mistral-7b-model-with-direct-preference-optimization-708042745aac) - [Maxime Labonne](https://huggingface.co/mlabonne)
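Because the ChatML markers above are baked into the tokenizer's chat template (per the tokenizer_config.json note), prompts can be assembled programmatically rather than by hand; a minimal sketch, assuming the repo ships that template and using placeholder system/user turns — in SillyTavern the role names would become `{{char}}` and `{{user}}`:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("flammenai/Mahou-1.3a-mistral-7B")

# Placeholder turns; real roleplay would supply a character card as the system message
chat = [
    {"role": "system", "content": "You are Mahou, a playful roleplay companion."},
    {"role": "user", "content": "*waves* hey, how was magician academy today?"},
]

# Renders the <|im_start|>/<|im_end|> layout shown above and appends
# the opening of the assistant's next message
prompt = tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True)
print(prompt)
```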
warwavn/vit-base-patch16-224-in21k-finetuned-lora-food101
warwavn
2024-06-04T02:35:22Z
0
0
transformers
[ "transformers", "tensorboard", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-06-04T02:29:39Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf
RichardErkhov
2024-06-04T02:30:40Z
157
0
null
[ "gguf", "arxiv:2405.04324", "region:us" ]
null
2024-06-04T01:24:21Z
Quantization made by Richard Erkhov. [Github](https://github.com/RichardErkhov) [Discord](https://discord.gg/pvy7H8DZMG) [Request more models](https://github.com/RichardErkhov/quant_request) granite-20b-code-instruct - GGUF - Model creator: https://huggingface.co/ibm-granite/ - Original model: https://huggingface.co/ibm-granite/granite-20b-code-instruct/ | Name | Quant method | Size | | ---- | ---- | ---- | | [granite-20b-code-instruct.Q2_K.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.Q2_K.gguf) | Q2_K | 7.38GB | | [granite-20b-code-instruct.IQ3_XS.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.IQ3_XS.gguf) | IQ3_XS | 8.06GB | | [granite-20b-code-instruct.IQ3_S.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.IQ3_S.gguf) | IQ3_S | 0.79GB | | [granite-20b-code-instruct.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.Q3_K_S.gguf) | Q3_K_S | 0.56GB | | [granite-20b-code-instruct.IQ3_M.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.IQ3_M.gguf) | IQ3_M | 0.06GB | | [granite-20b-code-instruct.Q3_K.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.Q3_K.gguf) | Q3_K | 0.04GB | | [granite-20b-code-instruct.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.Q3_K_M.gguf) | Q3_K_M | 0.0GB | | [granite-20b-code-instruct.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.Q3_K_L.gguf) | Q3_K_L | 0.0GB | | [granite-20b-code-instruct.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.IQ4_XS.gguf) | IQ4_XS | 0.0GB | | [granite-20b-code-instruct.Q4_0.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.Q4_0.gguf) | Q4_0 | 0.0GB | | [granite-20b-code-instruct.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.IQ4_NL.gguf) | IQ4_NL | 0.0GB | | [granite-20b-code-instruct.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.Q4_K_S.gguf) | Q4_K_S | 0.0GB | | [granite-20b-code-instruct.Q4_K.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.Q4_K.gguf) | Q4_K | 0.0GB | | [granite-20b-code-instruct.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.Q4_K_M.gguf) | Q4_K_M | 0.0GB | | [granite-20b-code-instruct.Q4_1.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.Q4_1.gguf) | Q4_1 | 0.0GB | | [granite-20b-code-instruct.Q5_0.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.Q5_0.gguf) | Q5_0 | 0.0GB | | 
[granite-20b-code-instruct.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.Q5_K_S.gguf) | Q5_K_S | 0.0GB | | [granite-20b-code-instruct.Q5_K.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.Q5_K.gguf) | Q5_K | 0.0GB | | [granite-20b-code-instruct.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.Q5_K_M.gguf) | Q5_K_M | 0.0GB | | [granite-20b-code-instruct.Q5_1.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.Q5_1.gguf) | Q5_1 | 0.0GB | | [granite-20b-code-instruct.Q6_K.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.Q6_K.gguf) | Q6_K | 0.0GB | | [granite-20b-code-instruct.Q8_0.gguf](https://huggingface.co/RichardErkhov/ibm-granite_-_granite-20b-code-instruct-gguf/blob/main/granite-20b-code-instruct.Q8_0.gguf) | Q8_0 | 0.0GB | Original model description: --- pipeline_tag: text-generation base_model: ibm-granite/granite-20b-code-base inference: true license: apache-2.0 datasets: - bigcode/commitpackft - TIGER-Lab/MathInstruct - meta-math/MetaMathQA - glaiveai/glaive-code-assistant-v3 - glaive-function-calling-v2 - bugdaryan/sql-create-context-instruction - garage-bAInd/Open-Platypus - nvidia/HelpSteer metrics: - code_eval library_name: transformers tags: - code - granite model-index: - name: granite-20b-code-instruct results: - task: type: text-generation dataset: type: bigcode/humanevalpack name: HumanEvalSynthesis(Python) metrics: - name: pass@1 type: pass@1 value: 60.4 verified: false - task: type: text-generation dataset: type: bigcode/humanevalpack name: HumanEvalSynthesis(JavaScript) metrics: - name: pass@1 type: pass@1 value: 53.7 verified: false - task: type: text-generation dataset: type: bigcode/humanevalpack name: HumanEvalSynthesis(Java) metrics: - name: pass@1 type: pass@1 value: 58.5 verified: false - task: type: text-generation dataset: type: bigcode/humanevalpack name: HumanEvalSynthesis(Go) metrics: - name: pass@1 type: pass@1 value: 42.1 verified: false - task: type: text-generation dataset: type: bigcode/humanevalpack name: HumanEvalSynthesis(C++) metrics: - name: pass@1 type: pass@1 value: 45.7 verified: false - task: type: text-generation dataset: type: bigcode/humanevalpack name: HumanEvalSynthesis(Rust) metrics: - name: pass@1 type: pass@1 value: 42.7 verified: false - task: type: text-generation dataset: type: bigcode/humanevalpack name: HumanEvalExplain(Python) metrics: - name: pass@1 type: pass@1 value: 44.5 verified: false - task: type: text-generation dataset: type: bigcode/humanevalpack name: HumanEvalExplain(JavaScript) metrics: - name: pass@1 type: pass@1 value: 42.7 verified: false - task: type: text-generation dataset: type: bigcode/humanevalpack name: HumanEvalExplain(Java) metrics: - name: pass@1 type: pass@1 value: 49.4 verified: false - task: type: text-generation dataset: type: bigcode/humanevalpack name: HumanEvalExplain(Go) metrics: - name: pass@1 type: pass@1 value: 32.3 verified: false - task: type: text-generation dataset: type: bigcode/humanevalpack name: HumanEvalExplain(C++) metrics: - name: pass@1 type: pass@1 value: 42.1 verified: false - task: type: text-generation dataset: type: bigcode/humanevalpack name: HumanEvalExplain(Rust) metrics: - name: 
pass@1 type: pass@1 value: 18.3 verified: false - task: type: text-generation dataset: type: bigcode/humanevalpack name: HumanEvalFix(Python) metrics: - name: pass@1 type: pass@1 value: 43.9 verified: false - task: type: text-generation dataset: type: bigcode/humanevalpack name: HumanEvalFix(JavaScript) metrics: - name: pass@1 type: pass@1 value: 43.9 verified: false - task: type: text-generation dataset: type: bigcode/humanevalpack name: HumanEvalFix(Java) metrics: - name: pass@1 type: pass@1 value: 45.7 verified: false - task: type: text-generation dataset: type: bigcode/humanevalpack name: HumanEvalFix(Go) metrics: - name: pass@1 type: pass@1 value: 41.5 verified: false - task: type: text-generation dataset: type: bigcode/humanevalpack name: HumanEvalFix(C++) metrics: - name: pass@1 type: pass@1 value: 41.5 verified: false - task: type: text-generation dataset: type: bigcode/humanevalpack name: HumanEvalFix(Rust) metrics: - name: pass@1 type: pass@1 value: 29.9 verified: false --- ![image/png](https://cdn-uploads.huggingface.co/production/uploads/62cd5057674cdb524450093d/1hzxoPwqkBJXshKVVe6_9.png) # Granite-20B-Code-Instruct ## Model Summary **Granite-20B-Code-Instruct** is a 20B parameter model fine-tuned from *Granite-20B-Code-Base* on a combination of **permissively licensed** instruction data to enhance instruction-following capabilities, including logical reasoning and problem-solving skills. - **Developers:** IBM Research - **GitHub Repository:** [ibm-granite/granite-code-models](https://github.com/ibm-granite/granite-code-models) - **Paper:** [Granite Code Models: A Family of Open Foundation Models for Code Intelligence](https://arxiv.org/abs/2405.04324) - **Release Date**: May 6th, 2024 - **License:** [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0). ## Usage ### Intended use The model is designed to respond to coding-related instructions and can be used to build coding assistants. <!-- TO DO: Check starcoder2 instruct code example that includes the template https://huggingface.co/bigcode/starcoder2-15b-instruct-v0.1 --> ### Generation This is a simple example of how to use the **Granite-20B-Code-Instruct** model. ```python import torch from transformers import AutoModelForCausalLM, AutoTokenizer device = "cuda" # or "cpu" model_path = "ibm-granite/granite-20b-code-instruct" tokenizer = AutoTokenizer.from_pretrained(model_path) # drop device_map if running on CPU model = AutoModelForCausalLM.from_pretrained(model_path, device_map=device) model.eval() # change input text as desired chat = [ { "role": "user", "content": "Write a code to find the maximum value in a list of numbers." }, ] chat = tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True) # tokenize the text input_tokens = tokenizer(chat, return_tensors="pt") # transfer tokenized inputs to the device for i in input_tokens: input_tokens[i] = input_tokens[i].to(device) # generate output tokens output = model.generate(**input_tokens, max_new_tokens=100) # decode output tokens into text output = tokenizer.batch_decode(output) # loop over the batch to print, in this example the batch size is 1 for i in output: print(i) ``` <!-- TO DO: Check this part --> ## Training Data Granite Code Instruct models are trained on the following types of data. * Code Commits Datasets: we sourced code commits data from the [CommitPackFT](https://huggingface.co/datasets/bigcode/commitpackft) dataset, a filtered version of the full CommitPack dataset. 
From the CommitPackFT dataset, we only consider data for 92 programming languages. Our inclusion criteria boil down to selecting programming languages common across CommitPackFT and the 116 languages that we considered to pretrain the code-base model (*Granite-20B-Code-Base*). * Math Datasets: We consider two high-quality math datasets, [MathInstruct](https://huggingface.co/datasets/TIGER-Lab/MathInstruct) and [MetaMathQA](https://huggingface.co/datasets/meta-math/MetaMathQA). Due to license issues, we filtered out GSM8K-RFT and Camel-Math from the MathInstruct dataset. * Code Instruction Datasets: We use [Glaive-Code-Assistant-v3](https://huggingface.co/datasets/glaiveai/glaive-code-assistant-v3), [Glaive-Function-Calling-v2](https://huggingface.co/datasets/glaiveai/glaive-function-calling-v2), [NL2SQL11](https://huggingface.co/datasets/bugdaryan/sql-create-context-instruction) and a small collection of synthetic API calling datasets. * Language Instruction Datasets: We include high-quality datasets such as [HelpSteer](https://huggingface.co/datasets/nvidia/HelpSteer) and an open-license-filtered version of [Platypus](https://huggingface.co/datasets/garage-bAInd/Open-Platypus). We also include a collection of hardcoded prompts to ensure our model generates correct outputs given inquiries about its name or developers. ## Infrastructure We train the Granite Code models using two of IBM's supercomputing clusters, namely Vela and Blue Vela, outfitted with NVIDIA A100 and H100 GPUs, respectively. These clusters provide a scalable and efficient infrastructure for training our models over thousands of GPUs. ## Ethical Considerations and Limitations Granite Code Instruct models are primarily finetuned using instruction-response pairs across a specific set of programming languages. Thus, their performance may be limited on out-of-domain programming languages. In such situations, it is beneficial to provide few-shot examples to steer the model's output. Moreover, developers should perform safety testing and target-specific tuning before deploying these models in critical applications. The model also inherits ethical considerations and limitations from its base model. For more information, please refer to the *[Granite-20B-Code-Base](https://huggingface.co/ibm-granite/granite-20b-code-base)* model card.
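As GGUF files, the quants in the table above are meant for llama.cpp-style runtimes rather than the `transformers` snippet from the original card; a minimal sketch with `llama-cpp-python`, assuming a locally downloaded Q4_K_M file and that the Question/Answer layout matches what Granite's chat template expands to:

```python
from llama_cpp import Llama

# Path assumes the Q4_K_M quant from the table above was downloaded locally
llm = Llama(model_path="granite-20b-code-instruct.Q4_K_M.gguf", n_ctx=4096)

# Prompt layout is an assumption based on Granite's instruct formatting
prompt = (
    "Question:\nWrite a function that checks whether a string is a palindrome.\n\n"
    "Answer:\n"
)
output = llm(prompt, max_tokens=256)
print(output["choices"][0]["text"])
```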
apwic/nerui-lora-r8-3
apwic
2024-06-04T02:20:47Z
0
0
null
[ "tensorboard", "generated_from_trainer", "id", "base_model:indolem/indobert-base-uncased", "base_model:finetune:indolem/indobert-base-uncased", "license:mit", "region:us" ]
null
2024-05-28T13:59:10Z
--- language: - id license: mit base_model: indolem/indobert-base-uncased tags: - generated_from_trainer model-index: - name: nerui-lora-r8-3 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # nerui-lora-r8-3 This model is a fine-tuned version of [indolem/indobert-base-uncased](https://huggingface.co/indolem/indobert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0484 - Location Precision: 0.9 - Location Recall: 0.9419 - Location F1: 0.9205 - Location Number: 86 - Organization Precision: 0.9364 - Organization Recall: 0.9101 - Organization F1: 0.9231 - Organization Number: 178 - Person Precision: 0.9843 - Person Recall: 0.9766 - Person F1: 0.9804 - Person Number: 128 - Overall Precision: 0.9436 - Overall Recall: 0.9388 - Overall F1: 0.9412 - Overall Accuracy: 0.9846 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 100.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy | |:-------------:|:-----:|:----:|:---------------:|:------------------:|:---------------:|:-----------:|:---------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:----------------:|:-------------:|:---------:|:-------------:|:-----------------:|:--------------:|:----------:|:----------------:| | 1.1489 | 1.0 | 96 | 0.6808 | 0.0 | 0.0 | 0.0 | 86 | 0.0 | 0.0 | 0.0 | 178 | 0.0 | 0.0 | 0.0 | 128 | 0.0 | 0.0 | 0.0 | 0.8435 | | 0.6648 | 2.0 | 192 | 0.5508 | 0.0 | 0.0 | 0.0 | 86 | 0.5 | 0.0056 | 0.0111 | 178 | 0.0 | 0.0 | 0.0 | 128 | 0.3333 | 0.0026 | 0.0051 | 0.8437 | | 0.5545 | 3.0 | 288 | 0.4324 | 0.0 | 0.0 | 0.0 | 86 | 0.3793 | 0.0618 | 0.1063 | 178 | 0.3714 | 0.1016 | 0.1595 | 128 | 0.3636 | 0.0612 | 0.1048 | 0.8543 | | 0.4347 | 4.0 | 384 | 0.3185 | 0.3077 | 0.0465 | 0.0808 | 86 | 0.3876 | 0.2809 | 0.3257 | 178 | 0.4167 | 0.5078 | 0.4577 | 128 | 0.3993 | 0.3036 | 0.3449 | 0.8910 | | 0.3178 | 5.0 | 480 | 0.2349 | 0.5714 | 0.3721 | 0.4507 | 86 | 0.5476 | 0.6461 | 0.5928 | 178 | 0.5890 | 0.75 | 0.6598 | 128 | 0.5664 | 0.6199 | 0.5920 | 0.9320 | | 0.2406 | 6.0 | 576 | 0.1835 | 0.7407 | 0.6977 | 0.7186 | 86 | 0.6716 | 0.7584 | 0.7124 | 178 | 0.7467 | 0.875 | 0.8058 | 128 | 0.7106 | 0.7832 | 0.7451 | 0.9536 | | 0.1942 | 7.0 | 672 | 0.1519 | 0.7701 | 0.7791 | 0.7746 | 86 | 0.7114 | 0.8034 | 0.7546 | 178 | 0.8786 | 0.9609 | 0.9179 | 128 | 0.7780 | 0.8495 | 0.8122 | 0.9625 | | 0.1647 | 8.0 | 768 | 0.1279 | 0.7882 | 0.7791 | 0.7836 | 86 | 0.7487 | 0.8034 | 0.7751 | 178 | 0.8986 | 0.9688 | 0.9323 | 128 | 0.8068 | 0.8520 | 0.8288 | 0.9660 | | 0.1479 | 9.0 | 864 | 0.1130 | 0.7978 | 0.8256 | 0.8114 | 86 | 0.7602 | 0.8371 | 0.7968 | 178 | 0.9118 | 0.9688 | 0.9394 | 128 | 0.8171 | 0.8776 | 0.8462 | 
0.9690 | | 0.135 | 10.0 | 960 | 0.1037 | 0.7660 | 0.8372 | 0.8 | 86 | 0.7755 | 0.8539 | 0.8128 | 178 | 0.9179 | 0.9609 | 0.9389 | 128 | 0.8184 | 0.8852 | 0.8505 | 0.9682 | | 0.1317 | 11.0 | 1056 | 0.0951 | 0.7935 | 0.8488 | 0.8202 | 86 | 0.8182 | 0.8596 | 0.8384 | 178 | 0.9466 | 0.9688 | 0.9575 | 128 | 0.8537 | 0.8929 | 0.8728 | 0.9733 | | 0.1196 | 12.0 | 1152 | 0.0904 | 0.7708 | 0.8605 | 0.8132 | 86 | 0.8404 | 0.8876 | 0.8634 | 178 | 0.9328 | 0.9766 | 0.9542 | 128 | 0.8541 | 0.9107 | 0.8815 | 0.9749 | | 0.1108 | 13.0 | 1248 | 0.0824 | 0.7979 | 0.8721 | 0.8333 | 86 | 0.8466 | 0.8989 | 0.8719 | 178 | 0.9466 | 0.9688 | 0.9575 | 128 | 0.8671 | 0.9158 | 0.8908 | 0.9768 | | 0.107 | 14.0 | 1344 | 0.0797 | 0.8 | 0.8837 | 0.8398 | 86 | 0.8729 | 0.8876 | 0.8802 | 178 | 0.9394 | 0.9688 | 0.9538 | 128 | 0.8775 | 0.9133 | 0.895 | 0.9781 | | 0.1063 | 15.0 | 1440 | 0.0760 | 0.7872 | 0.8605 | 0.8222 | 86 | 0.8610 | 0.9045 | 0.8822 | 178 | 0.9394 | 0.9688 | 0.9538 | 128 | 0.8692 | 0.9158 | 0.8919 | 0.9776 | | 0.1 | 16.0 | 1536 | 0.0724 | 0.8462 | 0.8953 | 0.8701 | 86 | 0.8703 | 0.9045 | 0.8871 | 178 | 0.9538 | 0.9688 | 0.9612 | 128 | 0.8916 | 0.9235 | 0.9073 | 0.9795 | | 0.095 | 17.0 | 1632 | 0.0705 | 0.8261 | 0.8837 | 0.8539 | 86 | 0.8710 | 0.9101 | 0.8901 | 178 | 0.9466 | 0.9688 | 0.9575 | 128 | 0.8851 | 0.9235 | 0.9039 | 0.9789 | | 0.0932 | 18.0 | 1728 | 0.0698 | 0.8370 | 0.8953 | 0.8652 | 86 | 0.8944 | 0.9045 | 0.8994 | 178 | 0.9466 | 0.9688 | 0.9575 | 128 | 0.8983 | 0.9235 | 0.9107 | 0.9803 | | 0.0871 | 19.0 | 1824 | 0.0672 | 0.8387 | 0.9070 | 0.8715 | 86 | 0.8944 | 0.9045 | 0.8994 | 178 | 0.9466 | 0.9688 | 0.9575 | 128 | 0.8985 | 0.9260 | 0.9121 | 0.9800 | | 0.0883 | 20.0 | 1920 | 0.0650 | 0.8298 | 0.9070 | 0.8667 | 86 | 0.8944 | 0.9045 | 0.8994 | 178 | 0.9612 | 0.9688 | 0.9650 | 128 | 0.9007 | 0.9260 | 0.9132 | 0.9803 | | 0.0832 | 21.0 | 2016 | 0.0651 | 0.8298 | 0.9070 | 0.8667 | 86 | 0.8994 | 0.9045 | 0.9020 | 178 | 0.9612 | 0.9688 | 0.9650 | 128 | 0.9030 | 0.9260 | 0.9144 | 0.9811 | | 0.0829 | 22.0 | 2112 | 0.0645 | 0.8125 | 0.9070 | 0.8571 | 86 | 0.8663 | 0.9101 | 0.8877 | 178 | 0.9466 | 0.9688 | 0.9575 | 128 | 0.8792 | 0.9286 | 0.9032 | 0.9787 | | 0.0789 | 23.0 | 2208 | 0.0601 | 0.8211 | 0.9070 | 0.8619 | 86 | 0.8994 | 0.9045 | 0.9020 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9055 | 0.9286 | 0.9169 | 0.9819 | | 0.078 | 24.0 | 2304 | 0.0612 | 0.8211 | 0.9070 | 0.8619 | 86 | 0.8927 | 0.8876 | 0.8901 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9025 | 0.9209 | 0.9116 | 0.9806 | | 0.0756 | 25.0 | 2400 | 0.0594 | 0.8298 | 0.9070 | 0.8667 | 86 | 0.9045 | 0.9045 | 0.9045 | 178 | 0.9615 | 0.9766 | 0.9690 | 128 | 0.9055 | 0.9286 | 0.9169 | 0.9806 | | 0.0767 | 26.0 | 2496 | 0.0588 | 0.7822 | 0.9186 | 0.8449 | 86 | 0.8960 | 0.8708 | 0.8832 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.8930 | 0.9158 | 0.9043 | 0.9800 | | 0.0721 | 27.0 | 2592 | 0.0561 | 0.8125 | 0.9070 | 0.8571 | 86 | 0.8852 | 0.9101 | 0.8975 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.8968 | 0.9311 | 0.9136 | 0.9814 | | 0.0719 | 28.0 | 2688 | 0.0559 | 0.8404 | 0.9186 | 0.8778 | 86 | 0.9040 | 0.8989 | 0.9014 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9123 | 0.9286 | 0.9204 | 0.9819 | | 0.0702 | 29.0 | 2784 | 0.0543 | 0.8478 | 0.9070 | 0.8764 | 86 | 0.9016 | 0.9270 | 0.9141 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9132 | 0.9388 | 0.9258 | 0.9816 | | 0.0711 | 30.0 | 2880 | 0.0539 | 0.8667 | 0.9070 | 0.8864 | 86 | 0.9066 | 0.9270 | 0.9167 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9177 | 0.9388 | 0.9281 | 0.9819 | | 0.067 | 31.0 | 2976 | 0.0576 | 
0.8061 | 0.9186 | 0.8587 | 86 | 0.9101 | 0.9101 | 0.9101 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9059 | 0.9337 | 0.9196 | 0.9819 | | 0.0664 | 32.0 | 3072 | 0.0567 | 0.8211 | 0.9070 | 0.8619 | 86 | 0.9011 | 0.9213 | 0.9111 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9039 | 0.9362 | 0.9198 | 0.9814 | | 0.0642 | 33.0 | 3168 | 0.0558 | 0.8316 | 0.9186 | 0.8729 | 86 | 0.9096 | 0.9045 | 0.9070 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9125 | 0.9311 | 0.9217 | 0.9825 | | 0.0642 | 34.0 | 3264 | 0.0545 | 0.8587 | 0.9186 | 0.8876 | 86 | 0.9157 | 0.9157 | 0.9157 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9221 | 0.9362 | 0.9291 | 0.9835 | | 0.0624 | 35.0 | 3360 | 0.0542 | 0.8681 | 0.9186 | 0.8927 | 86 | 0.9111 | 0.9213 | 0.9162 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9223 | 0.9388 | 0.9305 | 0.9830 | | 0.0651 | 36.0 | 3456 | 0.0535 | 0.8778 | 0.9186 | 0.8977 | 86 | 0.9213 | 0.9213 | 0.9213 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9270 | 0.9388 | 0.9328 | 0.9833 | | 0.0635 | 37.0 | 3552 | 0.0523 | 0.8864 | 0.9070 | 0.8966 | 86 | 0.9111 | 0.9213 | 0.9162 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9268 | 0.9362 | 0.9315 | 0.9833 | | 0.0617 | 38.0 | 3648 | 0.0528 | 0.8587 | 0.9186 | 0.8876 | 86 | 0.9157 | 0.9157 | 0.9157 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9221 | 0.9362 | 0.9291 | 0.9838 | | 0.0581 | 39.0 | 3744 | 0.0548 | 0.8061 | 0.9186 | 0.8587 | 86 | 0.9091 | 0.8989 | 0.9040 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9055 | 0.9286 | 0.9169 | 0.9827 | | 0.0597 | 40.0 | 3840 | 0.0510 | 0.8778 | 0.9186 | 0.8977 | 86 | 0.9270 | 0.9270 | 0.9270 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9318 | 0.9413 | 0.9365 | 0.9846 | | 0.0569 | 41.0 | 3936 | 0.0505 | 0.8778 | 0.9186 | 0.8977 | 86 | 0.9270 | 0.9270 | 0.9270 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9318 | 0.9413 | 0.9365 | 0.9849 | | 0.0579 | 42.0 | 4032 | 0.0504 | 0.8778 | 0.9186 | 0.8977 | 86 | 0.9270 | 0.9270 | 0.9270 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9318 | 0.9413 | 0.9365 | 0.9843 | | 0.0564 | 43.0 | 4128 | 0.0506 | 0.8681 | 0.9186 | 0.8927 | 86 | 0.9106 | 0.9157 | 0.9132 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9244 | 0.9362 | 0.9303 | 0.9843 | | 0.0572 | 44.0 | 4224 | 0.0499 | 0.8681 | 0.9186 | 0.8927 | 86 | 0.9116 | 0.9270 | 0.9192 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9248 | 0.9413 | 0.9330 | 0.9849 | | 0.0563 | 45.0 | 4320 | 0.0488 | 0.8681 | 0.9186 | 0.8927 | 86 | 0.9213 | 0.9213 | 0.9213 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9293 | 0.9388 | 0.9340 | 0.9843 | | 0.0594 | 46.0 | 4416 | 0.0507 | 0.8681 | 0.9186 | 0.8927 | 86 | 0.9167 | 0.9270 | 0.9218 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9248 | 0.9413 | 0.9330 | 0.9841 | | 0.0545 | 47.0 | 4512 | 0.0497 | 0.8681 | 0.9186 | 0.8927 | 86 | 0.9162 | 0.9213 | 0.9188 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9246 | 0.9388 | 0.9316 | 0.9846 | | 0.0536 | 48.0 | 4608 | 0.0487 | 0.8681 | 0.9186 | 0.8927 | 86 | 0.9162 | 0.9213 | 0.9188 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9246 | 0.9388 | 0.9316 | 0.9849 | | 0.0556 | 49.0 | 4704 | 0.0501 | 0.8681 | 0.9186 | 0.8927 | 86 | 0.9096 | 0.9045 | 0.9070 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9217 | 0.9311 | 0.9264 | 0.9833 | | 0.0522 | 50.0 | 4800 | 0.0506 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9162 | 0.9213 | 0.9188 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9271 | 0.9413 | 0.9342 | 0.9854 | | 0.0527 | 51.0 | 4896 | 0.0496 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9318 | 0.9213 | 0.9266 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9342 | 0.9413 | 0.9377 | 0.9852 | | 0.0529 | 52.0 | 4992 | 0.0490 | 0.8791 | 0.9302 | 0.9040 | 
86 | 0.9266 | 0.9213 | 0.9239 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9293 | 0.9388 | 0.9340 | 0.9852 | | 0.0522 | 53.0 | 5088 | 0.0494 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9157 | 0.9157 | 0.9157 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9270 | 0.9388 | 0.9328 | 0.9846 | | 0.0525 | 54.0 | 5184 | 0.0482 | 0.8889 | 0.9302 | 0.9091 | 86 | 0.9270 | 0.9270 | 0.9270 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9343 | 0.9439 | 0.9391 | 0.9860 | | 0.0512 | 55.0 | 5280 | 0.0488 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9318 | 0.9213 | 0.9266 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9318 | 0.9413 | 0.9365 | 0.9854 | | 0.053 | 56.0 | 5376 | 0.0487 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9205 | 0.9101 | 0.9153 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9291 | 0.9362 | 0.9327 | 0.9849 | | 0.0498 | 57.0 | 5472 | 0.0486 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9209 | 0.9157 | 0.9183 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9293 | 0.9388 | 0.9340 | 0.9846 | | 0.0504 | 58.0 | 5568 | 0.0489 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9318 | 0.9213 | 0.9266 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9318 | 0.9413 | 0.9365 | 0.9854 | | 0.0456 | 59.0 | 5664 | 0.0492 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9148 | 0.9045 | 0.9096 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9242 | 0.9337 | 0.9289 | 0.9846 | | 0.0504 | 60.0 | 5760 | 0.0475 | 0.8681 | 0.9186 | 0.8927 | 86 | 0.9153 | 0.9101 | 0.9127 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9242 | 0.9337 | 0.9289 | 0.9849 | | 0.0494 | 61.0 | 5856 | 0.0476 | 0.8681 | 0.9186 | 0.8927 | 86 | 0.9314 | 0.9157 | 0.9235 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9315 | 0.9362 | 0.9338 | 0.9852 | | 0.046 | 62.0 | 5952 | 0.0478 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9318 | 0.9213 | 0.9266 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9391 | 0.9439 | 0.9415 | 0.9860 | | 0.0463 | 63.0 | 6048 | 0.0485 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9162 | 0.9213 | 0.9188 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9223 | 0.9388 | 0.9305 | 0.9849 | | 0.0452 | 64.0 | 6144 | 0.0482 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9213 | 0.9213 | 0.9213 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9295 | 0.9413 | 0.9354 | 0.9852 | | 0.0446 | 65.0 | 6240 | 0.0492 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9111 | 0.9213 | 0.9162 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9271 | 0.9413 | 0.9342 | 0.9854 | | 0.0463 | 66.0 | 6336 | 0.0495 | 0.8587 | 0.9186 | 0.8876 | 86 | 0.9101 | 0.9101 | 0.9101 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9196 | 0.9337 | 0.9266 | 0.9843 | | 0.0466 | 67.0 | 6432 | 0.0491 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9101 | 0.9101 | 0.9101 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9244 | 0.9362 | 0.9303 | 0.9846 | | 0.0451 | 68.0 | 6528 | 0.0499 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9111 | 0.9213 | 0.9162 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9248 | 0.9413 | 0.9330 | 0.9852 | | 0.047 | 69.0 | 6624 | 0.0493 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9209 | 0.9157 | 0.9183 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9270 | 0.9388 | 0.9328 | 0.9852 | | 0.0435 | 70.0 | 6720 | 0.0485 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9157 | 0.9157 | 0.9157 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9270 | 0.9388 | 0.9328 | 0.9849 | | 0.045 | 71.0 | 6816 | 0.0490 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9111 | 0.9213 | 0.9162 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9248 | 0.9413 | 0.9330 | 0.9852 | | 0.0458 | 72.0 | 6912 | 0.0497 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9257 | 0.9101 | 0.9178 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9340 | 0.9388 | 0.9364 | 0.9849 | | 0.0442 | 73.0 | 7008 | 0.0495 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9157 | 0.9157 | 0.9157 
| 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9295 | 0.9413 | 0.9354 | 0.9854 | | 0.0442 | 74.0 | 7104 | 0.0490 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9153 | 0.9101 | 0.9127 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9293 | 0.9388 | 0.9340 | 0.9852 | | 0.0437 | 75.0 | 7200 | 0.0487 | 0.8681 | 0.9186 | 0.8927 | 86 | 0.9209 | 0.9157 | 0.9183 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9268 | 0.9362 | 0.9315 | 0.9841 | | 0.0458 | 76.0 | 7296 | 0.0493 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9209 | 0.9157 | 0.9183 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9316 | 0.9388 | 0.9352 | 0.9843 | | 0.0448 | 77.0 | 7392 | 0.0487 | 0.8681 | 0.9186 | 0.8927 | 86 | 0.9153 | 0.9101 | 0.9127 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9266 | 0.9337 | 0.9301 | 0.9838 | | 0.0451 | 78.0 | 7488 | 0.0495 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9209 | 0.9157 | 0.9183 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9293 | 0.9388 | 0.9340 | 0.9843 | | 0.0449 | 79.0 | 7584 | 0.0498 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9213 | 0.9213 | 0.9213 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9318 | 0.9413 | 0.9365 | 0.9846 | | 0.0436 | 80.0 | 7680 | 0.0493 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9205 | 0.9101 | 0.9153 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9291 | 0.9362 | 0.9327 | 0.9843 | | 0.044 | 81.0 | 7776 | 0.0494 | 0.8804 | 0.9419 | 0.9101 | 86 | 0.9209 | 0.9157 | 0.9183 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9318 | 0.9413 | 0.9365 | 0.9852 | | 0.0438 | 82.0 | 7872 | 0.0485 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9257 | 0.9101 | 0.9178 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9291 | 0.9362 | 0.9327 | 0.9846 | | 0.0434 | 83.0 | 7968 | 0.0482 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9162 | 0.9213 | 0.9188 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9296 | 0.9439 | 0.9367 | 0.9857 | | 0.0418 | 84.0 | 8064 | 0.0485 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9101 | 0.9101 | 0.9101 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9221 | 0.9362 | 0.9291 | 0.9846 | | 0.0424 | 85.0 | 8160 | 0.0484 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9310 | 0.9101 | 0.9205 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9364 | 0.9388 | 0.9376 | 0.9849 | | 0.042 | 86.0 | 8256 | 0.0482 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9266 | 0.9213 | 0.9239 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9367 | 0.9439 | 0.9403 | 0.9857 | | 0.0431 | 87.0 | 8352 | 0.0482 | 0.8804 | 0.9419 | 0.9101 | 86 | 0.9257 | 0.9101 | 0.9178 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9340 | 0.9388 | 0.9364 | 0.9852 | | 0.0417 | 88.0 | 8448 | 0.0482 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9257 | 0.9101 | 0.9178 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9364 | 0.9388 | 0.9376 | 0.9849 | | 0.0421 | 89.0 | 8544 | 0.0482 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9261 | 0.9157 | 0.9209 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9365 | 0.9413 | 0.9389 | 0.9854 | | 0.0412 | 90.0 | 8640 | 0.0485 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9257 | 0.9101 | 0.9178 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9364 | 0.9388 | 0.9376 | 0.9852 | | 0.0407 | 91.0 | 8736 | 0.0484 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9310 | 0.9101 | 0.9205 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9388 | 0.9388 | 0.9388 | 0.9849 | | 0.0405 | 92.0 | 8832 | 0.0487 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9364 | 0.9101 | 0.9231 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9436 | 0.9388 | 0.9412 | 0.9846 | | 0.0447 | 93.0 | 8928 | 0.0487 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9364 | 0.9101 | 0.9231 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9436 | 0.9388 | 0.9412 | 0.9846 | | 0.0402 | 94.0 | 9024 | 0.0487 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9310 | 0.9101 | 0.9205 | 178 | 0.9843 | 0.9766 | 0.9804 | 
128 | 0.9412 | 0.9388 | 0.9400 | 0.9849 | | 0.0406 | 95.0 | 9120 | 0.0485 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9364 | 0.9101 | 0.9231 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9436 | 0.9388 | 0.9412 | 0.9846 | | 0.0413 | 96.0 | 9216 | 0.0485 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9364 | 0.9101 | 0.9231 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9436 | 0.9388 | 0.9412 | 0.9846 | | 0.0404 | 97.0 | 9312 | 0.0484 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9368 | 0.9157 | 0.9261 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9437 | 0.9413 | 0.9425 | 0.9852 | | 0.0403 | 98.0 | 9408 | 0.0485 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9364 | 0.9101 | 0.9231 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9436 | 0.9388 | 0.9412 | 0.9846 | | 0.0403 | 99.0 | 9504 | 0.0484 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9364 | 0.9101 | 0.9231 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9436 | 0.9388 | 0.9412 | 0.9846 | | 0.0417 | 100.0 | 9600 | 0.0484 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9364 | 0.9101 | 0.9231 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9436 | 0.9388 | 0.9412 | 0.9846 | ### Framework versions - Transformers 4.39.3 - Pytorch 2.3.0+cu121 - Datasets 2.19.1 - Tokenizers 0.15.2
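The per-entity Precision/Recall/F1/Number columns above follow the usual `seqeval` convention for span-level NER scoring; a minimal sketch of how such numbers are computed, with toy BIO tag sequences standing in for real model outputs:

```python
from seqeval.metrics import classification_report

# Toy gold and predicted tag sequences; a real run scores the model's predictions
y_true = [["B-PERSON", "I-PERSON", "O", "B-LOCATION", "O", "B-ORGANIZATION"]]
y_pred = [["B-PERSON", "I-PERSON", "O", "B-LOCATION", "O", "O"]]

# Prints precision/recall/F1 and support per entity type plus overall averages,
# mirroring the Location/Organization/Person columns of the table above
print(classification_report(y_true, y_pred))
```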
hdve/Qwen-Qwen1.5-1.8B-1717467486
hdve
2024-06-04T02:20:23Z
140
0
transformers
[ "transformers", "safetensors", "qwen2", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-06-04T02:18:38Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
rubenamtz0/llama-3-8b-lora-law2entity
rubenamtz0
2024-06-04T02:19:37Z
15
1
peft
[ "peft", "safetensors", "gguf", "llama", "axolotl", "generated_from_trainer", "dataset:rubenamtz0/law_entity_recognition", "base_model:meta-llama/Meta-Llama-3-8B", "base_model:adapter:meta-llama/Meta-Llama-3-8B", "license:llama3", "8-bit", "bitsandbytes", "region:us" ]
null
2024-06-02T01:21:16Z
--- license: llama3 library_name: peft tags: - axolotl - generated_from_trainer base_model: meta-llama/Meta-Llama-3-8B model-index: - name: llama-3-8b-lora-law2entity results: [] datasets: - rubenamtz0/law_entity_recognition --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl) <details><summary>See axolotl config</summary> axolotl version: `0.4.1` ```yaml base_model: meta-llama/Meta-Llama-3-8B model_type: LlamaForCausalLM tokenizer_type: AutoTokenizer load_in_8bit: true load_in_4bit: false strict: false datasets: - path: rubenamtz0/law_entity_recognition type: alpaca dataset_prepared_path: val_set_size: 0.1 output_dir: ./outputs/lora-law hub_model_id: rubenamtz0/llama-3-8b-lora-law2entity sequence_len: 4096 sample_packing: true pad_to_sequence_len: true adapter: lora lora_model_dir: lora_r: 32 lora_alpha: 16 lora_dropout: 0.05 lora_target_linear: true lora_fan_in_fan_out: wandb_project: entity-relationship-claim-ft wandb_entity: wandb_watch: wandb_name: wandb_log_model: gradient_accumulation_steps: 4 micro_batch_size: 2 num_epochs: 4 optimizer: adamw_bnb_8bit lr_scheduler: cosine learning_rate: 0.0002 train_on_inputs: false group_by_length: false bf16: auto fp16: tf32: false gradient_checkpointing: true early_stopping_patience: resume_from_checkpoint: local_rank: logging_steps: 1 xformers_attention: flash_attention: true s2_attention: warmup_steps: 10 evals_per_epoch: 4 eval_table_size: eval_max_new_tokens: 128 saves_per_epoch: 1 debug: deepspeed: weight_decay: 0.0 fsdp: fsdp_config: special_tokens: pad_token: <|end_of_text|> ``` </details><br> # llama-3-8b-lora-law2entity This model is a fine-tuned version of [meta-llama/Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B) on the rubenamtz0/law_entity_recognition dataset. 
It achieves the following results on the evaluation set: - Loss: 0.1490 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - distributed_type: multi-GPU - num_devices: 3 - gradient_accumulation_steps: 4 - total_train_batch_size: 24 - total_eval_batch_size: 6 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_steps: 10 - num_epochs: 4 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:------:|:----:|:---------------:| | 0.2735 | 0.05 | 1 | 0.2923 | | 0.2852 | 0.25 | 5 | 0.2742 | | 0.2007 | 0.5 | 10 | 0.2015 | | 0.1742 | 0.75 | 15 | 0.1807 | | 0.1854 | 1.0 | 20 | 0.1688 | | 0.159 | 1.1125 | 25 | 0.1630 | | 0.1444 | 1.3625 | 30 | 0.1592 | | 0.1479 | 1.6125 | 35 | 0.1565 | | 0.1505 | 1.8625 | 40 | 0.1538 | | 0.1369 | 2.1125 | 45 | 0.1518 | | 0.1348 | 2.2125 | 50 | 0.1512 | | 0.1287 | 2.4625 | 55 | 0.1510 | | 0.1359 | 2.7125 | 60 | 0.1498 | | 0.1367 | 2.9625 | 65 | 0.1491 | | 0.1218 | 3.075 | 70 | 0.1491 | | 0.1285 | 3.325 | 75 | 0.1493 | | 0.1307 | 3.575 | 80 | 0.1490 | ### Framework versions - PEFT 0.11.1 - Transformers 4.41.1 - Pytorch 2.1.2+cu118 - Datasets 2.19.1 - Tokenizers 0.19.1
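The adapter can be pulled together with its base model in one call; a minimal sketch, assuming access to the gated Meta-Llama-3-8B weights and mirroring the 8-bit loading from the axolotl config above — the instruction text is a hypothetical alpaca-style prompt, since the dataset uses the alpaca format:

```python
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer

# Downloads the base Llama-3-8B weights and attaches this LoRA adapter
model = AutoPeftModelForCausalLM.from_pretrained(
    "rubenamtz0/llama-3-8b-lora-law2entity",
    load_in_8bit=True,  # matches the load_in_8bit: true setting in the config above
)
tokenizer = AutoTokenizer.from_pretrained("rubenamtz0/llama-3-8b-lora-law2entity")

# Hypothetical alpaca-style prompt; the exact instruction wording is an assumption
prompt = "### Instruction:\nExtract the legal entities from the following clause.\n\n### Input:\nThe Lessee shall indemnify the Lessor against all claims.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=128)[0]))
```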
kiatkock/sentiment_pc_oversampler
kiatkock
2024-06-04T02:15:32Z
108
0
transformers
[ "transformers", "tensorboard", "safetensors", "bert", "text-classification", "generated_from_trainer", "base_model:ahmedrachid/FinancialBERT-Sentiment-Analysis", "base_model:finetune:ahmedrachid/FinancialBERT-Sentiment-Analysis", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2024-05-30T07:03:44Z
--- base_model: ahmedrachid/FinancialBERT-Sentiment-Analysis tags: - generated_from_trainer metrics: - accuracy - f1 model-index: - name: sentiment_pc_oversampler results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # sentiment_pc_oversampler This model is a fine-tuned version of [ahmedrachid/FinancialBERT-Sentiment-Analysis](https://huggingface.co/ahmedrachid/FinancialBERT-Sentiment-Analysis) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.3909 - Accuracy: 0.9291 - F1: 0.9288 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:------:|:----:|:---------------:|:--------:|:------:| | No log | 0.1134 | 50 | 0.5293 | 0.8154 | 0.8173 | | No log | 0.2268 | 100 | 0.4512 | 0.8222 | 0.8224 | | No log | 0.3401 | 150 | 0.4212 | 0.8356 | 0.8364 | | No log | 0.4535 | 200 | 0.3978 | 0.8395 | 0.8400 | | No log | 0.5669 | 250 | 0.3745 | 0.8631 | 0.8642 | | No log | 0.6803 | 300 | 0.3593 | 0.8667 | 0.8675 | | No log | 0.7937 | 350 | 0.3203 | 0.8821 | 0.8826 | | No log | 0.9070 | 400 | 0.3130 | 0.8880 | 0.8889 | | No log | 1.0204 | 450 | 0.3052 | 0.8903 | 0.8904 | | 0.3514 | 1.1338 | 500 | 0.3216 | 0.8948 | 0.8954 | | 0.3514 | 1.2472 | 550 | 0.3178 | 0.8979 | 0.8981 | | 0.3514 | 1.3605 | 600 | 0.3366 | 0.8874 | 0.8877 | | 0.3514 | 1.4739 | 650 | 0.3108 | 0.8951 | 0.8950 | | 0.3514 | 1.5873 | 700 | 0.2551 | 0.9198 | 0.9200 | | 0.3514 | 1.7007 | 750 | 0.3358 | 0.8911 | 0.8907 | | 0.3514 | 1.8141 | 800 | 0.2812 | 0.9127 | 0.9125 | | 0.3514 | 1.9274 | 850 | 0.2443 | 0.9240 | 0.9239 | | 0.3514 | 2.0408 | 900 | 0.3059 | 0.9183 | 0.9182 | | 0.3514 | 2.1542 | 950 | 0.3161 | 0.9155 | 0.9152 | | 0.1587 | 2.2676 | 1000 | 0.2733 | 0.9237 | 0.9235 | | 0.1587 | 2.3810 | 1050 | 0.3252 | 0.9141 | 0.9137 | | 0.1587 | 2.4943 | 1100 | 0.3257 | 0.9141 | 0.9140 | | 0.1587 | 2.6077 | 1150 | 0.2836 | 0.9254 | 0.9253 | | 0.1587 | 2.7211 | 1200 | 0.3176 | 0.9166 | 0.9163 | | 0.1587 | 2.8345 | 1250 | 0.3335 | 0.9232 | 0.9228 | | 0.1587 | 2.9478 | 1300 | 0.3076 | 0.9257 | 0.9254 | | 0.1587 | 3.0612 | 1350 | 0.3169 | 0.9269 | 0.9264 | | 0.1587 | 3.1746 | 1400 | 0.3627 | 0.9240 | 0.9238 | | 0.1587 | 3.2880 | 1450 | 0.4074 | 0.9127 | 0.9118 | | 0.0731 | 3.4014 | 1500 | 0.3580 | 0.9251 | 0.9247 | | 0.0731 | 3.5147 | 1550 | 0.3802 | 0.9240 | 0.9235 | | 0.0731 | 3.6281 | 1600 | 0.3705 | 0.9257 | 0.9253 | | 0.0731 | 3.7415 | 1650 | 0.3177 | 0.9362 | 0.9361 | | 0.0731 | 3.8549 | 1700 | 0.3563 | 0.9314 | 0.9310 | | 0.0731 | 3.9683 | 1750 | 0.4248 | 0.9158 | 0.9154 | | 0.0731 | 4.0816 | 1800 | 0.3535 | 0.9314 | 0.9310 | | 0.0731 | 4.1950 | 1850 | 0.3568 | 0.9308 | 0.9305 | | 0.0731 | 4.3084 | 1900 | 0.4044 | 0.9266 | 0.9264 | | 0.0731 | 4.4218 | 1950 | 0.3598 | 0.9331 | 0.9327 | | 0.0358 | 4.5351 | 2000 | 0.3909 | 0.9291 | 0.9288 | | 0.0358 | 4.6485 | 2050 | 0.3725 | 0.9325 | 0.9322 | | 0.0358 | 4.7619 | 
2100 | 0.3953 | 0.9305 | 0.9303 | | 0.0358 | 4.8753 | 2150 | 0.3902 | 0.9305 | 0.9302 | | 0.0358 | 4.9887 | 2200 | 0.3960 | 0.9286 | 0.9282 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.3.0+cu121 - Datasets 2.19.2 - Tokenizers 0.19.1
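As a quick sanity check of the fine-tuned classifier, the checkpoint can be loaded through the standard `text-classification` pipeline. This is a minimal sketch, assuming the model is published under a hub id like `<user>/sentiment_pc_oversampler` (the actual repo path is not shown here, so substitute it); the label names come from the model config.

```python
from transformers import pipeline

# Hypothetical repo id -- substitute the actual hub path of this checkpoint.
clf = pipeline("text-classification", model="<user>/sentiment_pc_oversampler")

# The base model is FinancialBERT, so financial-news style inputs are the natural fit.
print(clf("Quarterly revenue rose 12% year over year."))
```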
apwic/nerui-lora-r16-2
apwic
2024-06-04T02:11:49Z
0
0
null
[ "tensorboard", "generated_from_trainer", "id", "base_model:indolem/indobert-base-uncased", "base_model:finetune:indolem/indobert-base-uncased", "license:mit", "region:us" ]
null
2024-05-28T13:41:36Z
--- language: - id license: mit base_model: indolem/indobert-base-uncased tags: - generated_from_trainer model-index: - name: nerui-lora-r16-2 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # nerui-lora-r16-2 This model is a fine-tuned version of [indolem/indobert-base-uncased](https://huggingface.co/indolem/indobert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0417 - Location Precision: 0.8713 - Location Recall: 0.9462 - Location F1: 0.9072 - Location Number: 93 - Organization Precision: 0.8909 - Organization Recall: 0.8855 - Organization F1: 0.8882 - Organization Number: 166 - Person Precision: 0.9787 - Person Recall: 0.9718 - Person F1: 0.9753 - Person Number: 142 - Overall Precision: 0.9165 - Overall Recall: 0.9302 - Overall F1: 0.9233 - Overall Accuracy: 0.9868 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 100.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy | |:-------------:|:-----:|:----:|:---------------:|:------------------:|:---------------:|:-----------:|:---------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:----------------:|:-------------:|:---------:|:-------------:|:-----------------:|:--------------:|:----------:|:----------------:| | 1.0607 | 1.0 | 96 | 0.6772 | 0.0 | 0.0 | 0.0 | 93 | 0.0 | 0.0 | 0.0 | 166 | 0.0 | 0.0 | 0.0 | 142 | 0.0 | 0.0 | 0.0 | 0.8343 | | 0.6351 | 2.0 | 192 | 0.5251 | 0.0 | 0.0 | 0.0 | 93 | 0.5 | 0.0120 | 0.0235 | 166 | 0.0 | 0.0 | 0.0 | 142 | 0.3333 | 0.0050 | 0.0098 | 0.8348 | | 0.4897 | 3.0 | 288 | 0.3649 | 0.0 | 0.0 | 0.0 | 93 | 0.3529 | 0.2169 | 0.2687 | 166 | 0.3286 | 0.3239 | 0.3262 | 142 | 0.3267 | 0.2045 | 0.2515 | 0.8763 | | 0.335 | 4.0 | 384 | 0.2323 | 0.3684 | 0.3011 | 0.3314 | 93 | 0.5099 | 0.6205 | 0.5598 | 166 | 0.5683 | 0.7324 | 0.6400 | 142 | 0.5098 | 0.5860 | 0.5452 | 0.9289 | | 0.2342 | 5.0 | 480 | 0.1642 | 0.5895 | 0.6022 | 0.5957 | 93 | 0.6396 | 0.7590 | 0.6942 | 166 | 0.8269 | 0.9085 | 0.8658 | 142 | 0.6942 | 0.7756 | 0.7326 | 0.9564 | | 0.1832 | 6.0 | 576 | 0.1316 | 0.7027 | 0.8387 | 0.7647 | 93 | 0.7432 | 0.8193 | 0.7794 | 166 | 0.9257 | 0.9648 | 0.9448 | 142 | 0.7941 | 0.8753 | 0.8327 | 0.9657 | | 0.1526 | 7.0 | 672 | 0.1085 | 0.7692 | 0.8602 | 0.8122 | 93 | 0.7433 | 0.8373 | 0.7875 | 166 | 0.9079 | 0.9718 | 0.9388 | 142 | 0.8059 | 0.8903 | 0.8460 | 0.9690 | | 0.136 | 8.0 | 768 | 0.0910 | 0.75 | 0.8710 | 0.8060 | 93 | 0.8011 | 0.8494 | 0.8246 | 166 | 0.9262 | 0.9718 | 0.9485 | 142 | 0.8314 | 0.8978 | 0.8633 | 0.9734 | | 0.1234 | 9.0 | 864 | 0.0817 | 0.7981 | 0.8925 | 0.8426 | 93 | 0.8229 | 0.8675 | 0.8446 | 166 | 0.9133 | 0.9648 | 0.9384 | 142 | 0.8485 | 0.9077 | 
0.8771 | 0.9753 | | 0.1123 | 10.0 | 960 | 0.0774 | 0.7981 | 0.8925 | 0.8426 | 93 | 0.8207 | 0.9096 | 0.8629 | 166 | 0.9388 | 0.9718 | 0.9550 | 142 | 0.8552 | 0.9277 | 0.8900 | 0.9772 | | 0.1042 | 11.0 | 1056 | 0.0683 | 0.8039 | 0.8817 | 0.8410 | 93 | 0.8371 | 0.8976 | 0.8663 | 166 | 0.9448 | 0.9648 | 0.9547 | 142 | 0.8659 | 0.9177 | 0.8910 | 0.9789 | | 0.1 | 12.0 | 1152 | 0.0661 | 0.8317 | 0.9032 | 0.8660 | 93 | 0.8436 | 0.9096 | 0.8754 | 166 | 0.9514 | 0.9648 | 0.9580 | 142 | 0.8774 | 0.9277 | 0.9018 | 0.9800 | | 0.0949 | 13.0 | 1248 | 0.0622 | 0.8416 | 0.9140 | 0.8763 | 93 | 0.8571 | 0.9036 | 0.8798 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.8878 | 0.9277 | 0.9073 | 0.9811 | | 0.091 | 14.0 | 1344 | 0.0597 | 0.8173 | 0.9140 | 0.8629 | 93 | 0.8788 | 0.8735 | 0.8761 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.8908 | 0.9152 | 0.9028 | 0.9802 | | 0.0852 | 15.0 | 1440 | 0.0593 | 0.84 | 0.9032 | 0.8705 | 93 | 0.8306 | 0.9157 | 0.8711 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.8779 | 0.9327 | 0.9045 | 0.9800 | | 0.0874 | 16.0 | 1536 | 0.0591 | 0.7838 | 0.9355 | 0.8529 | 93 | 0.8538 | 0.8795 | 0.8665 | 166 | 0.9514 | 0.9648 | 0.9580 | 142 | 0.8685 | 0.9227 | 0.8948 | 0.9797 | | 0.0817 | 17.0 | 1632 | 0.0538 | 0.8350 | 0.9247 | 0.8776 | 93 | 0.8876 | 0.9036 | 0.8955 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.8988 | 0.9302 | 0.9142 | 0.9830 | | 0.0784 | 18.0 | 1728 | 0.0511 | 0.8350 | 0.9247 | 0.8776 | 93 | 0.8830 | 0.9096 | 0.8961 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.8969 | 0.9327 | 0.9144 | 0.9833 | | 0.0764 | 19.0 | 1824 | 0.0523 | 0.7890 | 0.9247 | 0.8515 | 93 | 0.8841 | 0.8735 | 0.8788 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.8892 | 0.9202 | 0.9044 | 0.9822 | | 0.0735 | 20.0 | 1920 | 0.0524 | 0.8018 | 0.9570 | 0.8725 | 93 | 0.8889 | 0.8675 | 0.8780 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.8940 | 0.9252 | 0.9093 | 0.9819 | | 0.074 | 21.0 | 2016 | 0.0519 | 0.8 | 0.9462 | 0.8670 | 93 | 0.8788 | 0.8735 | 0.8761 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.8897 | 0.9252 | 0.9071 | 0.9822 | | 0.0695 | 22.0 | 2112 | 0.0529 | 0.7857 | 0.9462 | 0.8585 | 93 | 0.8353 | 0.8554 | 0.8452 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.8679 | 0.9177 | 0.8921 | 0.9805 | | 0.0673 | 23.0 | 2208 | 0.0519 | 0.8056 | 0.9355 | 0.8657 | 93 | 0.9045 | 0.8554 | 0.8793 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9017 | 0.9152 | 0.9084 | 0.9824 | | 0.0677 | 24.0 | 2304 | 0.0530 | 0.7982 | 0.9355 | 0.8614 | 93 | 0.9045 | 0.8554 | 0.8793 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.8995 | 0.9152 | 0.9073 | 0.9811 | | 0.0649 | 25.0 | 2400 | 0.0501 | 0.8018 | 0.9570 | 0.8725 | 93 | 0.8994 | 0.8614 | 0.8800 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.8981 | 0.9227 | 0.9102 | 0.9822 | | 0.0647 | 26.0 | 2496 | 0.0478 | 0.8365 | 0.9355 | 0.8832 | 93 | 0.9057 | 0.8675 | 0.8862 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9111 | 0.9202 | 0.9156 | 0.9838 | | 0.0579 | 27.0 | 2592 | 0.0466 | 0.8208 | 0.9355 | 0.8744 | 93 | 0.8963 | 0.8855 | 0.8909 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9029 | 0.9277 | 0.9151 | 0.9835 | | 0.0627 | 28.0 | 2688 | 0.0488 | 0.8131 | 0.9355 | 0.8700 | 93 | 0.8855 | 0.8855 | 0.8855 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.8964 | 0.9277 | 0.9118 | 0.9819 | | 0.0601 | 29.0 | 2784 | 0.0487 | 0.8131 | 0.9355 | 0.8700 | 93 | 0.8882 | 0.9096 | 0.8988 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.8974 | 0.9377 | 0.9171 | 0.9827 | | 0.0575 | 30.0 | 2880 | 0.0459 | 0.8286 | 0.9355 | 0.8788 | 93 | 0.8922 | 0.8976 | 0.8949 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9034 | 0.9327 | 0.9178 | 0.9833 | | 0.0569 | 31.0 | 
2976 | 0.0455 | 0.8073 | 0.9462 | 0.8713 | 93 | 0.8951 | 0.8735 | 0.8841 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.8983 | 0.9252 | 0.9115 | 0.9841 | | 0.0548 | 32.0 | 3072 | 0.0445 | 0.8224 | 0.9462 | 0.88 | 93 | 0.8889 | 0.8675 | 0.8780 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9002 | 0.9227 | 0.9113 | 0.9846 | | 0.0528 | 33.0 | 3168 | 0.0471 | 0.7946 | 0.9570 | 0.8683 | 93 | 0.8944 | 0.8675 | 0.8807 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.8986 | 0.9277 | 0.9129 | 0.9827 | | 0.0533 | 34.0 | 3264 | 0.0445 | 0.8073 | 0.9462 | 0.8713 | 93 | 0.8802 | 0.8855 | 0.8829 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.8947 | 0.9327 | 0.9133 | 0.9833 | | 0.0503 | 35.0 | 3360 | 0.0425 | 0.8286 | 0.9355 | 0.8788 | 93 | 0.8922 | 0.8976 | 0.8949 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9034 | 0.9327 | 0.9178 | 0.9852 | | 0.0531 | 36.0 | 3456 | 0.0447 | 0.7928 | 0.9462 | 0.8627 | 93 | 0.8957 | 0.8795 | 0.8875 | 166 | 0.9648 | 0.9648 | 0.9648 | 142 | 0.8918 | 0.9252 | 0.9082 | 0.9830 | | 0.0493 | 37.0 | 3552 | 0.0442 | 0.8365 | 0.9355 | 0.8832 | 93 | 0.8970 | 0.8916 | 0.8943 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9075 | 0.9302 | 0.9187 | 0.9841 | | 0.05 | 38.0 | 3648 | 0.0423 | 0.87 | 0.9355 | 0.9016 | 93 | 0.9042 | 0.9096 | 0.9069 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9193 | 0.9377 | 0.9284 | 0.9857 | | 0.0489 | 39.0 | 3744 | 0.0416 | 0.8529 | 0.9355 | 0.8923 | 93 | 0.8994 | 0.9157 | 0.9075 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9128 | 0.9401 | 0.9263 | 0.9855 | | 0.0481 | 40.0 | 3840 | 0.0411 | 0.8544 | 0.9462 | 0.8980 | 93 | 0.9068 | 0.8795 | 0.8930 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9163 | 0.9277 | 0.9219 | 0.9852 | | 0.0462 | 41.0 | 3936 | 0.0429 | 0.8286 | 0.9355 | 0.8788 | 93 | 0.9036 | 0.9036 | 0.9036 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9102 | 0.9352 | 0.9225 | 0.9855 | | 0.0468 | 42.0 | 4032 | 0.0435 | 0.8302 | 0.9462 | 0.8844 | 93 | 0.9030 | 0.8976 | 0.9003 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9126 | 0.9377 | 0.9250 | 0.9846 | | 0.0469 | 43.0 | 4128 | 0.0423 | 0.8878 | 0.9355 | 0.9110 | 93 | 0.8976 | 0.8976 | 0.8976 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9259 | 0.9352 | 0.9305 | 0.9860 | | 0.0472 | 44.0 | 4224 | 0.0460 | 0.8148 | 0.9462 | 0.8756 | 93 | 0.8938 | 0.8614 | 0.8773 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9 | 0.9202 | 0.9100 | 0.9830 | | 0.0468 | 45.0 | 4320 | 0.0420 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9062 | 0.8735 | 0.8896 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9254 | 0.9277 | 0.9265 | 0.9852 | | 0.0453 | 46.0 | 4416 | 0.0425 | 0.8462 | 0.9462 | 0.8934 | 93 | 0.8994 | 0.8614 | 0.8800 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9111 | 0.9202 | 0.9156 | 0.9852 | | 0.0428 | 47.0 | 4512 | 0.0432 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.8902 | 0.9277 | 0.9086 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9177 | 0.9451 | 0.9312 | 0.9855 | | 0.043 | 48.0 | 4608 | 0.0433 | 0.8381 | 0.9462 | 0.8889 | 93 | 0.8924 | 0.8494 | 0.8704 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9062 | 0.9152 | 0.9107 | 0.9841 | | 0.0443 | 49.0 | 4704 | 0.0437 | 0.8529 | 0.9355 | 0.8923 | 93 | 0.8929 | 0.9036 | 0.8982 | 166 | 0.9648 | 0.9648 | 0.9648 | 142 | 0.9078 | 0.9327 | 0.9200 | 0.9846 | | 0.0466 | 50.0 | 4800 | 0.0430 | 0.8627 | 0.9462 | 0.9026 | 93 | 0.8922 | 0.8976 | 0.8949 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9146 | 0.9352 | 0.9248 | 0.9860 | | 0.0419 | 51.0 | 4896 | 0.0430 | 0.8462 | 0.9462 | 0.8934 | 93 | 0.8951 | 0.8735 | 0.8841 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9115 | 0.9252 | 0.9183 | 0.9852 | | 0.0421 | 52.0 | 4992 | 0.0404 | 0.9158 | 0.9355 | 
0.9255 | 93 | 0.8953 | 0.9277 | 0.9112 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9874 | | 0.0409 | 53.0 | 5088 | 0.0431 | 0.8462 | 0.9462 | 0.8934 | 93 | 0.8982 | 0.9036 | 0.9009 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9126 | 0.9377 | 0.9250 | 0.9857 | | 0.0391 | 54.0 | 5184 | 0.0417 | 0.8969 | 0.9355 | 0.9158 | 93 | 0.9012 | 0.9337 | 0.9172 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9268 | 0.9476 | 0.9371 | 0.9868 | | 0.0383 | 55.0 | 5280 | 0.0402 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9053 | 0.9217 | 0.9134 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9877 | | 0.0399 | 56.0 | 5376 | 0.0431 | 0.8627 | 0.9462 | 0.9026 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9197 | 0.9426 | 0.9310 | 0.9855 | | 0.04 | 57.0 | 5472 | 0.0425 | 0.8544 | 0.9462 | 0.8980 | 93 | 0.9024 | 0.8916 | 0.8970 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9167 | 0.9327 | 0.9246 | 0.9855 | | 0.04 | 58.0 | 5568 | 0.0422 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9146 | 0.9036 | 0.9091 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9261 | 0.9377 | 0.9318 | 0.9868 | | 0.0372 | 59.0 | 5664 | 0.0425 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9036 | 0.9036 | 0.9036 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9216 | 0.9377 | 0.9295 | 0.9863 | | 0.0384 | 60.0 | 5760 | 0.0422 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9146 | 0.9036 | 0.9091 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9261 | 0.9377 | 0.9318 | 0.9866 | | 0.0379 | 61.0 | 5856 | 0.0402 | 0.8627 | 0.9462 | 0.9026 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9216 | 0.9377 | 0.9295 | 0.9877 | | 0.0362 | 62.0 | 5952 | 0.0387 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9036 | 0.9036 | 0.9036 | 166 | 0.9648 | 0.9648 | 0.9648 | 142 | 0.9214 | 0.9352 | 0.9282 | 0.9871 | | 0.036 | 63.0 | 6048 | 0.0424 | 0.8381 | 0.9462 | 0.8889 | 93 | 0.9030 | 0.8976 | 0.9003 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9124 | 0.9352 | 0.9236 | 0.9852 | | 0.036 | 64.0 | 6144 | 0.0404 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9024 | 0.8916 | 0.8970 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9165 | 0.9302 | 0.9233 | 0.9857 | | 0.033 | 65.0 | 6240 | 0.0419 | 0.8544 | 0.9462 | 0.8980 | 93 | 0.9030 | 0.8976 | 0.9003 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9169 | 0.9352 | 0.9259 | 0.9857 | | 0.0348 | 66.0 | 6336 | 0.0396 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9024 | 0.8916 | 0.8970 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9235 | 0.9327 | 0.9280 | 0.9868 | | 0.0346 | 67.0 | 6432 | 0.0410 | 0.8627 | 0.9462 | 0.9026 | 93 | 0.8862 | 0.8916 | 0.8889 | 166 | 0.9648 | 0.9648 | 0.9648 | 142 | 0.9075 | 0.9302 | 0.9187 | 0.9849 | | 0.0337 | 68.0 | 6528 | 0.0416 | 0.8544 | 0.9462 | 0.8980 | 93 | 0.9030 | 0.8976 | 0.9003 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9169 | 0.9352 | 0.9259 | 0.9857 | | 0.0355 | 69.0 | 6624 | 0.0418 | 0.8627 | 0.9462 | 0.9026 | 93 | 0.8909 | 0.8855 | 0.8882 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9142 | 0.9302 | 0.9221 | 0.9855 | | 0.0337 | 70.0 | 6720 | 0.0408 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9146 | 0.9036 | 0.9091 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9238 | 0.9377 | 0.9307 | 0.9863 | | 0.0351 | 71.0 | 6816 | 0.0411 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9263 | 0.9401 | 0.9332 | 0.9860 | | 0.0337 | 72.0 | 6912 | 0.0411 | 0.9072 | 0.9462 | 0.9263 | 93 | 0.8929 | 0.9036 | 0.8982 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9261 | 0.9377 | 0.9318 | 0.9866 | | 0.0317 | 73.0 | 7008 | 0.0415 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9036 | 0.9036 | 
0.9036 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9216 | 0.9377 | 0.9295 | 0.9860 | | 0.0308 | 74.0 | 7104 | 0.0442 | 0.8558 | 0.9570 | 0.9036 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9860 | | 0.0331 | 75.0 | 7200 | 0.0416 | 0.9072 | 0.9462 | 0.9263 | 93 | 0.9053 | 0.9217 | 0.9134 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9312 | 0.9451 | 0.9381 | 0.9879 | | 0.0307 | 76.0 | 7296 | 0.0426 | 0.8725 | 0.9570 | 0.9128 | 93 | 0.8963 | 0.8855 | 0.8909 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9189 | 0.9327 | 0.9257 | 0.9860 | | 0.0311 | 77.0 | 7392 | 0.0411 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.8869 | 0.8976 | 0.8922 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9191 | 0.9352 | 0.9271 | 0.9871 | | 0.0321 | 78.0 | 7488 | 0.0421 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.8862 | 0.8916 | 0.8889 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9144 | 0.9327 | 0.9235 | 0.9863 | | 0.0314 | 79.0 | 7584 | 0.0419 | 0.88 | 0.9462 | 0.9119 | 93 | 0.8869 | 0.8976 | 0.8922 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9169 | 0.9352 | 0.9259 | 0.9866 | | 0.0327 | 80.0 | 7680 | 0.0420 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9096 | 0.9096 | 0.9096 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9263 | 0.9401 | 0.9332 | 0.9868 | | 0.0338 | 81.0 | 7776 | 0.0423 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9238 | 0.9377 | 0.9307 | 0.9871 | | 0.0326 | 82.0 | 7872 | 0.0430 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9080 | 0.8916 | 0.8997 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9235 | 0.9327 | 0.9280 | 0.9857 | | 0.0311 | 83.0 | 7968 | 0.0420 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.8970 | 0.8916 | 0.8943 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9235 | 0.9327 | 0.9280 | 0.9857 | | 0.0319 | 84.0 | 8064 | 0.0435 | 0.8462 | 0.9462 | 0.8934 | 93 | 0.8970 | 0.8916 | 0.8943 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9122 | 0.9327 | 0.9223 | 0.9855 | | 0.0312 | 85.0 | 8160 | 0.0414 | 0.88 | 0.9462 | 0.9119 | 93 | 0.8909 | 0.8855 | 0.8882 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9187 | 0.9302 | 0.9244 | 0.9863 | | 0.0313 | 86.0 | 8256 | 0.0418 | 0.88 | 0.9462 | 0.9119 | 93 | 0.8862 | 0.8916 | 0.8889 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9167 | 0.9327 | 0.9246 | 0.9866 | | 0.0315 | 87.0 | 8352 | 0.0414 | 0.88 | 0.9462 | 0.9119 | 93 | 0.8916 | 0.8916 | 0.8916 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9189 | 0.9327 | 0.9257 | 0.9868 | | 0.0314 | 88.0 | 8448 | 0.0415 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9024 | 0.8916 | 0.8970 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9235 | 0.9327 | 0.9280 | 0.9866 | | 0.0301 | 89.0 | 8544 | 0.0416 | 0.88 | 0.9462 | 0.9119 | 93 | 0.8970 | 0.8916 | 0.8943 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9212 | 0.9327 | 0.9269 | 0.9868 | | 0.0303 | 90.0 | 8640 | 0.0410 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9030 | 0.8976 | 0.9003 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9236 | 0.9352 | 0.9294 | 0.9866 | | 0.0292 | 91.0 | 8736 | 0.0412 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.8909 | 0.8855 | 0.8882 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9165 | 0.9302 | 0.9233 | 0.9863 | | 0.0292 | 92.0 | 8832 | 0.0424 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9080 | 0.8916 | 0.8997 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9257 | 0.9327 | 0.9292 | 0.9868 | | 0.0295 | 93.0 | 8928 | 0.0426 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9080 | 0.8916 | 0.8997 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9257 | 0.9327 | 0.9292 | 0.9866 | | 0.0304 | 94.0 | 9024 | 0.0422 | 0.88 | 0.9462 | 0.9119 | 93 | 0.8963 | 0.8855 | 0.8909 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 
0.9210 | 0.9302 | 0.9256 | 0.9866 | | 0.0304 | 95.0 | 9120 | 0.0415 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.8855 | 0.8855 | 0.8855 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9142 | 0.9302 | 0.9221 | 0.9866 | | 0.0312 | 96.0 | 9216 | 0.0415 | 0.88 | 0.9462 | 0.9119 | 93 | 0.8862 | 0.8916 | 0.8889 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9167 | 0.9327 | 0.9246 | 0.9868 | | 0.0291 | 97.0 | 9312 | 0.0418 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.8855 | 0.8855 | 0.8855 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9142 | 0.9302 | 0.9221 | 0.9866 | | 0.0306 | 98.0 | 9408 | 0.0417 | 0.88 | 0.9462 | 0.9119 | 93 | 0.8916 | 0.8916 | 0.8916 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9189 | 0.9327 | 0.9257 | 0.9871 | | 0.0293 | 99.0 | 9504 | 0.0417 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.8909 | 0.8855 | 0.8882 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9165 | 0.9302 | 0.9233 | 0.9868 | | 0.0302 | 100.0 | 9600 | 0.0417 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.8909 | 0.8855 | 0.8882 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9165 | 0.9302 | 0.9233 | 0.9868 | ### Framework versions - Transformers 4.39.3 - Pytorch 2.3.0+cu121 - Datasets 2.19.1 - Tokenizers 0.15.2
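Given the `-lora-` naming and the indobert base model, the checkpoint is presumably a PEFT/LoRA adapter for token classification rather than full weights. A minimal inference sketch under that assumption (the Indonesian example sentence is illustrative only):

```python
from peft import AutoPeftModelForTokenClassification
from transformers import AutoTokenizer

# Assumes the repo hosts a LoRA adapter on top of indolem/indobert-base-uncased.
model = AutoPeftModelForTokenClassification.from_pretrained("apwic/nerui-lora-r16-2")
tokenizer = AutoTokenizer.from_pretrained("indolem/indobert-base-uncased")

inputs = tokenizer("Joko Widodo lahir di Surakarta.", return_tensors="pt")
logits = model(**inputs).logits
pred_ids = logits.argmax(dim=-1)[0].tolist()  # one predicted label id per token
print(pred_ids)
```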
Ignacio10043/vessels
Ignacio10043
2024-06-04T02:07:53Z
0
0
null
[ "text-classification", "dataset:nvidia/ChatQA-Training-Data", "dataset:HuggingFaceFW/fineweb", "license:apache-2.0", "region:us" ]
text-classification
2024-06-04T02:02:45Z
---
license: apache-2.0
datasets:
- nvidia/ChatQA-Training-Data
- HuggingFaceFW/fineweb
metrics:
- accuracy
- code_eval
pipeline_tag: text-classification
---
apwic/nerui-lora-r8-2
apwic
2024-06-04T02:03:00Z
0
0
null
[ "tensorboard", "generated_from_trainer", "id", "base_model:indolem/indobert-base-uncased", "base_model:finetune:indolem/indobert-base-uncased", "license:mit", "region:us" ]
null
2024-05-28T13:23:49Z
--- language: - id license: mit base_model: indolem/indobert-base-uncased tags: - generated_from_trainer model-index: - name: nerui-lora-r8-2 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # nerui-lora-r8-2 This model is a fine-tuned version of [indolem/indobert-base-uncased](https://huggingface.co/indolem/indobert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0395 - Location Precision: 0.88 - Location Recall: 0.9462 - Location F1: 0.9119 - Location Number: 93 - Organization Precision: 0.9048 - Organization Recall: 0.9157 - Organization F1: 0.9102 - Organization Number: 166 - Person Precision: 0.9718 - Person Recall: 0.9718 - Person F1: 0.9718 - Person Number: 142 - Overall Precision: 0.9220 - Overall Recall: 0.9426 - Overall F1: 0.9322 - Overall Accuracy: 0.9874 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 100.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy | |:-------------:|:-----:|:----:|:---------------:|:------------------:|:---------------:|:-----------:|:---------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:----------------:|:-------------:|:---------:|:-------------:|:-----------------:|:--------------:|:----------:|:----------------:| | 1.1704 | 1.0 | 96 | 0.7085 | 0.0 | 0.0 | 0.0 | 93 | 0.0 | 0.0 | 0.0 | 166 | 0.0 | 0.0 | 0.0 | 142 | 0.0 | 0.0 | 0.0 | 0.8343 | | 0.668 | 2.0 | 192 | 0.5723 | 0.0 | 0.0 | 0.0 | 93 | 0.5 | 0.0060 | 0.0119 | 166 | 0.0 | 0.0 | 0.0 | 142 | 0.3333 | 0.0025 | 0.0050 | 0.8348 | | 0.5537 | 3.0 | 288 | 0.4494 | 0.0 | 0.0 | 0.0 | 93 | 0.4167 | 0.0602 | 0.1053 | 166 | 0.2353 | 0.0563 | 0.0909 | 142 | 0.3 | 0.0449 | 0.0781 | 0.8455 | | 0.4382 | 4.0 | 384 | 0.3281 | 0.2727 | 0.0645 | 0.1043 | 93 | 0.3710 | 0.2771 | 0.3172 | 166 | 0.3882 | 0.4648 | 0.4231 | 142 | 0.3734 | 0.2943 | 0.3291 | 0.8883 | | 0.32 | 5.0 | 480 | 0.2350 | 0.3857 | 0.2903 | 0.3313 | 93 | 0.5231 | 0.6145 | 0.5651 | 166 | 0.5886 | 0.7254 | 0.6498 | 142 | 0.5273 | 0.5786 | 0.5517 | 0.9292 | | 0.2426 | 6.0 | 576 | 0.1839 | 0.5745 | 0.5806 | 0.5775 | 93 | 0.6158 | 0.7530 | 0.6775 | 166 | 0.7636 | 0.8873 | 0.8208 | 142 | 0.6602 | 0.7606 | 0.7068 | 0.9512 | | 0.1962 | 7.0 | 672 | 0.1463 | 0.7188 | 0.7419 | 0.7302 | 93 | 0.6804 | 0.7952 | 0.7333 | 166 | 0.8903 | 0.9718 | 0.9293 | 142 | 0.7618 | 0.8454 | 0.8014 | 0.9619 | | 0.1696 | 8.0 | 768 | 0.1200 | 0.7732 | 0.8065 | 0.7895 | 93 | 0.7312 | 0.8193 | 0.7727 | 166 | 0.9133 | 0.9648 | 0.9384 | 142 | 0.8037 | 0.8678 | 0.8345 | 0.9682 | | 0.1508 | 9.0 | 864 | 0.1069 | 0.8 | 0.8602 | 0.8290 | 93 | 0.7473 | 0.8373 | 0.7898 | 166 | 0.9079 | 0.9718 | 0.9388 | 142 | 0.8151 | 0.8903 | 0.8510 | 
0.9695 | | 0.1359 | 10.0 | 960 | 0.0937 | 0.7980 | 0.8495 | 0.8229 | 93 | 0.7581 | 0.8494 | 0.8011 | 166 | 0.9195 | 0.9648 | 0.9416 | 142 | 0.8226 | 0.8903 | 0.8551 | 0.9712 | | 0.126 | 11.0 | 1056 | 0.0873 | 0.7843 | 0.8602 | 0.8205 | 93 | 0.7772 | 0.8614 | 0.8171 | 166 | 0.9133 | 0.9648 | 0.9384 | 142 | 0.8257 | 0.8978 | 0.8602 | 0.9726 | | 0.1191 | 12.0 | 1152 | 0.0826 | 0.7885 | 0.8817 | 0.8325 | 93 | 0.7861 | 0.8855 | 0.8329 | 166 | 0.9195 | 0.9648 | 0.9416 | 142 | 0.8318 | 0.9127 | 0.8704 | 0.9739 | | 0.1126 | 13.0 | 1248 | 0.0742 | 0.8235 | 0.9032 | 0.8615 | 93 | 0.8167 | 0.8855 | 0.8497 | 166 | 0.9320 | 0.9648 | 0.9481 | 142 | 0.8578 | 0.9177 | 0.8867 | 0.9770 | | 0.1061 | 14.0 | 1344 | 0.0707 | 0.85 | 0.9140 | 0.8808 | 93 | 0.8439 | 0.8795 | 0.8614 | 166 | 0.9320 | 0.9648 | 0.9481 | 142 | 0.8762 | 0.9177 | 0.8965 | 0.9789 | | 0.1003 | 15.0 | 1440 | 0.0703 | 0.86 | 0.9247 | 0.8912 | 93 | 0.8278 | 0.8976 | 0.8613 | 166 | 0.9448 | 0.9648 | 0.9547 | 142 | 0.8753 | 0.9277 | 0.9007 | 0.9783 | | 0.1008 | 16.0 | 1536 | 0.0686 | 0.8529 | 0.9355 | 0.8923 | 93 | 0.8287 | 0.9036 | 0.8646 | 166 | 0.9320 | 0.9648 | 0.9481 | 142 | 0.8698 | 0.9327 | 0.9001 | 0.9778 | | 0.0957 | 17.0 | 1632 | 0.0617 | 0.86 | 0.9247 | 0.8912 | 93 | 0.8613 | 0.8976 | 0.8791 | 166 | 0.9514 | 0.9648 | 0.9580 | 142 | 0.8921 | 0.9277 | 0.9095 | 0.9802 | | 0.0923 | 18.0 | 1728 | 0.0594 | 0.8687 | 0.9247 | 0.8958 | 93 | 0.8713 | 0.8976 | 0.8843 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9007 | 0.9277 | 0.9140 | 0.9819 | | 0.0894 | 19.0 | 1824 | 0.0591 | 0.8529 | 0.9355 | 0.8923 | 93 | 0.8497 | 0.8855 | 0.8673 | 166 | 0.9448 | 0.9648 | 0.9547 | 142 | 0.8833 | 0.9252 | 0.9038 | 0.9800 | | 0.0852 | 20.0 | 1920 | 0.0565 | 0.8365 | 0.9355 | 0.8832 | 93 | 0.8690 | 0.8795 | 0.8743 | 166 | 0.9448 | 0.9648 | 0.9547 | 142 | 0.8873 | 0.9227 | 0.9046 | 0.9813 | | 0.0857 | 21.0 | 2016 | 0.0591 | 0.8286 | 0.9355 | 0.8788 | 93 | 0.8514 | 0.8976 | 0.8739 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.8818 | 0.9302 | 0.9053 | 0.9816 | | 0.0817 | 22.0 | 2112 | 0.0585 | 0.8286 | 0.9355 | 0.8788 | 93 | 0.8506 | 0.8916 | 0.8706 | 166 | 0.9448 | 0.9648 | 0.9547 | 142 | 0.8774 | 0.9277 | 0.9018 | 0.9808 | | 0.0792 | 23.0 | 2208 | 0.0544 | 0.8431 | 0.9247 | 0.8821 | 93 | 0.8675 | 0.8675 | 0.8675 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.8929 | 0.9152 | 0.9039 | 0.9811 | | 0.0788 | 24.0 | 2304 | 0.0548 | 0.8269 | 0.9247 | 0.8731 | 93 | 0.8675 | 0.8675 | 0.8675 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.8886 | 0.9152 | 0.9017 | 0.9811 | | 0.0772 | 25.0 | 2400 | 0.0541 | 0.8365 | 0.9355 | 0.8832 | 93 | 0.875 | 0.8855 | 0.8802 | 166 | 0.9514 | 0.9648 | 0.9580 | 142 | 0.8918 | 0.9252 | 0.9082 | 0.9816 | | 0.0755 | 26.0 | 2496 | 0.0507 | 0.8776 | 0.9247 | 0.9005 | 93 | 0.8772 | 0.9036 | 0.8902 | 166 | 0.9514 | 0.9648 | 0.9580 | 142 | 0.9031 | 0.9302 | 0.9165 | 0.9835 | | 0.0717 | 27.0 | 2592 | 0.0506 | 0.8687 | 0.9247 | 0.8958 | 93 | 0.8678 | 0.9096 | 0.8882 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.8990 | 0.9327 | 0.9155 | 0.9841 | | 0.0725 | 28.0 | 2688 | 0.0518 | 0.8350 | 0.9247 | 0.8776 | 93 | 0.8765 | 0.8976 | 0.8869 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.8988 | 0.9302 | 0.9142 | 0.9833 | | 0.0713 | 29.0 | 2784 | 0.0505 | 0.8431 | 0.9247 | 0.8821 | 93 | 0.8817 | 0.8976 | 0.8896 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.8986 | 0.9277 | 0.9129 | 0.9833 | | 0.0671 | 30.0 | 2880 | 0.0477 | 0.8687 | 0.9247 | 0.8958 | 93 | 0.8889 | 0.9157 | 0.9021 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9126 | 0.9377 | 0.9250 | 0.9846 | | 0.0666 | 31.0 | 2976 | 
0.0480 | 0.8350 | 0.9247 | 0.8776 | 93 | 0.8855 | 0.8855 | 0.8855 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9027 | 0.9252 | 0.9138 | 0.9838 | | 0.0638 | 32.0 | 3072 | 0.0482 | 0.8515 | 0.9247 | 0.8866 | 93 | 0.8922 | 0.8976 | 0.8949 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9098 | 0.9302 | 0.9199 | 0.9844 | | 0.0647 | 33.0 | 3168 | 0.0482 | 0.8350 | 0.9247 | 0.8776 | 93 | 0.8862 | 0.8916 | 0.8889 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9029 | 0.9277 | 0.9151 | 0.9835 | | 0.0642 | 34.0 | 3264 | 0.0486 | 0.8431 | 0.9247 | 0.8821 | 93 | 0.8779 | 0.9096 | 0.8935 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.8993 | 0.9352 | 0.9169 | 0.9833 | | 0.0603 | 35.0 | 3360 | 0.0463 | 0.8515 | 0.9247 | 0.8866 | 93 | 0.8929 | 0.9036 | 0.8982 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9100 | 0.9327 | 0.9212 | 0.9852 | | 0.0627 | 36.0 | 3456 | 0.0483 | 0.8350 | 0.9247 | 0.8776 | 93 | 0.8876 | 0.9036 | 0.8955 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9034 | 0.9327 | 0.9178 | 0.9846 | | 0.0606 | 37.0 | 3552 | 0.0461 | 0.8776 | 0.9247 | 0.9005 | 93 | 0.8902 | 0.9277 | 0.9086 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9153 | 0.9426 | 0.9287 | 0.9855 | | 0.0602 | 38.0 | 3648 | 0.0457 | 0.8958 | 0.9247 | 0.9101 | 93 | 0.8953 | 0.9277 | 0.9112 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9857 | | 0.058 | 39.0 | 3744 | 0.0452 | 0.8866 | 0.9247 | 0.9053 | 93 | 0.8902 | 0.9277 | 0.9086 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9175 | 0.9426 | 0.9299 | 0.9860 | | 0.0579 | 40.0 | 3840 | 0.0443 | 0.8958 | 0.9247 | 0.9101 | 93 | 0.9 | 0.9217 | 0.9107 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9863 | | 0.0551 | 41.0 | 3936 | 0.0439 | 0.8958 | 0.9247 | 0.9101 | 93 | 0.8960 | 0.9337 | 0.9145 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9221 | 0.9451 | 0.9335 | 0.9868 | | 0.0568 | 42.0 | 4032 | 0.0435 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.8941 | 0.9157 | 0.9048 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9173 | 0.9401 | 0.9286 | 0.9866 | | 0.0557 | 43.0 | 4128 | 0.0440 | 0.8969 | 0.9355 | 0.9158 | 93 | 0.9042 | 0.9096 | 0.9069 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9261 | 0.9377 | 0.9318 | 0.9860 | | 0.0582 | 44.0 | 4224 | 0.0446 | 0.8529 | 0.9355 | 0.8923 | 93 | 0.9024 | 0.8916 | 0.8970 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9142 | 0.9302 | 0.9221 | 0.9844 | | 0.0548 | 45.0 | 4320 | 0.0424 | 0.8878 | 0.9355 | 0.9110 | 93 | 0.9107 | 0.9217 | 0.9162 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9866 | | 0.0533 | 46.0 | 4416 | 0.0424 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.8988 | 0.9096 | 0.9042 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9193 | 0.9377 | 0.9284 | 0.9866 | | 0.0516 | 47.0 | 4512 | 0.0428 | 0.8687 | 0.9247 | 0.8958 | 93 | 0.8864 | 0.9398 | 0.9123 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9091 | 0.9476 | 0.9280 | 0.9860 | | 0.0501 | 48.0 | 4608 | 0.0430 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.9042 | 0.9096 | 0.9069 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9216 | 0.9377 | 0.9295 | 0.9863 | | 0.053 | 49.0 | 4704 | 0.0433 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.9053 | 0.9217 | 0.9134 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9866 | | 0.0483 | 50.0 | 4800 | 0.0416 | 0.9062 | 0.9355 | 0.9206 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9871 | | 0.0505 | 51.0 | 4896 | 0.0418 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9096 | 0.9096 | 0.9096 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9866 | | 0.05 | 52.0 | 4992 | 0.0403 | 0.9255 | 0.9355 | 0.9305 
| 93 | 0.8895 | 0.9217 | 0.9053 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9879 | | 0.0493 | 53.0 | 5088 | 0.0422 | 0.8969 | 0.9355 | 0.9158 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9263 | 0.9401 | 0.9332 | 0.9860 | | 0.0487 | 54.0 | 5184 | 0.0408 | 0.9158 | 0.9355 | 0.9255 | 93 | 0.9053 | 0.9217 | 0.9134 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9877 | | 0.0485 | 55.0 | 5280 | 0.0402 | 0.9158 | 0.9355 | 0.9255 | 93 | 0.9112 | 0.9277 | 0.9194 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9874 | | 0.0491 | 56.0 | 5376 | 0.0432 | 0.8878 | 0.9355 | 0.9110 | 93 | 0.8960 | 0.9337 | 0.9145 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9201 | 0.9476 | 0.9337 | 0.9863 | | 0.0495 | 57.0 | 5472 | 0.0409 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9866 | | 0.0495 | 58.0 | 5568 | 0.0425 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9866 | | 0.0462 | 59.0 | 5664 | 0.0412 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9287 | 0.9426 | 0.9356 | 0.9871 | | 0.048 | 60.0 | 5760 | 0.0409 | 0.9072 | 0.9462 | 0.9263 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9333 | 0.9426 | 0.9380 | 0.9868 | | 0.048 | 61.0 | 5856 | 0.0396 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9287 | 0.9426 | 0.9356 | 0.9879 | | 0.0461 | 62.0 | 5952 | 0.0403 | 0.8969 | 0.9355 | 0.9158 | 93 | 0.8935 | 0.9096 | 0.9015 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9216 | 0.9377 | 0.9295 | 0.9871 | | 0.0459 | 63.0 | 6048 | 0.0405 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9871 | | 0.0461 | 64.0 | 6144 | 0.0394 | 0.8969 | 0.9355 | 0.9158 | 93 | 0.8882 | 0.9096 | 0.8988 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9193 | 0.9377 | 0.9284 | 0.9874 | | 0.0431 | 65.0 | 6240 | 0.0408 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9284 | 0.9377 | 0.9330 | 0.9874 | | 0.0448 | 66.0 | 6336 | 0.0396 | 0.9072 | 0.9462 | 0.9263 | 93 | 0.9053 | 0.9217 | 0.9134 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9877 | | 0.044 | 67.0 | 6432 | 0.0403 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9284 | 0.9377 | 0.9330 | 0.9871 | | 0.0439 | 68.0 | 6528 | 0.0404 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9309 | 0.9401 | 0.9355 | 0.9874 | | 0.0451 | 69.0 | 6624 | 0.0416 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9868 | | 0.0429 | 70.0 | 6720 | 0.0403 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9053 | 0.9217 | 0.9134 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9221 | 0.9451 | 0.9335 | 0.9877 | | 0.0447 | 71.0 | 6816 | 0.0402 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9167 | 0.9277 | 0.9222 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9314 | 0.9476 | 0.9394 | 0.9877 | | 0.0437 | 72.0 | 6912 | 0.0398 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9871 | | 0.041 | 73.0 | 7008 | 0.0399 | 0.8878 | 0.9355 | 0.9110 | 93 | 0.9107 | 0.9217 | 0.9162 | 
166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9874 | | 0.0425 | 74.0 | 7104 | 0.0406 | 0.8969 | 0.9355 | 0.9158 | 93 | 0.9112 | 0.9277 | 0.9194 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9871 | | 0.0426 | 75.0 | 7200 | 0.0395 | 0.8878 | 0.9355 | 0.9110 | 93 | 0.9 | 0.9217 | 0.9107 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9871 | | 0.0398 | 76.0 | 7296 | 0.0402 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9212 | 0.9157 | 0.9184 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9874 | | 0.0407 | 77.0 | 7392 | 0.0392 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.9096 | 0.9096 | 0.9096 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9238 | 0.9377 | 0.9307 | 0.9874 | | 0.0411 | 78.0 | 7488 | 0.0394 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.8988 | 0.9096 | 0.9042 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9193 | 0.9377 | 0.9284 | 0.9868 | | 0.0417 | 79.0 | 7584 | 0.0395 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.9053 | 0.9217 | 0.9134 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9871 | | 0.0412 | 80.0 | 7680 | 0.0396 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.8895 | 0.9217 | 0.9053 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9153 | 0.9426 | 0.9287 | 0.9877 | | 0.0431 | 81.0 | 7776 | 0.0399 | 0.87 | 0.9355 | 0.9016 | 93 | 0.8941 | 0.9157 | 0.9048 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9150 | 0.9401 | 0.9274 | 0.9871 | | 0.042 | 82.0 | 7872 | 0.0401 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9871 | | 0.0412 | 83.0 | 7968 | 0.0403 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9871 | | 0.0413 | 84.0 | 8064 | 0.0409 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9871 | | 0.0405 | 85.0 | 8160 | 0.0397 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9107 | 0.9217 | 0.9162 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9221 | 0.9451 | 0.9335 | 0.9879 | | 0.0405 | 86.0 | 8256 | 0.0397 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9107 | 0.9217 | 0.9162 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9221 | 0.9451 | 0.9335 | 0.9879 | | 0.0401 | 87.0 | 8352 | 0.0398 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9197 | 0.9426 | 0.9310 | 0.9877 | | 0.041 | 88.0 | 8448 | 0.0398 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9107 | 0.9217 | 0.9162 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9221 | 0.9451 | 0.9335 | 0.9879 | | 0.0397 | 89.0 | 8544 | 0.0396 | 0.87 | 0.9355 | 0.9016 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9195 | 0.9401 | 0.9297 | 0.9877 | | 0.0398 | 90.0 | 8640 | 0.0396 | 0.87 | 0.9355 | 0.9016 | 93 | 0.9053 | 0.9217 | 0.9134 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9197 | 0.9426 | 0.9310 | 0.9879 | | 0.039 | 91.0 | 8736 | 0.0395 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9267 | 0.9451 | 0.9358 | 0.9877 | | 0.0385 | 92.0 | 8832 | 0.0398 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9871 | | 0.0385 | 93.0 | 8928 | 0.0398 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9874 | | 0.0398 | 94.0 | 9024 | 0.0397 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 
| 0.9426 | 0.9322 | 0.9871 | | 0.0382 | 95.0 | 9120 | 0.0396 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9874 | | 0.0408 | 96.0 | 9216 | 0.0394 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9874 | | 0.0372 | 97.0 | 9312 | 0.0395 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9871 | | 0.0392 | 98.0 | 9408 | 0.0395 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.8994 | 0.9157 | 0.9075 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9195 | 0.9401 | 0.9297 | 0.9871 | | 0.0393 | 99.0 | 9504 | 0.0395 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9874 | | 0.0399 | 100.0 | 9600 | 0.0395 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9874 | ### Framework versions - Transformers 4.39.3 - Pytorch 2.3.0+cu121 - Datasets 2.19.1 - Tokenizers 0.15.2
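The per-entity Location/Organization/Person precision, recall, and F1 columns above are standard entity-level metrics for BIO-tagged NER. They can be reproduced with `seqeval`; the tag sequences in this small sketch are made up purely for demonstration.

```python
# pip install seqeval
from seqeval.metrics import classification_report

# Toy gold/predicted BIO tag sequences; scores are computed per entity type
# (LOC, ORG, PER), matching the per-entity columns in the table above.
y_true = [["B-PER", "I-PER", "O", "B-LOC", "O", "B-ORG"]]
y_pred = [["B-PER", "I-PER", "O", "B-LOC", "O", "O"]]

print(classification_report(y_true, y_pred, digits=4))
```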
Trofish/korean_syllable_roberta_128
Trofish
2024-06-04T01:56:53Z
110
0
transformers
[ "transformers", "safetensors", "roberta", "fill-mask", "ko", "dataset:klue/klue", "arxiv:2105.09680", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
fill-mask
2024-05-08T03:10:37Z
---
license: apache-2.0
datasets:
- klue/klue
language:
- ko
metrics:
- f1
- accuracy
- pearsonr
---

# RoBERTa-base Korean

## Model Description
This RoBERTa model was pretrained at the **syllable** level on a variety of Korean text datasets, using a custom-built Korean syllable-level vocabulary.

## Architecture
- **Model type**: RoBERTa
- **Architecture**: RobertaForMaskedLM
- **Model size**: 128 hidden size, 8 hidden layers, 8 attention heads
- **max_position_embeddings**: 514
- **intermediate_size**: 2,048
- **vocab_size**: 1,428

## Training Data
The following datasets were used:
- **Modu Corpus (모두의말뭉치)**: chat, message boards, everyday conversation, news, broadcast scripts, books, etc.
- **AIHUB**: SNS, YouTube comments, book sentences
- **Other**: Namuwiki, Korean Wikipedia

The combined data amounts to **about 11GB** **(4B tokens)**.

## Training Details
- **BATCH_SIZE**: 112 (per GPU)
- **ACCUMULATE**: 36
- **Total_BATCH_SIZE**: 8,064
- **MAX_STEPS**: 12,500
- **TRAIN_STEPS * BATCH_SIZE**: **100M**
- **WARMUP_STEPS**: 2,400
- **Optimizer**: AdamW, LR 1e-3, BETA (0.9, 0.98), eps 1e-6
- **LR decay**: linear
- **Hardware**: 2x RTX 8000 GPUs

![image/png](https://cdn-uploads.huggingface.co/production/uploads/64a0fd6fd3149e05bc5260dd/TPSI6kksBLzcbloDCUgwc.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/64a0fd6fd3149e05bc5260dd/z3_zVWsGsyT7YD9Zr9aeK.png)

## Evaluation
- **Performance was evaluated on the KLUE benchmark test sets.**
- While its scores are lower than klue-roberta-base because of its much smaller size, the hidden-size-512 model showed strong performance relative to its size.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/64a0fd6fd3149e05bc5260dd/I8e60cf9w-IQCHDgKiooq.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/64a0fd6fd3149e05bc5260dd/hkc5ko9Vo-pkKmtouN7xc.png)

## Usage
### Because the tokenizer is syllable-level rather than WordPiece, you must use SyllableTokenizer instead of AutoTokenizer.
### (Fetch the syllabletokenizer.py provided in the repo and use it.)
```python
from transformers import AutoModelForMaskedLM
from syllabletokenizer import SyllableTokenizer

# Load the model and tokenizer
model = AutoModelForMaskedLM.from_pretrained("Trofish/korean_syllable_roberta")
tokenizer_kwargs = {}  # pass any extra tokenizer options here
tokenizer = SyllableTokenizer(vocab_file='vocab.json', **tokenizer_kwargs)

# Tokenize the text and run a prediction
inputs = tokenizer("여기에 한국어 텍스트 입력", return_tensors="pt")  # placeholder: any Korean text
outputs = model(**inputs)
```

## Citation
**klue**
```
@misc{park2021klue,
      title={KLUE: Korean Language Understanding Evaluation},
      author={Sungjoon Park and Jihyung Moon and Sungdong Kim and Won Ik Cho and Jiyoon Han and Jangwon Park and Chisung Song and Junseong Kim and Yongsook Song and Taehwan Oh and Joohong Lee and Juhyun Oh and Sungwon Lyu and Younghoon Jeong and Inkwon Lee and Sangwoo Seo and Dongjun Lee and Hyunwoo Kim and Myeonghwa Lee and Seongbo Jang and Seungwon Do and Sunkyoung Kim and Kyungtae Lim and Jongwon Lee and Kyumin Park and Jamin Shin and Seonghyun Kim and Lucy Park and Alice Oh and Jungwoo Ha and Kyunghyun Cho},
      year={2021},
      eprint={2105.09680},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```

apwic/nerui-lora-r16-1
apwic
2024-06-04T01:54:08Z
0
0
null
[ "tensorboard", "generated_from_trainer", "id", "base_model:indolem/indobert-base-uncased", "base_model:finetune:indolem/indobert-base-uncased", "license:mit", "region:us" ]
null
2024-05-28T13:06:00Z
--- language: - id license: mit base_model: indolem/indobert-base-uncased tags: - generated_from_trainer model-index: - name: nerui-lora-r16-1 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # nerui-lora-r16-1 This model is a fine-tuned version of [indolem/indobert-base-uncased](https://huggingface.co/indolem/indobert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0342 - Location Precision: 0.9316 - Location Recall: 0.9397 - Location F1: 0.9356 - Location Number: 116 - Organization Precision: 0.9484 - Organization Recall: 0.9304 - Organization F1: 0.9393 - Organization Number: 158 - Person Precision: 0.984 - Person Recall: 0.9919 - Person F1: 0.9880 - Person Number: 124 - Overall Precision: 0.9547 - Overall Recall: 0.9523 - Overall F1: 0.9535 - Overall Accuracy: 0.9896 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 100.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy | |:-------------:|:-----:|:----:|:---------------:|:------------------:|:---------------:|:-----------:|:---------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:----------------:|:-------------:|:---------:|:-------------:|:-----------------:|:--------------:|:----------:|:----------------:| | 1.0545 | 1.0 | 96 | 0.6622 | 0.0 | 0.0 | 0.0 | 116 | 0.0 | 0.0 | 0.0 | 158 | 0.0 | 0.0 | 0.0 | 124 | 0.0 | 0.0 | 0.0 | 0.8394 | | 0.64 | 2.0 | 192 | 0.5206 | 0.0 | 0.0 | 0.0 | 116 | 0.5 | 0.0127 | 0.0247 | 158 | 0.0 | 0.0 | 0.0 | 124 | 0.3333 | 0.0050 | 0.0099 | 0.8400 | | 0.503 | 3.0 | 288 | 0.3728 | 0.0833 | 0.0086 | 0.0156 | 116 | 0.3625 | 0.1835 | 0.2437 | 158 | 0.36 | 0.2903 | 0.3214 | 124 | 0.3438 | 0.1658 | 0.2237 | 0.8718 | | 0.3537 | 4.0 | 384 | 0.2518 | 0.3947 | 0.2586 | 0.3125 | 116 | 0.4885 | 0.5380 | 0.5120 | 158 | 0.5521 | 0.7258 | 0.6272 | 124 | 0.4964 | 0.5151 | 0.5055 | 0.9198 | | 0.2513 | 5.0 | 480 | 0.1812 | 0.6111 | 0.5690 | 0.5893 | 116 | 0.5979 | 0.7342 | 0.6591 | 158 | 0.8028 | 0.9194 | 0.8571 | 124 | 0.6667 | 0.7437 | 0.7031 | 0.9498 | | 0.1948 | 6.0 | 576 | 0.1359 | 0.7438 | 0.7759 | 0.7595 | 116 | 0.7368 | 0.7975 | 0.7660 | 158 | 0.8905 | 0.9839 | 0.9349 | 124 | 0.7879 | 0.8492 | 0.8174 | 0.9657 | | 0.1623 | 7.0 | 672 | 0.1109 | 0.7917 | 0.8190 | 0.8051 | 116 | 0.7619 | 0.8101 | 0.7853 | 158 | 0.9104 | 0.9839 | 0.9457 | 124 | 0.8175 | 0.8668 | 0.8415 | 0.9701 | | 0.1397 | 8.0 | 768 | 0.0954 | 0.8083 | 0.8362 | 0.8220 | 116 | 0.7976 | 0.8481 | 0.8221 | 158 | 0.9389 | 0.9919 | 0.9647 | 124 | 0.8449 | 0.8894 | 0.8666 | 0.9739 | | 0.1266 | 9.0 | 864 | 0.0877 | 0.8189 | 0.8966 | 0.8560 | 116 | 0.8155 | 0.8671 | 0.8405 | 158 | 0.9318 | 0.9919 | 0.9609 | 124 | 
0.8525 | 0.9146 | 0.8824 | 0.9761 | | 0.1157 | 10.0 | 960 | 0.0731 | 0.8607 | 0.9052 | 0.8824 | 116 | 0.8519 | 0.8734 | 0.8625 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.8883 | 0.9196 | 0.9037 | 0.9800 | | 0.1111 | 11.0 | 1056 | 0.0673 | 0.8760 | 0.9138 | 0.8945 | 116 | 0.8606 | 0.8987 | 0.8793 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.8983 | 0.9322 | 0.9149 | 0.9813 | | 0.1044 | 12.0 | 1152 | 0.0635 | 0.8760 | 0.9138 | 0.8945 | 116 | 0.8554 | 0.8987 | 0.8765 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.8961 | 0.9322 | 0.9138 | 0.9811 | | 0.098 | 13.0 | 1248 | 0.0578 | 0.8898 | 0.9052 | 0.8974 | 116 | 0.8589 | 0.8861 | 0.8723 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9042 | 0.9246 | 0.9143 | 0.9816 | | 0.0939 | 14.0 | 1344 | 0.0559 | 0.875 | 0.9052 | 0.8898 | 116 | 0.8642 | 0.8861 | 0.8750 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9020 | 0.9246 | 0.9132 | 0.9819 | | 0.091 | 15.0 | 1440 | 0.0558 | 0.8824 | 0.9052 | 0.8936 | 116 | 0.8402 | 0.8987 | 0.8685 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.8916 | 0.9296 | 0.9102 | 0.9816 | | 0.088 | 16.0 | 1536 | 0.0555 | 0.875 | 0.9052 | 0.8898 | 116 | 0.8452 | 0.8987 | 0.8712 | 158 | 0.9535 | 0.9919 | 0.9723 | 124 | 0.8873 | 0.9296 | 0.9080 | 0.9811 | | 0.0857 | 17.0 | 1632 | 0.0523 | 0.8824 | 0.9052 | 0.8936 | 116 | 0.8868 | 0.8924 | 0.8896 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9156 | 0.9271 | 0.9213 | 0.9846 | | 0.0809 | 18.0 | 1728 | 0.0498 | 0.8678 | 0.9052 | 0.8861 | 116 | 0.8659 | 0.8987 | 0.8820 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9024 | 0.9296 | 0.9158 | 0.9833 | | 0.0773 | 19.0 | 1824 | 0.0482 | 0.8898 | 0.9052 | 0.8974 | 116 | 0.8827 | 0.9051 | 0.8938 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9160 | 0.9322 | 0.9240 | 0.9844 | | 0.0765 | 20.0 | 1920 | 0.0521 | 0.8833 | 0.9138 | 0.8983 | 116 | 0.8571 | 0.9114 | 0.8834 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.8988 | 0.9372 | 0.9176 | 0.9822 | | 0.0754 | 21.0 | 2016 | 0.0484 | 0.875 | 0.9052 | 0.8898 | 116 | 0.8735 | 0.9177 | 0.8951 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9075 | 0.9372 | 0.9221 | 0.9841 | | 0.072 | 22.0 | 2112 | 0.0469 | 0.875 | 0.9052 | 0.8898 | 116 | 0.8606 | 0.8987 | 0.8793 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9024 | 0.9296 | 0.9158 | 0.9835 | | 0.0689 | 23.0 | 2208 | 0.0440 | 0.8898 | 0.9052 | 0.8974 | 116 | 0.8944 | 0.9114 | 0.9028 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9208 | 0.9347 | 0.9277 | 0.9844 | | 0.0697 | 24.0 | 2304 | 0.0456 | 0.8974 | 0.9052 | 0.9013 | 116 | 0.8968 | 0.8797 | 0.8882 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9244 | 0.9221 | 0.9233 | 0.9846 | | 0.0656 | 25.0 | 2400 | 0.0436 | 0.8983 | 0.9138 | 0.9060 | 116 | 0.8812 | 0.8924 | 0.8868 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9181 | 0.9296 | 0.9238 | 0.9846 | | 0.0658 | 26.0 | 2496 | 0.0427 | 0.8974 | 0.9052 | 0.9013 | 116 | 0.8704 | 0.8924 | 0.8812 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9134 | 0.9271 | 0.9202 | 0.9841 | | 0.065 | 27.0 | 2592 | 0.0421 | 0.9052 | 0.9052 | 0.9052 | 116 | 0.8834 | 0.9114 | 0.8972 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9208 | 0.9347 | 0.9277 | 0.9855 | | 0.0613 | 28.0 | 2688 | 0.0418 | 0.8833 | 0.9138 | 0.8983 | 116 | 0.8882 | 0.9051 | 0.8966 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9163 | 0.9347 | 0.9254 | 0.9855 | | 0.0591 | 29.0 | 2784 | 0.0398 | 0.9060 | 0.9138 | 0.9099 | 116 | 0.8882 | 0.9051 | 0.8966 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9231 | 0.9347 | 0.9288 | 0.9874 | | 0.06 | 30.0 | 2880 | 0.0395 | 0.9060 | 0.9138 | 0.9099 | 116 | 0.8994 | 0.9051 | 0.9022 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9277 | 0.9347 | 0.9312 | 
0.9865 | | 0.0566 | 31.0 | 2976 | 0.0386 | 0.8983 | 0.9138 | 0.9060 | 116 | 0.8827 | 0.9051 | 0.8938 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9185 | 0.9347 | 0.9265 | 0.9863 | | 0.0566 | 32.0 | 3072 | 0.0392 | 0.8889 | 0.8966 | 0.8927 | 116 | 0.9045 | 0.8987 | 0.9016 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9248 | 0.9271 | 0.9260 | 0.9857 | | 0.0566 | 33.0 | 3168 | 0.0398 | 0.8992 | 0.9224 | 0.9106 | 116 | 0.9045 | 0.8987 | 0.9016 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9277 | 0.9347 | 0.9312 | 0.9865 | | 0.0568 | 34.0 | 3264 | 0.0396 | 0.9224 | 0.9224 | 0.9224 | 116 | 0.8951 | 0.9177 | 0.9062 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9305 | 0.9422 | 0.9363 | 0.9871 | | 0.0532 | 35.0 | 3360 | 0.0379 | 0.8983 | 0.9138 | 0.9060 | 116 | 0.9051 | 0.9051 | 0.9051 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9277 | 0.9347 | 0.9312 | 0.9871 | | 0.052 | 36.0 | 3456 | 0.0403 | 0.9231 | 0.9310 | 0.9270 | 116 | 0.9012 | 0.9241 | 0.9125 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9332 | 0.9472 | 0.9401 | 0.9879 | | 0.0516 | 37.0 | 3552 | 0.0386 | 0.8983 | 0.9138 | 0.9060 | 116 | 0.9 | 0.9114 | 0.9057 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9256 | 0.9372 | 0.9313 | 0.9874 | | 0.0497 | 38.0 | 3648 | 0.0378 | 0.8992 | 0.9224 | 0.9106 | 116 | 0.8994 | 0.9051 | 0.9022 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9256 | 0.9372 | 0.9313 | 0.9879 | | 0.052 | 39.0 | 3744 | 0.0366 | 0.9138 | 0.9138 | 0.9138 | 116 | 0.9006 | 0.9177 | 0.9091 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9303 | 0.9397 | 0.9350 | 0.9885 | | 0.0472 | 40.0 | 3840 | 0.0367 | 0.9138 | 0.9138 | 0.9138 | 116 | 0.8987 | 0.8987 | 0.8987 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9298 | 0.9322 | 0.9310 | 0.9868 | | 0.0486 | 41.0 | 3936 | 0.0388 | 0.9076 | 0.9310 | 0.9191 | 116 | 0.9074 | 0.9304 | 0.9187 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9310 | 0.9497 | 0.9403 | 0.9882 | | 0.047 | 42.0 | 4032 | 0.0375 | 0.9068 | 0.9224 | 0.9145 | 116 | 0.9161 | 0.8987 | 0.9073 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9347 | 0.9347 | 0.9347 | 0.9874 | | 0.0481 | 43.0 | 4128 | 0.0380 | 0.8983 | 0.9138 | 0.9060 | 116 | 0.9051 | 0.9051 | 0.9051 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9277 | 0.9347 | 0.9312 | 0.9860 | | 0.0468 | 44.0 | 4224 | 0.0391 | 0.9231 | 0.9310 | 0.9270 | 116 | 0.9062 | 0.9177 | 0.9119 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9353 | 0.9447 | 0.94 | 0.9876 | | 0.0473 | 45.0 | 4320 | 0.0366 | 0.8992 | 0.9224 | 0.9106 | 116 | 0.9045 | 0.8987 | 0.9016 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9277 | 0.9347 | 0.9312 | 0.9868 | | 0.0441 | 46.0 | 4416 | 0.0372 | 0.9 | 0.9310 | 0.9153 | 116 | 0.9006 | 0.9177 | 0.9091 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9261 | 0.9447 | 0.9353 | 0.9887 | | 0.0441 | 47.0 | 4512 | 0.0375 | 0.9224 | 0.9224 | 0.9224 | 116 | 0.9068 | 0.9241 | 0.9154 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9353 | 0.9447 | 0.94 | 0.9887 | | 0.0416 | 48.0 | 4608 | 0.0359 | 0.9237 | 0.9397 | 0.9316 | 116 | 0.9363 | 0.9304 | 0.9333 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9475 | 0.9523 | 0.9499 | 0.9898 | | 0.0446 | 49.0 | 4704 | 0.0355 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.8931 | 0.8987 | 0.8959 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9279 | 0.9372 | 0.9325 | 0.9876 | | 0.0425 | 50.0 | 4800 | 0.0366 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9 | 0.9114 | 0.9057 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9307 | 0.9447 | 0.9377 | 0.9887 | | 0.0422 | 51.0 | 4896 | 0.0364 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9167 | 0.9051 | 0.9108 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9373 | 0.9397 | 0.9385 | 0.9871 | | 0.0409 | 52.0 | 4992 | 0.0357 | 
0.9145 | 0.9224 | 0.9185 | 116 | 0.9074 | 0.9304 | 0.9187 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9332 | 0.9472 | 0.9401 | 0.9896 | | 0.0414 | 53.0 | 5088 | 0.0359 | 0.9231 | 0.9310 | 0.9270 | 116 | 0.9136 | 0.9367 | 0.9250 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9381 | 0.9523 | 0.9451 | 0.9901 | | 0.0403 | 54.0 | 5184 | 0.0353 | 0.9231 | 0.9310 | 0.9270 | 116 | 0.8963 | 0.9304 | 0.9130 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9310 | 0.9497 | 0.9403 | 0.9896 | | 0.0393 | 55.0 | 5280 | 0.0352 | 0.9145 | 0.9224 | 0.9185 | 116 | 0.9136 | 0.9367 | 0.9250 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9356 | 0.9497 | 0.9426 | 0.9898 | | 0.0405 | 56.0 | 5376 | 0.0359 | 0.9237 | 0.9397 | 0.9316 | 116 | 0.9430 | 0.9430 | 0.9430 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9501 | 0.9573 | 0.9537 | 0.9901 | | 0.0404 | 57.0 | 5472 | 0.0370 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9371 | 0.9430 | 0.9401 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9454 | 0.9573 | 0.9513 | 0.9896 | | 0.0398 | 58.0 | 5568 | 0.0355 | 0.9316 | 0.9397 | 0.9356 | 116 | 0.9308 | 0.9367 | 0.9338 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9476 | 0.9548 | 0.9512 | 0.9904 | | 0.0382 | 59.0 | 5664 | 0.0355 | 0.9397 | 0.9397 | 0.9397 | 116 | 0.9551 | 0.9430 | 0.9490 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9597 | 0.9573 | 0.9585 | 0.9904 | | 0.0396 | 60.0 | 5760 | 0.0344 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9125 | 0.9241 | 0.9182 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9356 | 0.9497 | 0.9426 | 0.9893 | | 0.0362 | 61.0 | 5856 | 0.0356 | 0.9231 | 0.9310 | 0.9270 | 116 | 0.9226 | 0.9051 | 0.9137 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9421 | 0.9397 | 0.9409 | 0.9879 | | 0.037 | 62.0 | 5952 | 0.0360 | 0.9237 | 0.9397 | 0.9316 | 116 | 0.9167 | 0.9051 | 0.9108 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9398 | 0.9422 | 0.9410 | 0.9882 | | 0.0386 | 63.0 | 6048 | 0.0364 | 0.9310 | 0.9310 | 0.9310 | 116 | 0.9367 | 0.9367 | 0.9367 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9499 | 0.9523 | 0.9511 | 0.9896 | | 0.0365 | 64.0 | 6144 | 0.0360 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9412 | 0.9114 | 0.9260 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9470 | 0.9422 | 0.9446 | 0.9887 | | 0.0347 | 65.0 | 6240 | 0.0354 | 0.9237 | 0.9397 | 0.9316 | 116 | 0.9416 | 0.9177 | 0.9295 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9496 | 0.9472 | 0.9484 | 0.9887 | | 0.0393 | 66.0 | 6336 | 0.0366 | 0.9397 | 0.9397 | 0.9397 | 116 | 0.9355 | 0.9177 | 0.9265 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9520 | 0.9472 | 0.9496 | 0.9887 | | 0.0359 | 67.0 | 6432 | 0.0348 | 0.9316 | 0.9397 | 0.9356 | 116 | 0.9241 | 0.9241 | 0.9241 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.945 | 0.9497 | 0.9474 | 0.9893 | | 0.0331 | 68.0 | 6528 | 0.0347 | 0.9316 | 0.9397 | 0.9356 | 116 | 0.9177 | 0.9177 | 0.9177 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9425 | 0.9472 | 0.9449 | 0.9890 | | 0.0344 | 69.0 | 6624 | 0.0341 | 0.9391 | 0.9310 | 0.9351 | 116 | 0.9363 | 0.9304 | 0.9333 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9521 | 0.9497 | 0.9509 | 0.9898 | | 0.0349 | 70.0 | 6720 | 0.0345 | 0.9397 | 0.9397 | 0.9397 | 116 | 0.9427 | 0.9367 | 0.9397 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9548 | 0.9548 | 0.9548 | 0.9901 | | 0.0349 | 71.0 | 6816 | 0.0354 | 0.9310 | 0.9310 | 0.9310 | 116 | 0.9299 | 0.9241 | 0.9270 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9472 | 0.9472 | 0.9472 | 0.9885 | | 0.0342 | 72.0 | 6912 | 0.0343 | 0.9237 | 0.9397 | 0.9316 | 116 | 0.9299 | 0.9241 | 0.9270 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.945 | 0.9497 | 0.9474 | 0.9887 | | 0.0333 | 73.0 | 7008 | 0.0354 | 0.9391 | 0.9310 | 0.9351 | 
116 | 0.9241 | 0.9241 | 0.9241 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9472 | 0.9472 | 0.9472 | 0.9890 |
| 0.0332 | 74.0 | 7104 | 0.0346 | 0.9231 | 0.9310 | 0.9270 | 116 | 0.9241 | 0.9241 | 0.9241 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9425 | 0.9472 | 0.9449 | 0.9893 |
| 0.0346 | 75.0 | 7200 | 0.0342 | 0.9310 | 0.9310 | 0.9310 | 116 | 0.9245 | 0.9304 | 0.9274 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.945 | 0.9497 | 0.9474 | 0.9896 |
| 0.0334 | 76.0 | 7296 | 0.0346 | 0.9224 | 0.9224 | 0.9224 | 116 | 0.925 | 0.9367 | 0.9308 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9426 | 0.9497 | 0.9462 | 0.9904 |
| 0.034 | 77.0 | 7392 | 0.0350 | 0.9397 | 0.9397 | 0.9397 | 116 | 0.9299 | 0.9241 | 0.9270 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9497 | 0.9497 | 0.9497 | 0.9896 |
| 0.0341 | 78.0 | 7488 | 0.0340 | 0.9316 | 0.9397 | 0.9356 | 116 | 0.9363 | 0.9304 | 0.9333 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9499 | 0.9523 | 0.9511 | 0.9904 |
| 0.033 | 79.0 | 7584 | 0.0348 | 0.9304 | 0.9224 | 0.9264 | 116 | 0.925 | 0.9367 | 0.9308 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.945 | 0.9497 | 0.9474 | 0.9896 |
| 0.0308 | 80.0 | 7680 | 0.0337 | 0.9138 | 0.9138 | 0.9138 | 116 | 0.9193 | 0.9367 | 0.9279 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9378 | 0.9472 | 0.9425 | 0.9898 |
| 0.031 | 81.0 | 7776 | 0.0341 | 0.9224 | 0.9224 | 0.9224 | 116 | 0.9193 | 0.9367 | 0.9279 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9403 | 0.9497 | 0.9450 | 0.9901 |
| 0.0315 | 82.0 | 7872 | 0.0340 | 0.9237 | 0.9397 | 0.9316 | 116 | 0.9363 | 0.9304 | 0.9333 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9475 | 0.9523 | 0.9499 | 0.9904 |
| 0.0321 | 83.0 | 7968 | 0.0343 | 0.9391 | 0.9310 | 0.9351 | 116 | 0.9367 | 0.9367 | 0.9367 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9523 | 0.9523 | 0.9523 | 0.9901 |
| 0.0317 | 84.0 | 8064 | 0.0340 | 0.9391 | 0.9310 | 0.9351 | 116 | 0.9367 | 0.9367 | 0.9367 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9523 | 0.9523 | 0.9523 | 0.9901 |
| 0.0324 | 85.0 | 8160 | 0.0340 | 0.9145 | 0.9224 | 0.9185 | 116 | 0.9187 | 0.9304 | 0.9245 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9378 | 0.9472 | 0.9425 | 0.9893 |
| 0.0317 | 86.0 | 8256 | 0.0339 | 0.9316 | 0.9397 | 0.9356 | 116 | 0.9423 | 0.9304 | 0.9363 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9523 | 0.9523 | 0.9523 | 0.9901 |
| 0.0308 | 87.0 | 8352 | 0.0347 | 0.9316 | 0.9397 | 0.9356 | 116 | 0.9423 | 0.9304 | 0.9363 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9523 | 0.9523 | 0.9523 | 0.9898 |
| 0.0311 | 88.0 | 8448 | 0.0344 | 0.9391 | 0.9310 | 0.9351 | 116 | 0.9367 | 0.9367 | 0.9367 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9523 | 0.9523 | 0.9523 | 0.9898 |
| 0.0295 | 89.0 | 8544 | 0.0346 | 0.9391 | 0.9310 | 0.9351 | 116 | 0.9427 | 0.9367 | 0.9397 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9547 | 0.9523 | 0.9535 | 0.9896 |
| 0.0304 | 90.0 | 8640 | 0.0343 | 0.9391 | 0.9310 | 0.9351 | 116 | 0.9427 | 0.9367 | 0.9397 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9547 | 0.9523 | 0.9535 | 0.9896 |
| 0.0315 | 91.0 | 8736 | 0.0343 | 0.9391 | 0.9310 | 0.9351 | 116 | 0.9427 | 0.9367 | 0.9397 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9547 | 0.9523 | 0.9535 | 0.9896 |
| 0.0314 | 92.0 | 8832 | 0.0342 | 0.9391 | 0.9310 | 0.9351 | 116 | 0.9427 | 0.9367 | 0.9397 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9547 | 0.9523 | 0.9535 | 0.9896 |
| 0.0322 | 93.0 | 8928 | 0.0340 | 0.9391 | 0.9310 | 0.9351 | 116 | 0.9427 | 0.9367 | 0.9397 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9547 | 0.9523 | 0.9535 | 0.9898 |
| 0.0303 | 94.0 | 9024 | 0.0343 | 0.9391 | 0.9310 | 0.9351 | 116 | 0.9367 | 0.9367 | 0.9367 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9523 | 0.9523 | 0.9523 | 0.9898 |
| 0.0316 | 95.0 | 9120 | 0.0343 | 0.9391 | 0.9310 | 0.9351 | 116 | 0.9367 | 0.9367 | 0.9367 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9523 | 0.9523 | 0.9523 | 0.9898 |
| 0.0317 | 96.0 | 9216 | 0.0342 | 0.9391 | 0.9310 | 0.9351 | 116 | 0.9427 | 0.9367 | 0.9397 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9547 | 0.9523 | 0.9535 | 0.9896 |
| 0.0321 | 97.0 | 9312 | 0.0341 | 0.9316 | 0.9397 | 0.9356 | 116 | 0.9484 | 0.9304 | 0.9393 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9547 | 0.9523 | 0.9535 | 0.9898 |
| 0.0295 | 98.0 | 9408 | 0.0342 | 0.9316 | 0.9397 | 0.9356 | 116 | 0.9484 | 0.9304 | 0.9393 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9547 | 0.9523 | 0.9535 | 0.9898 |
| 0.031 | 99.0 | 9504 | 0.0341 | 0.9316 | 0.9397 | 0.9356 | 116 | 0.9484 | 0.9304 | 0.9393 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9547 | 0.9523 | 0.9535 | 0.9898 |
| 0.0299 | 100.0 | 9600 | 0.0342 | 0.9316 | 0.9397 | 0.9356 | 116 | 0.9484 | 0.9304 | 0.9393 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9547 | 0.9523 | 0.9535 | 0.9896 |


### Framework versions

- Transformers 4.39.3
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.15.2
vaiv/GeM2-Llamion-14B-Base
vaiv
2024-06-04T01:49:19Z
3,505
6
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-05-13T08:42:16Z
---
license: apache-2.0
---

# **GeM2-Llamion-14B**

We have released **Llamion** as **GeM 2.0**, the second series of generative models developed by VAIV Company to address our principal business needs.

**Llamion** (Llamafied Orion) is derived from transforming the [Orion model](https://huggingface.co/OrionStarAI/Orion-14B-Base) into [the standard LLaMA architecture](https://github.com/huggingface/transformers/blob/main/src/transformers/models/llama/modeling_llama.py) through parameter mapping and offline knowledge transfer. Further technical specifications and study results will be detailed in our upcoming paper, available on this page.

<!-- Note that this model has NOT been contaminated to artificially inflate its scores for the Open LLM Leaderboards, unlike some recent models which have been intentionally tainted. -->

![vaiv_png](./vaiv.png)

### Contributors

- VAIV Company AI Lab ([vaiv.kr](https://www.vaiv.kr/))
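A minimal usage sketch, assuming the weights load through the standard `transformers` causal-LM API (which the llamafied architecture is meant to enable); the prompt is illustrative only.

```python
# Minimal sketch, assuming standard transformers text-generation APIs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "vaiv/GeM2-Llamion-14B-Base"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

inputs = tokenizer("Llamion is", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Because the weights are in the standard LLaMA layout, no `trust_remote_code` flag should be needed.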
awilliamson/qbank
awilliamson
2024-06-04T01:48:00Z
1
0
peft
[ "peft", "safetensors", "llama", "generated_from_trainer", "base_model:meta-llama/Meta-Llama-3-70B", "base_model:adapter:meta-llama/Meta-Llama-3-70B", "license:llama3", "4-bit", "bitsandbytes", "region:us" ]
null
2024-06-04T01:43:00Z
---
license: llama3
library_name: peft
tags:
- generated_from_trainer
base_model: meta-llama/Meta-Llama-3-70B
model-index:
- name: output/llama3-70b
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
<details><summary>See axolotl config</summary>

axolotl version: `0.4.1`
```yaml
base_model: meta-llama/Meta-Llama-3-70B
model_type: LlamaForCausalLM
tokenizer_type: AutoTokenizer

load_in_8bit: false
load_in_4bit: true
strict: false

datasets:
  - path: awilliamson/qbank_conversations
    type: chat_template
    chat_template: llama3
    field_messages: conversations
    message_field_role: from
    message_field_content: value
    roles:
      system:
        - system
      user:
        - user
      assistant:
        - assistant
chat_template: llama3

adapter: qlora
lora_r: 32
lora_alpha: 16
lora_modules_to_save: [embed_tokens, lm_head]
lora_dropout: 0.05
lora_target_linear: true

dataset_prepared_path: last_run_prepared
val_set_size: 0.05
output_dir: ./output/llama3-70b

sequence_len: 4096
sample_packing: false
pad_to_sequence_len: true

wandb_project: llama-70b
wandb_watch:
wandb_run_id:
wandb_log_model:

gradient_accumulation_steps: 4
micro_batch_size: 1
num_epochs: 3
optimizer: adamw_torch
lr_scheduler: cosine
learning_rate: 1e-5

train_on_inputs: false
group_by_length: false
bf16: auto
fp16:
tf32: false

gradient_checkpointing: true
gradient_checkpointing_kwargs:
  use_reentrant: false
early_stopping_patience:
resume_from_checkpoint:
logging_steps: 1
xformers_attention:
flash_attention: true

warmup_steps: 15
evals_per_epoch: 5
eval_table_size:
saves_per_epoch: 1
save_total_limit: 10
save_steps:
debug:
weight_decay: 0.00
fsdp:
  - full_shard
  - auto_wrap
fsdp_config:
  fsdp_limit_all_gathers: true
  fsdp_sync_module_states: true
  fsdp_offload_params: true
  fsdp_use_orig_params: false
  fsdp_cpu_ram_efficient_loading: true
  fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP
  fsdp_transformer_layer_cls_to_wrap: LlamaDecoderLayer
  fsdp_state_dict_type: FULL_STATE_DICT
  fsdp_sharding_strategy: FULL_SHARD
special_tokens:
  pad_token: "<|end_of_text|>"
```

</details><br>

# output/llama3-70b

This model is a fine-tuned version of [meta-llama/Meta-Llama-3-70B](https://huggingface.co/meta-llama/Meta-Llama-3-70B) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3901

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- total_eval_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 15
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 2.3783 | 0.0388 | 1 | 2.8294 |
| 1.2438 | 0.1942 | 5 | 1.4718 |
| 1.1973 | 0.3883 | 10 | 1.4697 |
| 1.0995 | 0.5825 | 15 | 1.4572 |
| 1.181 | 0.7767 | 20 | 1.4470 |
| 1.1298 | 0.9709 | 25 | 1.4350 |
| 0.9058 | 1.1650 | 30 | 1.4232 |
| 0.8712 | 1.3592 | 35 | 1.4126 |
| 0.8735 | 1.5534 | 40 | 1.4051 |
| 0.8975 | 1.7476 | 45 | 1.4024 |
| 0.929 | 1.9417 | 50 | 1.3951 |
| 0.9181 | 2.1359 | 55 | 1.3923 |
| 0.9171 | 2.3301 | 60 | 1.3917 |
| 0.9111 | 2.5243 | 65 | 1.3907 |
| 0.9676 | 2.7184 | 70 | 1.3904 |
| 0.8497 | 2.9126 | 75 | 1.3901 |

### Framework versions

- PEFT 0.11.1
- Transformers 4.41.1
- Pytorch 2.1.2+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
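The card does not include an inference example; the sketch below assumes the repo holds only the QLoRA adapter (as `library_name: peft` suggests) and that it is applied on top of the 4-bit base model.

```python
# Hedged sketch: adapter repo layout and 4-bit settings are assumptions, not confirmed by the card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

base_id = "meta-llama/Meta-Llama-3-70B"
adapter_id = "awilliamson/qbank"  # this repo

bnb = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.bfloat16)
tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, quantization_config=bnb, device_map="auto")
model = PeftModel.from_pretrained(base, adapter_id)  # attach the LoRA weights
```

Prompts should follow the llama3 chat template configured in the axolotl config above.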
hdve/Qwen-Qwen1.5-0.5B-1717465528
hdve
2024-06-04T01:46:32Z
139
0
transformers
[ "transformers", "safetensors", "qwen2", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-06-04T01:46:00Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
Carlosslocar/test5
Carlosslocar
2024-06-04T01:38:57Z
5
0
transformers
[ "transformers", "safetensors", "gemma", "text-classification", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-classification
2024-06-04T01:31:48Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
ovieyra21/epicpohogasm
ovieyra21
2024-06-04T01:36:08Z
3
0
diffusers
[ "diffusers", "text-to-image", "stable-diffusion", "lora", "template:sd-lora", "base_model:runwayml/stable-diffusion-v1-5", "base_model:adapter:runwayml/stable-diffusion-v1-5", "license:mit", "region:us" ]
text-to-image
2024-05-12T23:12:05Z
---
tags:
- text-to-image
- stable-diffusion
- lora
- diffusers
- template:sd-lora
widget:
- text: mabama
  output:
    url: images/187cd479-4326-4394-82c9-59cd103cd582.jpeg
base_model: runwayml/stable-diffusion-v1-5
instance_prompt: mabama
license: mit
---

# epiclazygasm_.safetensors

<Gallery />

## Trigger words

You should use `mabama` to trigger the image generation.

## Download model

Weights for this model are available in Safetensors format.

[Download](/ovieyra21/epicpohogasm/tree/main) them in the Files & versions tab.
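A hedged loading sketch (not from the original card); the `weight_name` is taken from the card's title and may not match the actual file in the repo.

```python
# Sketch only: the LoRA file name is an assumption based on the card title.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("ovieyra21/epicpohogasm", weight_name="epiclazygasm_.safetensors")

image = pipe("mabama, portrait photo, soft light").images[0]  # `mabama` is the trigger word
image.save("mabama.png")
```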
Charixfox/Llama-3-70b-Uncensored-Lumi-Tess-gradient-AWQ-4bit
Charixfox
2024-06-04T01:35:15Z
21
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "license:other", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "4-bit", "awq", "region:us" ]
text-generation
2024-06-04T00:25:26Z
---
license: other
license_name: llama3
license_link: https://llama.meta.com/llama3/license/
---
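The card contains only license metadata. Since the tags mark this as a 4-bit AWQ export, a minimal loading sketch, assuming `transformers`' built-in AWQ support (the `autoawq` package must be installed; the prompt is illustrative):

```python
# Sketch assuming the AWQ quantization config is read from the repo by transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Charixfox/Llama-3-70b-Uncensored-Lumi-Tess-gradient-AWQ-4bit"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=32)[0]))
```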
Carlosslocar/test4
Carlosslocar
2024-06-04T01:30:15Z
5
0
transformers
[ "transformers", "safetensors", "gemma", "text-classification", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-classification
2024-06-03T15:31:56Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
gitgato/dog-lora
gitgato
2024-06-04T01:30:02Z
1
0
diffusers
[ "diffusers", "text-to-image", "stable-diffusion", "lora", "template:sd-lora", "base_model:runwayml/stable-diffusion-v1-5", "base_model:adapter:runwayml/stable-diffusion-v1-5", "license:creativeml-openrail-m", "region:us" ]
text-to-image
2024-06-04T01:23:54Z
---
tags:
- text-to-image
- stable-diffusion
- lora
- diffusers
- template:sd-lora
widget:
- text: photo of a dog
  parameters:
    negative_prompt: Low quality
  output:
    url: images/photo_of_a_dog_1.jpeg
base_model: runwayml/stable-diffusion-v1-5
instance_prompt: photo of a dog
license: creativeml-openrail-m
---

# LoRA-DOG

<Gallery />

## Trigger words

You should use `photo of a dog` to trigger the image generation.

## Download model

Weights for this model are available in Safetensors format.

[Download](/gitgato/dog-lora/tree/main) them in the Files & versions tab.
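A hedged sketch pairing the trigger phrase with the negative prompt shown in the card's widget; it assumes `diffusers` can auto-detect the single LoRA `.safetensors` file in the repo.

```python
# Sketch only: relies on load_lora_weights locating the weight file automatically.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("gitgato/dog-lora")

image = pipe("photo of a dog", negative_prompt="Low quality").images[0]
image.save("dog.png")
```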
apwic/nerui-lora-r8-0
apwic
2024-06-04T01:26:47Z
0
0
null
[ "tensorboard", "generated_from_trainer", "id", "base_model:indolem/indobert-base-uncased", "base_model:finetune:indolem/indobert-base-uncased", "license:mit", "region:us" ]
null
2024-05-28T12:12:41Z
---
language:
- id
license: mit
base_model: indolem/indobert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: nerui-lora-r8-0
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# nerui-lora-r8-0

This model is a fine-tuned version of [indolem/indobert-base-uncased](https://huggingface.co/indolem/indobert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0463
- Location Precision: 0.8462
- Location Recall: 0.9362
- Location F1: 0.8889
- Location Number: 94
- Organization Precision: 0.8667
- Organization Recall: 0.8563
- Organization F1: 0.8614
- Organization Number: 167
- Person Precision: 1.0
- Person Recall: 0.9854
- Person F1: 0.9926
- Person Number: 137
- Overall Precision: 0.9059
- Overall Recall: 0.9196
- Overall F1: 0.9127
- Overall Accuracy: 0.9848

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100.0

### Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------------------:|:---------------:|:-----------:|:---------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:----------------:|:-------------:|:---------:|:-------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 1.1434 | 1.0 | 96 | 0.7069 | 0.0 | 0.0 | 0.0 | 94 | 0.0 | 0.0 | 0.0 | 167 | 0.0 | 0.0 | 0.0 | 137 | 0.0 | 0.0 | 0.0 | 0.8343 |
| 0.6699 | 2.0 | 192 | 0.5760 | 0.0 | 0.0 | 0.0 | 94 | 1.0 | 0.0060 | 0.0119 | 167 | 0.0 | 0.0 | 0.0 | 137 | 0.25 | 0.0025 | 0.0050 | 0.8348 |
| 0.5654 | 3.0 | 288 | 0.4641 | 0.0 | 0.0 | 0.0 | 94 | 0.4118 | 0.0419 | 0.0761 | 167 | 0.2414 | 0.0511 | 0.0843 | 137 | 0.3043 | 0.0352 | 0.0631 | 0.8420 |
| 0.4481 | 4.0 | 384 | 0.3466 | 0.2353 | 0.0426 | 0.0721 | 94 | 0.3578 | 0.2335 | 0.2826 | 167 | 0.3774 | 0.4380 | 0.4054 | 137 | 0.3614 | 0.2588 | 0.3016 | 0.8793 |
| 0.3376 | 5.0 | 480 | 0.2613 | 0.4058 | 0.2979 | 0.3436 | 94 | 0.5105 | 0.5808 | 0.5434 | 167 | 0.5081 | 0.6861 | 0.5839 | 137 | 0.4932 | 0.5503 | 0.5202 | 0.9202 |
| 0.2611 | 6.0 | 576 | 0.2025 | 0.5909 | 0.5532 | 0.5714 | 94 | 0.5588 | 0.6826 | 0.6146 | 167 | 0.6905 | 0.8467 | 0.7607 | 137 | 0.6130 | 0.7085 | 0.6573 | 0.9406 |
| 0.2071 | 7.0 | 672 | 0.1615 | 0.7021 | 0.7021 | 0.7021 | 94 | 0.6649 | 0.7605 | 0.7095 | 167 | 0.8224 | 0.9124 | 0.8651 | 137 | 0.7277 | 0.7990 | 0.7617 | 0.9555 |
| 0.1767 | 8.0 | 768 | 0.1337 | 0.7872 | 0.7872 | 0.7872 | 94 | 0.7120 | 0.7844 | 0.7464 | 167 | 0.9306 | 0.9781 | 0.9537 | 137 | 0.8033 | 0.8518 | 0.8268 | 0.9644 |
| 0.1601 | 9.0 | 864 | 0.1165 | 0.7980 | 0.8404 | 0.8187 | 94 | 0.7351 | 0.8144 | 0.7727 | 167 | 0.9306 | 0.9781 | 0.9537 | 137 | 0.8154 | 0.8769 | 0.8450 | 0.9671 |
| 0.1406 | 10.0 | 960 | 0.1041 | 0.7573 | 0.8298 | 0.7919 | 94 | 0.7816 | 0.8144 | 0.7977 | 167 | 0.9371 | 0.9781 | 0.9571 | 137 | 0.8286 | 0.8744 | 0.8509 | 0.9693 |
| 0.1283 | 11.0 | 1056 | 0.0951 | 0.8021 | 0.8191 | 0.8105 | 94 | 0.7865 | 0.8383 | 0.8116 | 167 | 0.9371 | 0.9781 | 0.9571 | 137 | 0.8417 | 0.8819 | 0.8613 | 0.9704 |
| 0.1229 | 12.0 | 1152 | 0.0895 | 0.8019 | 0.9043 | 0.8500 | 94 | 0.8 | 0.8383 | 0.8187 | 167 | 0.9375 | 0.9854 | 0.9609 | 137 | 0.8471 | 0.9045 | 0.8748 | 0.9715 |
| 0.1116 | 13.0 | 1248 | 0.0831 | 0.83 | 0.8830 | 0.8557 | 94 | 0.8314 | 0.8563 | 0.8437 | 167 | 0.9371 | 0.9781 | 0.9571 | 137 | 0.8675 | 0.9045 | 0.8856 | 0.9743 |
| 0.1077 | 14.0 | 1344 | 0.0769 | 0.8571 | 0.8936 | 0.875 | 94 | 0.8409 | 0.8862 | 0.8630 | 167 | 0.9504 | 0.9781 | 0.9640 | 137 | 0.8819 | 0.9196 | 0.9004 | 0.9760 |
| 0.1045 | 15.0 | 1440 | 0.0758 | 0.8333 | 0.9043 | 0.8673 | 94 | 0.8430 | 0.8683 | 0.8555 | 167 | 0.9371 | 0.9781 | 0.9571 | 137 | 0.8729 | 0.9146 | 0.8933 | 0.9760 |
| 0.1 | 16.0 | 1536 | 0.0753 | 0.8365 | 0.9255 | 0.8788 | 94 | 0.8111 | 0.8743 | 0.8415 | 167 | 0.9437 | 0.9781 | 0.9606 | 137 | 0.8615 | 0.9221 | 0.8908 | 0.9746 |
| 0.0961 | 17.0 | 1632 | 0.0690 | 0.8586 | 0.9043 | 0.8808 | 94 | 0.8563 | 0.8922 | 0.8739 | 167 | 0.9571 | 0.9781 | 0.9675 | 137 | 0.8910 | 0.9246 | 0.9075 | 0.9785 |
| 0.0981 | 18.0 | 1728 | 0.0676 | 0.86 | 0.9149 | 0.8866 | 94 | 0.8523 | 0.8982 | 0.8746 | 167 | 0.9504 | 0.9781 | 0.9640 | 137 | 0.8873 | 0.9296 | 0.9080 | 0.9782 |
| 0.0916 | 19.0 | 1824 | 0.0653 | 0.8333 | 0.9043 | 0.8673 | 94 | 0.8647 | 0.8802 | 0.8724 | 167 | 0.9640 | 0.9781 | 0.9710 | 137 | 0.8905 | 0.9196 | 0.9048 | 0.9790 |
| 0.0899 | 20.0 | 1920 | 0.0637 | 0.8586 | 0.9043 | 0.8808 | 94 | 0.8563 | 0.8922 | 0.8739 | 167 | 0.9640 | 0.9781 | 0.9710 | 137 | 0.8932 | 0.9246 | 0.9086 | 0.9790 |
| 0.0856 | 21.0 | 2016 | 0.0656 | 0.8113 | 0.9149 | 0.8600 | 94 | 0.8580 | 0.8683 | 0.8631 | 167 | 0.9571 | 0.9781 | 0.9675 | 137 | 0.8795 | 0.9171 | 0.8979 | 0.9773 |
| 0.0844 | 22.0 | 2112 | 0.0621 | 0.8416 | 0.9043 | 0.8718 | 94 | 0.8563 | 0.8922 | 0.8739 | 167 | 0.9571 | 0.9781 | 0.9675 | 137 | 0.8867 | 0.9246 | 0.9053 | 0.9782 |
| 0.0816 | 23.0 | 2208 | 0.0608 | 0.85 | 0.9043 | 0.8763 | 94 | 0.8647 | 0.8802 | 0.8724 | 167 | 0.9571 | 0.9781 | 0.9675 | 137 | 0.8927 | 0.9196 | 0.9059 | 0.9798 |
| 0.0803 | 24.0 | 2304 | 0.0591 | 0.8586 | 0.9043 | 0.8808 | 94 | 0.8671 | 0.8982 | 0.8824 | 167 | 0.9571 | 0.9781 | 0.9675 | 137 | 0.8956 | 0.9271 | 0.9111 | 0.9796 |
| 0.0793 | 25.0 | 2400 | 0.0577 | 0.85 | 0.9043 | 0.8763 | 94 | 0.8824 | 0.8982 | 0.8902 | 167 | 0.9710 | 0.9781 | 0.9745 | 137 | 0.9044 | 0.9271 | 0.9156 | 0.9818 |
| 0.0744 | 26.0 | 2496 | 0.0576 | 0.8529 | 0.9255 | 0.8878 | 94 | 0.8706 | 0.8862 | 0.8783 | 167 | 0.9710 | 0.9781 | 0.9745 | 137 | 0.9 | 0.9271 | 0.9134 | 0.9818 |
| 0.0761 | 27.0 | 2592 | 0.0571 | 0.8416 | 0.9043 | 0.8718 | 94 | 0.8757 | 0.8862 | 0.8810 | 167 | 0.9640 | 0.9781 | 0.9710 | 137 | 0.8973 | 0.9221 | 0.9095 | 0.9807 |
| 0.0724 | 28.0 | 2688 | 0.0559 | 0.8586 | 0.9043 | 0.8808 | 94 | 0.8655 | 0.8862 | 0.8757 | 167 | 0.9710 | 0.9781 | 0.9745 | 137 | 0.8995 | 0.9221 | 0.9107 | 0.9809 |
| 0.071 | 29.0 | 2784 | 0.0542 | 0.8687 | 0.9149 | 0.8912 | 94 | 0.8655 | 0.8862 | 0.8757 | 167 | 0.9783 | 0.9854 | 0.9818 | 137 | 0.9044 | 0.9271 | 0.9156 | 0.9818 |
| 0.0705 | 30.0 | 2880 | 0.0549 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.8690 | 0.8743 | 0.8716 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.9022 | 0.9271 | 0.9145 | 0.9818 |
| 0.0702 | 31.0 | 2976 | 0.0517 | 0.8687 | 0.9149 | 0.8912 | 94 | 0.8817 | 0.8922 | 0.8869 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9181 | 0.9296 | 0.9238 | 0.9834 |
| 0.065 | 32.0 | 3072 | 0.0532 | 0.8396 | 0.9468 | 0.89 | 94 | 0.8951 | 0.8683 | 0.8815 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9134 | 0.9271 | 0.9202 | 0.9826 |
| 0.0639 | 33.0 | 3168 | 0.0533 | 0.8286 | 0.9255 | 0.8744 | 94 | 0.8780 | 0.8623 | 0.8701 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9037 | 0.9196 | 0.9116 | 0.9815 |
| 0.0642 | 34.0 | 3264 | 0.0520 | 0.8529 | 0.9255 | 0.8878 | 94 | 0.875 | 0.8802 | 0.8776 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9089 | 0.9271 | 0.9179 | 0.9820 |
| 0.0652 | 35.0 | 3360 | 0.0518 | 0.8515 | 0.9149 | 0.8821 | 94 | 0.8690 | 0.8743 | 0.8716 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9062 | 0.9221 | 0.9141 | 0.9815 |
| 0.0627 | 36.0 | 3456 | 0.0533 | 0.87 | 0.9255 | 0.8969 | 94 | 0.8655 | 0.8862 | 0.8757 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.9069 | 0.9296 | 0.9181 | 0.9818 |
| 0.0606 | 37.0 | 3552 | 0.0503 | 0.8878 | 0.9255 | 0.9062 | 94 | 0.8698 | 0.8802 | 0.8750 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9156 | 0.9271 | 0.9213 | 0.9826 |
| 0.0611 | 38.0 | 3648 | 0.0497 | 0.87 | 0.9255 | 0.8969 | 94 | 0.8848 | 0.8743 | 0.8795 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.9154 | 0.9246 | 0.92 | 0.9829 |
| 0.0645 | 39.0 | 3744 | 0.0511 | 0.8431 | 0.9149 | 0.8776 | 94 | 0.8780 | 0.8623 | 0.8701 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9080 | 0.9171 | 0.9125 | 0.9823 |
| 0.061 | 40.0 | 3840 | 0.0487 | 0.8687 | 0.9149 | 0.8912 | 94 | 0.8765 | 0.8922 | 0.8843 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9158 | 0.9296 | 0.9227 | 0.9840 |
| 0.0591 | 41.0 | 3936 | 0.0491 | 0.8515 | 0.9149 | 0.8821 | 94 | 0.8802 | 0.8802 | 0.8802 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9132 | 0.9246 | 0.9189 | 0.9834 |
| 0.058 | 42.0 | 4032 | 0.0480 | 0.8687 | 0.9149 | 0.8912 | 94 | 0.8757 | 0.8862 | 0.8810 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9156 | 0.9271 | 0.9213 | 0.9840 |
| 0.0587 | 43.0 | 4128 | 0.0494 | 0.8350 | 0.9149 | 0.8731 | 94 | 0.8720 | 0.8563 | 0.8640 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9055 | 0.9146 | 0.91 | 0.9820 |
| 0.0562 | 44.0 | 4224 | 0.0482 | 0.8515 | 0.9149 | 0.8821 | 94 | 0.8788 | 0.8683 | 0.8735 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9127 | 0.9196 | 0.9161 | 0.9829 |
| 0.0565 | 45.0 | 4320 | 0.0471 | 0.8529 | 0.9255 | 0.8878 | 94 | 0.8795 | 0.8743 | 0.8769 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9132 | 0.9246 | 0.9189 | 0.9837 |
| 0.0541 | 46.0 | 4416 | 0.0482 | 0.8365 | 0.9255 | 0.8788 | 94 | 0.8795 | 0.8743 | 0.8769 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9086 | 0.9246 | 0.9166 | 0.9831 |
| 0.0547 | 47.0 | 4512 | 0.0487 | 0.8350 | 0.9149 | 0.8731 | 94 | 0.8720 | 0.8563 | 0.8640 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9055 | 0.9146 | 0.91 | 0.9823 |
| 0.0537 | 48.0 | 4608 | 0.0480 | 0.8269 | 0.9149 | 0.8687 | 94 | 0.8659 | 0.8503 | 0.8580 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9007 | 0.9121 | 0.9064 | 0.9829 |
| 0.0525 | 49.0 | 4704 | 0.0477 | 0.8416 | 0.9043 | 0.8718 | 94 | 0.8882 | 0.8563 | 0.8720 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9144 | 0.9121 | 0.9132 | 0.9826 |
| 0.0513 | 50.0 | 4800 | 0.0472 | 0.86 | 0.9149 | 0.8866 | 94 | 0.8596 | 0.8802 | 0.8698 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9064 | 0.9246 | 0.9154 | 0.9845 |
| 0.0507 | 51.0 | 4896 | 0.0481 | 0.8286 | 0.9255 | 0.8744 | 94 | 0.875 | 0.8383 | 0.8563 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.905 | 0.9095 | 0.9073 | 0.9820 |
| 0.0499 | 52.0 | 4992 | 0.0472 | 0.87 | 0.9255 | 0.8969 | 94 | 0.8757 | 0.8862 | 0.8810 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9158 | 0.9296 | 0.9227 | 0.9837 |
| 0.0519 | 53.0 | 5088 | 0.0471 | 0.8614 | 0.9255 | 0.8923 | 94 | 0.8743 | 0.8743 | 0.8743 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9132 | 0.9246 | 0.9189 | 0.9840 |
| 0.0523 | 54.0 | 5184 | 0.0483 | 0.8286 | 0.9255 | 0.8744 | 94 | 0.8545 | 0.8443 | 0.8494 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.8963 | 0.9121 | 0.9041 | 0.9826 |
| 0.0507 | 55.0 | 5280 | 0.0465 | 0.8447 | 0.9255 | 0.8832 | 94 | 0.8614 | 0.8563 | 0.8589 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9035 | 0.9171 | 0.9102 | 0.9831 |
| 0.0506 | 56.0 | 5376 | 0.0465 | 0.8447 | 0.9255 | 0.8832 | 94 | 0.8614 | 0.8563 | 0.8589 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9035 | 0.9171 | 0.9102 | 0.9831 |
| 0.0504 | 57.0 | 5472 | 0.0475 | 0.8208 | 0.9255 | 0.8700 | 94 | 0.8452 | 0.8503 | 0.8478 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.8900 | 0.9146 | 0.9021 | 0.9831 |
| 0.0484 | 58.0 | 5568 | 0.0462 | 0.8302 | 0.9362 | 0.88 | 94 | 0.8659 | 0.8503 | 0.8580 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9012 | 0.9171 | 0.9091 | 0.9837 |
| 0.0487 | 59.0 | 5664 | 0.0457 | 0.8447 | 0.9255 | 0.8832 | 94 | 0.8727 | 0.8623 | 0.8675 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9082 | 0.9196 | 0.9139 | 0.9837 |
| 0.0463 | 60.0 | 5760 | 0.0475 | 0.8365 | 0.9255 | 0.8788 | 94 | 0.8623 | 0.8623 | 0.8623 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9015 | 0.9196 | 0.9104 | 0.9848 |
| 0.0462 | 61.0 | 5856 | 0.0469 | 0.8529 | 0.9255 | 0.8878 | 94 | 0.8655 | 0.8862 | 0.8757 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9069 | 0.9296 | 0.9181 | 0.9848 |
| 0.0497 | 62.0 | 5952 | 0.0469 | 0.8544 | 0.9362 | 0.8934 | 94 | 0.8521 | 0.8623 | 0.8571 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9017 | 0.9221 | 0.9118 | 0.9845 |
| 0.0465 | 63.0 | 6048 | 0.0469 | 0.8515 | 0.9149 | 0.8821 | 94 | 0.8683 | 0.8683 | 0.8683 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9082 | 0.9196 | 0.9139 | 0.9848 |
| 0.0468 | 64.0 | 6144 | 0.0470 | 0.86 | 0.9149 | 0.8866 | 94 | 0.8841 | 0.8683 | 0.8761 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9173 | 0.9196 | 0.9184 | 0.9843 |
| 0.0455 | 65.0 | 6240 | 0.0467 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.8675 | 0.8623 | 0.8649 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9062 | 0.9221 | 0.9141 | 0.9845 |
| 0.0456 | 66.0 | 6336 | 0.0463 | 0.8431 | 0.9149 | 0.8776 | 94 | 0.8712 | 0.8503 | 0.8606 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9075 | 0.9121 | 0.9098 | 0.9834 |
| 0.0436 | 67.0 | 6432 | 0.0457 | 0.8365 | 0.9255 | 0.8788 | 94 | 0.8773 | 0.8563 | 0.8667 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9080 | 0.9171 | 0.9125 | 0.9837 |
| 0.0442 | 68.0 | 6528 | 0.0464 | 0.8365 | 0.9255 | 0.8788 | 94 | 0.8720 | 0.8563 | 0.8640 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9057 | 0.9171 | 0.9114 | 0.9837 |
| 0.0463 | 69.0 | 6624 | 0.0463 | 0.8447 | 0.9255 | 0.8832 | 94 | 0.8720 | 0.8563 | 0.8640 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9080 | 0.9171 | 0.9125 | 0.9840 |
| 0.0445 | 70.0 | 6720 | 0.0457 | 0.8529 | 0.9255 | 0.8878 | 94 | 0.8720 | 0.8563 | 0.8640 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9102 | 0.9171 | 0.9136 | 0.9840 |
| 0.0456 | 71.0 | 6816 | 0.0474 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.8788 | 0.8683 | 0.8735 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9109 | 0.9246 | 0.9177 | 0.9851 |
| 0.0473 | 72.0 | 6912 | 0.0479 | 0.8381 | 0.9362 | 0.8844 | 94 | 0.8659 | 0.8503 | 0.8580 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9035 | 0.9171 | 0.9102 | 0.9837 |
| 0.0434 | 73.0 | 7008 | 0.0475 | 0.8381 | 0.9362 | 0.8844 | 94 | 0.8712 | 0.8503 | 0.8606 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9057 | 0.9171 | 0.9114 | 0.9840 |
| 0.042 | 74.0 | 7104 | 0.0463 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.8765 | 0.8503 | 0.8632 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9102 | 0.9171 | 0.9136 | 0.9837 |
| 0.0438 | 75.0 | 7200 | 0.0463 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.8765 | 0.8503 | 0.8632 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9102 | 0.9171 | 0.9136 | 0.9837 |
| 0.0437 | 76.0 | 7296 | 0.0459 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.8623 | 0.8623 | 0.8623 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9039 | 0.9221 | 0.9129 | 0.9843 |
| 0.0455 | 77.0 | 7392 | 0.0469 | 0.8381 | 0.9362 | 0.8844 | 94 | 0.8827 | 0.8563 | 0.8693 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9104 | 0.9196 | 0.9150 | 0.9840 |
| 0.0426 | 78.0 | 7488 | 0.0467 | 0.8381 | 0.9362 | 0.8844 | 94 | 0.8727 | 0.8623 | 0.8675 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9062 | 0.9221 | 0.9141 | 0.9848 |
| 0.043 | 79.0 | 7584 | 0.0457 | 0.8381 | 0.9362 | 0.8844 | 94 | 0.8735 | 0.8683 | 0.8709 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9064 | 0.9246 | 0.9154 | 0.9854 |
| 0.0435 | 80.0 | 7680 | 0.0462 | 0.8381 | 0.9362 | 0.8844 | 94 | 0.8727 | 0.8623 | 0.8675 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9062 | 0.9221 | 0.9141 | 0.9851 |
| 0.0411 | 81.0 | 7776 | 0.0461 | 0.8381 | 0.9362 | 0.8844 | 94 | 0.8606 | 0.8503 | 0.8554 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9012 | 0.9171 | 0.9091 | 0.9843 |
| 0.0421 | 82.0 | 7872 | 0.0458 | 0.8544 | 0.9362 | 0.8934 | 94 | 0.8720 | 0.8563 | 0.8640 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9104 | 0.9196 | 0.9150 | 0.9843 |
| 0.0416 | 83.0 | 7968 | 0.0462 | 0.8381 | 0.9362 | 0.8844 | 94 | 0.8773 | 0.8563 | 0.8667 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9082 | 0.9196 | 0.9139 | 0.9843 |
| 0.0412 | 84.0 | 8064 | 0.0461 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.8788 | 0.8683 | 0.8735 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9109 | 0.9246 | 0.9177 | 0.9851 |
| 0.0428 | 85.0 | 8160 | 0.0465 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.8773 | 0.8563 | 0.8667 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9104 | 0.9196 | 0.9150 | 0.9845 |
| 0.0434 | 86.0 | 8256 | 0.0467 | 0.8381 | 0.9362 | 0.8844 | 94 | 0.8720 | 0.8563 | 0.8640 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9059 | 0.9196 | 0.9127 | 0.9840 |
| 0.0411 | 87.0 | 8352 | 0.0466 | 0.8381 | 0.9362 | 0.8844 | 94 | 0.8720 | 0.8563 | 0.8640 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9059 | 0.9196 | 0.9127 | 0.9840 |
| 0.0436 | 88.0 | 8448 | 0.0467 | 0.8381 | 0.9362 | 0.8844 | 94 | 0.8780 | 0.8623 | 0.8701 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9084 | 0.9221 | 0.9152 | 0.9848 |
| 0.0413 | 89.0 | 8544 | 0.0460 | 0.8544 | 0.9362 | 0.8934 | 94 | 0.8795 | 0.8743 | 0.8769 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9134 | 0.9271 | 0.9202 | 0.9854 |
| 0.0401 | 90.0 | 8640 | 0.0467 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.8675 | 0.8623 | 0.8649 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9062 | 0.9221 | 0.9141 | 0.9848 |
| 0.0421 | 91.0 | 8736 | 0.0467 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.8780 | 0.8623 | 0.8701 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9107 | 0.9221 | 0.9164 | 0.9845 |
| 0.0407 | 92.0 | 8832 | 0.0462 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.8773 | 0.8563 | 0.8667 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9104 | 0.9196 | 0.9150 | 0.9845 |
| 0.0449 | 93.0 | 8928 | 0.0463 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.8773 | 0.8563 | 0.8667 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9104 | 0.9196 | 0.9150 | 0.9845 |
| 0.0397 | 94.0 | 9024 | 0.0462 | 0.8381 | 0.9362 | 0.8844 | 94 | 0.8667 | 0.8563 | 0.8614 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9037 | 0.9196 | 0.9116 | 0.9845 |
| 0.0417 | 95.0 | 9120 | 0.0463 | 0.8381 | 0.9362 | 0.8844 | 94 | 0.8667 | 0.8563 | 0.8614 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9037 | 0.9196 | 0.9116 | 0.9845 |
| 0.0402 | 96.0 | 9216 | 0.0465 | 0.8381 | 0.9362 | 0.8844 | 94 | 0.8780 | 0.8623 | 0.8701 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9084 | 0.9221 | 0.9152 | 0.9848 |
| 0.0422 | 97.0 | 9312 | 0.0464 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.8720 | 0.8563 | 0.8640 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9082 | 0.9196 | 0.9139 | 0.9851 |
| 0.0417 | 98.0 | 9408 | 0.0463 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.8720 | 0.8563 | 0.8640 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9082 | 0.9196 | 0.9139 | 0.9851 |
| 0.0409 | 99.0 | 9504 | 0.0463 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.8667 | 0.8563 | 0.8614 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9059 | 0.9196 | 0.9127 | 0.9848 |
| 0.0404 | 100.0 | 9600 | 0.0463 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.8667 | 0.8563 | 0.8614 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9059 | 0.9196 | 0.9127 | 0.9848 |


### Framework versions

- Transformers 4.39.3
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.15.2
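The card names no inference entry point. If the repo publishes full fine-tuned weights, a standard token-classification pipeline call would look like the sketch below; if the "lora" in the name means only an adapter is stored, the adapter must first be attached to `indolem/indobert-base-uncased`. The example sentence is illustrative.

```python
# Hedged sketch: assumes merged weights in the repo; entity labels follow the card's Location/Organization/Person split.
from transformers import pipeline

ner = pipeline("token-classification", model="apwic/nerui-lora-r8-0", aggregation_strategy="simple")
print(ner("Joko Widodo mengunjungi kantor Google di Jakarta."))
```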
lcw99/llama-3-10b-ko-240604-e2f
lcw99
2024-06-04T01:17:10Z
2,249
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "ko", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-06-04T00:37:02Z
---
language:
- ko
license: apache-2.0
library_name: transformers
---

# Model Card for Model ID

## Model Details

### Model Description

Instruction tuning of meta-llama/Meta-Llama-3-8B-Instruct with an added Korean layer.

#### Chat template

tokenizer.apply_chat_template(chat, tokenize=False)
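A runnable sketch expanding the card's one-line chat-template hint; the chat content and the `add_generation_prompt` flag are illustrative assumptions.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("lcw99/llama-3-10b-ko-240604-e2f")
chat = [{"role": "user", "content": "Hello, please introduce yourself."}]

# The call the card documents; tokenize=False returns the rendered prompt string.
prompt = tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True)
print(prompt)
```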
Sharan1712/llama2_7B_alpaca_loftq_4bit_3f
Sharan1712
2024-06-04T01:14:46Z
77
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "4-bit", "bitsandbytes", "region:us" ]
text-generation
2024-06-04T01:12:05Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
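This card is an unfilled template, so the loading sketch below is inferred entirely from the repo name and tags (LoftQ-initialized, 4-bit bitsandbytes); treat it as an assumption, not documentation.

```python
# Hedged sketch: assumes the 4-bit quantization config is stored in the repo's config.json.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Sharan1712/llama2_7B_alpaca_loftq_4bit_3f"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
```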
Sharan1712/llama2_7B_alpaca_loftq_4bit_3e
Sharan1712
2024-06-04T01:14:36Z
78
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "4-bit", "bitsandbytes", "region:us" ]
text-generation
2024-06-04T01:12:05Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
baf2b252097d46299a/medical_summarizer_6ec63f0624e84fea9af33517007b93a4
baf2b252097d46299a
2024-06-04T01:13:31Z
0
0
transformers
[ "transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-06-04T01:13:06Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
melancholic/watercolor_tattoo_lora
melancholic
2024-06-04T01:09:38Z
3
0
diffusers
[ "diffusers", "tensorboard", "text-to-image", "diffusers-training", "lora", "template:sd-lora", "stable-diffusion-xl", "stable-diffusion-xl-diffusers", "base_model:stabilityai/stable-diffusion-xl-base-1.0", "base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0", "license:openrail++", "region:us" ]
text-to-image
2024-06-03T08:41:51Z
---
license: openrail++
library_name: diffusers
tags:
- text-to-image
- diffusers-training
- diffusers
- lora
- template:sd-lora
- stable-diffusion-xl
- stable-diffusion-xl-diffusers
base_model: stabilityai/stable-diffusion-xl-base-1.0
instance_prompt: a watercolor tattoo style
widget: []
---

<!-- This model card has been generated automatically according to the information the training script had access to. You should probably proofread and complete it, then remove this comment. -->

# SDXL LoRA DreamBooth - melancholic/watercolor_tattoo_lora

<Gallery />

## Model description

These are melancholic/watercolor_tattoo_lora LoRA adaptation weights for stabilityai/stable-diffusion-xl-base-1.0.

The weights were trained using [DreamBooth](https://dreambooth.github.io/).

LoRA for the text encoder was enabled: False.

Special VAE used for training: madebyollin/sdxl-vae-fp16-fix.

## Trigger words

You should use `a watercolor tattoo style` to trigger the image generation.

## Download model

Weights for this model are available in Safetensors format.

[Download](https://huggingface.co/melancholic/watercolor_tattoo_lora/tree/main) them in the Files & versions tab.

## Intended uses & limitations

#### How to use

```python
# TODO: add an example code snippet for running this diffusion pipeline
```

#### Limitations and bias

[TODO: provide examples of latent issues and potential remediations]

## Training details

[TODO: describe the data used to train the model]
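The "How to use" section above is still a TODO; until it is filled in, here is a minimal inference sketch. It assumes the standard diffusers SDXL-plus-LoRA loading flow; the prompt, step count, and output filename are illustrative, not from the original card.

```python
import torch
from diffusers import DiffusionPipeline

# Load the SDXL base model this LoRA was trained against
pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")
# Attach the LoRA adapter weights from this repository
pipe.load_lora_weights("melancholic/watercolor_tattoo_lora")

# The instance prompt phrase acts as the trigger
image = pipe("a hummingbird, a watercolor tattoo style", num_inference_steps=30).images[0]
image.save("watercolor_tattoo.png")
```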
melancholic/neotraditional_tattoo_lora
melancholic
2024-06-04T01:09:32Z
3
0
diffusers
[ "diffusers", "tensorboard", "text-to-image", "diffusers-training", "lora", "template:sd-lora", "stable-diffusion-xl", "stable-diffusion-xl-diffusers", "base_model:stabilityai/stable-diffusion-xl-base-1.0", "base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0", "license:openrail++", "region:us" ]
text-to-image
2024-06-03T06:17:47Z
---
license: openrail++
library_name: diffusers
tags:
- text-to-image
- diffusers-training
- diffusers
- lora
- template:sd-lora
- stable-diffusion-xl
- stable-diffusion-xl-diffusers
base_model: stabilityai/stable-diffusion-xl-base-1.0
instance_prompt: a neotraditional tattoo style
widget: []
---

<!-- This model card has been generated automatically according to the information the training script had access to. You should probably proofread and complete it, then remove this comment. -->

# SDXL LoRA DreamBooth - melancholic/neotraditional_tattoo_lora

<Gallery />

## Model description

These are melancholic/neotraditional_tattoo_lora LoRA adaptation weights for stabilityai/stable-diffusion-xl-base-1.0.

The weights were trained using [DreamBooth](https://dreambooth.github.io/).

LoRA for the text encoder was enabled: False.

Special VAE used for training: madebyollin/sdxl-vae-fp16-fix.

## Trigger words

You should use `a neotraditional tattoo style` to trigger the image generation.

## Download model

Weights for this model are available in Safetensors format.

[Download](https://huggingface.co/melancholic/neotraditional_tattoo_lora/tree/main) them in the Files & versions tab.

## Intended uses & limitations

#### How to use

```python
# TODO: add an example code snippet for running this diffusion pipeline
```

#### Limitations and bias

[TODO: provide examples of latent issues and potential remediations]

## Training details

[TODO: describe the data used to train the model]
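As with the sibling LoRA above, the "How to use" snippet is still a TODO. Since this card notes training used the madebyollin/sdxl-vae-fp16-fix VAE, a hedged sketch that loads the same VAE for fp16 inference might look like this (prompt and filenames are illustrative):

```python
import torch
from diffusers import AutoencoderKL, StableDiffusionXLPipeline

# fp16-fix VAE, matching the VAE the card says was used during training
vae = AutoencoderKL.from_pretrained("madebyollin/sdxl-vae-fp16-fix", torch_dtype=torch.float16)
pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", vae=vae, torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("melancholic/neotraditional_tattoo_lora")

image = pipe("a rose, a neotraditional tattoo style", num_inference_steps=30).images[0]
image.save("neotraditional_tattoo.png")
```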
apwic/nerui-base-3
apwic
2024-06-04T01:02:10Z
24
0
transformers
[ "transformers", "tensorboard", "safetensors", "bert", "token-classification", "generated_from_trainer", "id", "base_model:indolem/indobert-base-uncased", "base_model:finetune:indolem/indobert-base-uncased", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
token-classification
2024-05-28T05:46:39Z
--- language: - id license: mit base_model: indolem/indobert-base-uncased tags: - generated_from_trainer model-index: - name: nerui-base-3 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # nerui-base-3 This model is a fine-tuned version of [indolem/indobert-base-uncased](https://huggingface.co/indolem/indobert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.1047 - Location Precision: 0.8925 - Location Recall: 0.9651 - Location F1: 0.9274 - Location Number: 86 - Organization Precision: 0.9538 - Organization Recall: 0.9270 - Organization F1: 0.9402 - Organization Number: 178 - Person Precision: 0.9685 - Person Recall: 0.9609 - Person F1: 0.9647 - Person Number: 128 - Overall Precision: 0.9440 - Overall Recall: 0.9464 - Overall F1: 0.9452 - Overall Accuracy: 0.9876 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 100.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy | |:-------------:|:-----:|:----:|:---------------:|:------------------:|:---------------:|:-----------:|:---------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:----------------:|:-------------:|:---------:|:-------------:|:-----------------:|:--------------:|:----------:|:----------------:| | 0.2442 | 1.0 | 96 | 0.0581 | 0.8384 | 0.9651 | 0.8973 | 86 | 0.8535 | 0.9494 | 0.8989 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.8850 | 0.9617 | 0.9218 | 0.9822 | | 0.0581 | 2.0 | 192 | 0.0548 | 0.8283 | 0.9535 | 0.8865 | 86 | 0.9464 | 0.8933 | 0.9191 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9242 | 0.9337 | 0.9289 | 0.9852 | | 0.0357 | 3.0 | 288 | 0.0514 | 0.8542 | 0.9535 | 0.9011 | 86 | 0.9310 | 0.9101 | 0.9205 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9293 | 0.9388 | 0.9340 | 0.9857 | | 0.0251 | 4.0 | 384 | 0.0607 | 0.8989 | 0.9302 | 0.9143 | 86 | 0.8942 | 0.9494 | 0.9210 | 178 | 0.9837 | 0.9453 | 0.9641 | 128 | 0.9227 | 0.9439 | 0.9332 | 0.9852 | | 0.0146 | 5.0 | 480 | 0.0617 | 0.8804 | 0.9419 | 0.9101 | 86 | 0.9231 | 0.9438 | 0.9333 | 178 | 0.976 | 0.9531 | 0.9644 | 128 | 0.9298 | 0.9464 | 0.9381 | 0.9865 | | 0.0117 | 6.0 | 576 | 0.0706 | 0.8511 | 0.9302 | 0.8889 | 86 | 0.9066 | 0.9270 | 0.9167 | 178 | 0.9758 | 0.9453 | 0.9603 | 128 | 0.915 | 0.9337 | 0.9242 | 0.9857 | | 0.0083 | 7.0 | 672 | 0.0926 | 0.7788 | 0.9419 | 0.8526 | 86 | 0.9162 | 0.9213 | 0.9188 | 178 | 0.9462 | 0.9609 | 0.9535 | 128 | 0.8910 | 0.9388 | 0.9143 | 0.9819 | | 0.008 | 8.0 | 768 | 0.0781 | 0.8617 | 0.9419 | 0.9000 | 86 | 0.9535 | 0.9213 | 0.9371 | 178 | 0.984 | 0.9609 | 0.9723 | 128 | 0.9412 | 0.9388 | 0.9400 | 0.9857 | | 0.0042 | 9.0 | 864 | 0.0659 | 0.8764 | 0.9070 | 0.8914 | 86 | 0.9663 | 0.9663 | 0.9663 | 178 
| 0.9764 | 0.9688 | 0.9725 | 128 | 0.9492 | 0.9541 | 0.9517 | 0.9889 | | 0.0044 | 10.0 | 960 | 0.0712 | 0.8681 | 0.9186 | 0.8927 | 86 | 0.9389 | 0.9494 | 0.9441 | 178 | 0.9457 | 0.9531 | 0.9494 | 128 | 0.925 | 0.9439 | 0.9343 | 0.9873 | | 0.005 | 11.0 | 1056 | 0.0855 | 0.8384 | 0.9651 | 0.8973 | 86 | 0.9438 | 0.9438 | 0.9438 | 178 | 0.9762 | 0.9609 | 0.9685 | 128 | 0.9280 | 0.9541 | 0.9409 | 0.9870 | | 0.0036 | 12.0 | 1152 | 0.0859 | 0.8710 | 0.9419 | 0.9050 | 86 | 0.9435 | 0.9382 | 0.9408 | 178 | 0.984 | 0.9609 | 0.9723 | 128 | 0.9392 | 0.9464 | 0.9428 | 0.9873 | | 0.0042 | 13.0 | 1248 | 0.0761 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9448 | 0.9607 | 0.9526 | 178 | 0.984 | 0.9609 | 0.9723 | 128 | 0.9446 | 0.9566 | 0.9506 | 0.9889 | | 0.0036 | 14.0 | 1344 | 0.0843 | 0.8876 | 0.9186 | 0.9029 | 86 | 0.9538 | 0.9270 | 0.9402 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9485 | 0.9388 | 0.9436 | 0.9862 | | 0.0028 | 15.0 | 1440 | 0.0906 | 0.8723 | 0.9535 | 0.9111 | 86 | 0.9429 | 0.9270 | 0.9348 | 178 | 0.984 | 0.9609 | 0.9723 | 128 | 0.9391 | 0.9439 | 0.9415 | 0.9868 | | 0.0017 | 16.0 | 1536 | 0.0914 | 0.8526 | 0.9419 | 0.8950 | 86 | 0.9645 | 0.9157 | 0.9395 | 178 | 0.9683 | 0.9531 | 0.9606 | 128 | 0.9385 | 0.9337 | 0.9361 | 0.9862 | | 0.002 | 17.0 | 1632 | 0.0828 | 0.8587 | 0.9186 | 0.8876 | 86 | 0.9545 | 0.9438 | 0.9492 | 178 | 0.9762 | 0.9609 | 0.9685 | 128 | 0.9391 | 0.9439 | 0.9415 | 0.9884 | | 0.0033 | 18.0 | 1728 | 0.0641 | 0.8646 | 0.9651 | 0.9121 | 86 | 0.9126 | 0.9382 | 0.9252 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9235 | 0.9541 | 0.9385 | 0.9887 | | 0.0024 | 19.0 | 1824 | 0.0982 | 0.8667 | 0.9070 | 0.8864 | 86 | 0.9297 | 0.9663 | 0.9477 | 178 | 0.9683 | 0.9531 | 0.9606 | 128 | 0.9277 | 0.9490 | 0.9382 | 0.9868 | | 0.0037 | 20.0 | 1920 | 0.0904 | 0.8283 | 0.9535 | 0.8865 | 86 | 0.9659 | 0.9551 | 0.9605 | 178 | 0.984 | 0.9609 | 0.9723 | 128 | 0.9375 | 0.9566 | 0.9470 | 0.9887 | | 0.0038 | 21.0 | 2016 | 0.0787 | 0.8925 | 0.9651 | 0.9274 | 86 | 0.9385 | 0.9438 | 0.9412 | 178 | 0.9609 | 0.9609 | 0.9609 | 128 | 0.935 | 0.9541 | 0.9444 | 0.9879 | | 0.0024 | 22.0 | 2112 | 0.0697 | 0.8526 | 0.9419 | 0.8950 | 86 | 0.9286 | 0.9494 | 0.9389 | 178 | 0.9677 | 0.9375 | 0.9524 | 128 | 0.9227 | 0.9439 | 0.9332 | 0.9889 | | 0.0041 | 23.0 | 2208 | 0.0794 | 0.9011 | 0.9535 | 0.9266 | 86 | 0.9441 | 0.9494 | 0.9468 | 178 | 0.9685 | 0.9609 | 0.9647 | 128 | 0.9421 | 0.9541 | 0.9480 | 0.9876 | | 0.0033 | 24.0 | 2304 | 0.0830 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9231 | 0.9438 | 0.9333 | 178 | 0.9758 | 0.9453 | 0.9603 | 128 | 0.9343 | 0.9439 | 0.9391 | 0.9881 | | 0.0034 | 25.0 | 2400 | 0.0804 | 0.8632 | 0.9535 | 0.9061 | 86 | 0.9448 | 0.9607 | 0.9526 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9378 | 0.9617 | 0.9496 | 0.9881 | | 0.0012 | 26.0 | 2496 | 0.0728 | 0.9011 | 0.9535 | 0.9266 | 86 | 0.9341 | 0.9551 | 0.9444 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9424 | 0.9592 | 0.9507 | 0.9903 | | 0.0015 | 27.0 | 2592 | 0.0957 | 0.9101 | 0.9419 | 0.9257 | 86 | 0.9301 | 0.9719 | 0.9505 | 178 | 0.9762 | 0.9609 | 0.9685 | 128 | 0.9401 | 0.9617 | 0.9508 | 0.9881 | | 0.0029 | 28.0 | 2688 | 0.0766 | 0.8830 | 0.9651 | 0.9222 | 86 | 0.9545 | 0.9438 | 0.9492 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9470 | 0.9566 | 0.9518 | 0.9881 | | 0.0031 | 29.0 | 2784 | 0.0802 | 0.8571 | 0.9767 | 0.9130 | 86 | 0.9649 | 0.9270 | 0.9456 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9419 | 0.9515 | 0.9467 | 0.9879 | | 0.0018 | 30.0 | 2880 | 0.0837 | 0.8710 | 0.9419 | 0.9050 | 86 | 0.9605 | 0.9551 | 0.9577 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 
0.9470 | 0.9566 | 0.9518 | 0.9892 | | 0.0017 | 31.0 | 2976 | 0.0792 | 0.9222 | 0.9651 | 0.9432 | 86 | 0.9505 | 0.9719 | 0.9611 | 178 | 0.9683 | 0.9531 | 0.9606 | 128 | 0.9497 | 0.9643 | 0.9570 | 0.9903 | | 0.0017 | 32.0 | 3072 | 0.0675 | 0.8737 | 0.9651 | 0.9171 | 86 | 0.9661 | 0.9607 | 0.9634 | 178 | 0.976 | 0.9531 | 0.9644 | 128 | 0.9471 | 0.9592 | 0.9531 | 0.9906 | | 0.0012 | 33.0 | 3168 | 0.0909 | 0.8925 | 0.9651 | 0.9274 | 86 | 0.9709 | 0.9382 | 0.9543 | 178 | 0.984 | 0.9609 | 0.9723 | 128 | 0.9564 | 0.9515 | 0.9540 | 0.9897 | | 0.002 | 34.0 | 3264 | 0.1077 | 0.9101 | 0.9419 | 0.9257 | 86 | 0.9422 | 0.9157 | 0.9288 | 178 | 0.968 | 0.9453 | 0.9565 | 128 | 0.9432 | 0.9311 | 0.9371 | 0.9846 | | 0.0023 | 35.0 | 3360 | 0.0912 | 0.8913 | 0.9535 | 0.9213 | 86 | 0.9396 | 0.9607 | 0.95 | 178 | 0.9762 | 0.9609 | 0.9685 | 128 | 0.94 | 0.9592 | 0.9495 | 0.9881 | | 0.0016 | 36.0 | 3456 | 0.0839 | 0.8925 | 0.9651 | 0.9274 | 86 | 0.9655 | 0.9438 | 0.9545 | 178 | 0.984 | 0.9609 | 0.9723 | 128 | 0.9541 | 0.9541 | 0.9541 | 0.9892 | | 0.0012 | 37.0 | 3552 | 0.1070 | 0.8817 | 0.9535 | 0.9162 | 86 | 0.9480 | 0.9213 | 0.9345 | 178 | 0.976 | 0.9531 | 0.9644 | 128 | 0.9412 | 0.9388 | 0.9400 | 0.9857 | | 0.0009 | 38.0 | 3648 | 0.0856 | 0.8947 | 0.9884 | 0.9392 | 86 | 0.9540 | 0.9326 | 0.9432 | 178 | 0.984 | 0.9609 | 0.9723 | 128 | 0.9492 | 0.9541 | 0.9517 | 0.9884 | | 0.0006 | 39.0 | 3744 | 0.0964 | 0.8936 | 0.9767 | 0.9333 | 86 | 0.9483 | 0.9270 | 0.9375 | 178 | 0.9685 | 0.9609 | 0.9647 | 128 | 0.9418 | 0.9490 | 0.9454 | 0.9862 | | 0.0011 | 40.0 | 3840 | 0.0992 | 0.9011 | 0.9535 | 0.9266 | 86 | 0.9492 | 0.9438 | 0.9465 | 178 | 0.9762 | 0.9609 | 0.9685 | 128 | 0.9467 | 0.9515 | 0.9491 | 0.9870 | | 0.0009 | 41.0 | 3936 | 0.1072 | 0.9032 | 0.9767 | 0.9385 | 86 | 0.9489 | 0.9382 | 0.9435 | 178 | 0.976 | 0.9531 | 0.9644 | 128 | 0.9467 | 0.9515 | 0.9491 | 0.9860 | | 0.0007 | 42.0 | 4032 | 0.1193 | 0.8936 | 0.9767 | 0.9333 | 86 | 0.9595 | 0.9326 | 0.9459 | 178 | 0.9839 | 0.9531 | 0.9683 | 128 | 0.9514 | 0.9490 | 0.9502 | 0.9865 | | 0.0014 | 43.0 | 4128 | 0.1129 | 0.9032 | 0.9767 | 0.9385 | 86 | 0.9489 | 0.9382 | 0.9435 | 178 | 0.9683 | 0.9531 | 0.9606 | 128 | 0.9443 | 0.9515 | 0.9479 | 0.9868 | | 0.0007 | 44.0 | 4224 | 0.1289 | 0.9130 | 0.9767 | 0.9438 | 86 | 0.9492 | 0.9438 | 0.9465 | 178 | 0.9609 | 0.9609 | 0.9609 | 128 | 0.9446 | 0.9566 | 0.9506 | 0.9849 | | 0.0006 | 45.0 | 4320 | 0.1167 | 0.8842 | 0.9767 | 0.9282 | 86 | 0.9392 | 0.9551 | 0.9471 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9356 | 0.9643 | 0.9497 | 0.9868 | | 0.0014 | 46.0 | 4416 | 0.1168 | 0.8646 | 0.9651 | 0.9121 | 86 | 0.9543 | 0.9382 | 0.9462 | 178 | 0.9839 | 0.9531 | 0.9683 | 128 | 0.9418 | 0.9490 | 0.9454 | 0.9873 | | 0.0022 | 47.0 | 4512 | 0.1090 | 0.8737 | 0.9651 | 0.9171 | 86 | 0.9702 | 0.9157 | 0.9422 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9512 | 0.9439 | 0.9475 | 0.9868 | | 0.0033 | 48.0 | 4608 | 0.0899 | 0.9222 | 0.9651 | 0.9432 | 86 | 0.9333 | 0.9438 | 0.9385 | 178 | 0.9758 | 0.9453 | 0.9603 | 128 | 0.9442 | 0.9490 | 0.9466 | 0.9889 | | 0.001 | 49.0 | 4704 | 0.1123 | 0.8830 | 0.9651 | 0.9222 | 86 | 0.9704 | 0.9213 | 0.9452 | 178 | 0.9839 | 0.9531 | 0.9683 | 128 | 0.9535 | 0.9413 | 0.9474 | 0.9870 | | 0.0007 | 50.0 | 4800 | 0.0937 | 0.9011 | 0.9535 | 0.9266 | 86 | 0.9486 | 0.9326 | 0.9405 | 178 | 0.984 | 0.9609 | 0.9723 | 128 | 0.9488 | 0.9464 | 0.9476 | 0.9887 | | 0.0011 | 51.0 | 4896 | 0.1082 | 0.9032 | 0.9767 | 0.9385 | 86 | 0.9278 | 0.9382 | 0.9330 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9398 | 0.9566 | 0.9482 | 0.9865 | | 
0.0015 | 52.0 | 4992 | 0.1112 | 0.9011 | 0.9535 | 0.9266 | 86 | 0.9645 | 0.9157 | 0.9395 | 178 | 0.9762 | 0.9609 | 0.9685 | 128 | 0.9534 | 0.9388 | 0.9460 | 0.9879 | | 0.0009 | 53.0 | 5088 | 0.1032 | 0.8925 | 0.9651 | 0.9274 | 86 | 0.9341 | 0.9551 | 0.9444 | 178 | 0.984 | 0.9609 | 0.9723 | 128 | 0.94 | 0.9592 | 0.9495 | 0.9881 | | 0.0033 | 54.0 | 5184 | 0.1181 | 0.8925 | 0.9651 | 0.9274 | 86 | 0.9593 | 0.9270 | 0.9429 | 178 | 0.984 | 0.9609 | 0.9723 | 128 | 0.9513 | 0.9464 | 0.9488 | 0.9870 | | 0.0008 | 55.0 | 5280 | 0.1207 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9651 | 0.9326 | 0.9486 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9515 | 0.9515 | 0.9515 | 0.9865 | | 0.0009 | 56.0 | 5376 | 0.1379 | 0.8632 | 0.9535 | 0.9061 | 86 | 0.9702 | 0.9157 | 0.9422 | 178 | 0.984 | 0.9609 | 0.9723 | 128 | 0.9485 | 0.9388 | 0.9436 | 0.9857 | | 0.001 | 57.0 | 5472 | 0.1120 | 0.8925 | 0.9651 | 0.9274 | 86 | 0.9708 | 0.9326 | 0.9513 | 178 | 0.984 | 0.9609 | 0.9723 | 128 | 0.9563 | 0.9490 | 0.9526 | 0.9881 | | 0.0013 | 58.0 | 5568 | 0.1086 | 0.8830 | 0.9651 | 0.9222 | 86 | 0.9483 | 0.9270 | 0.9375 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9442 | 0.9490 | 0.9466 | 0.9862 | | 0.0005 | 59.0 | 5664 | 0.1218 | 0.8660 | 0.9767 | 0.9180 | 86 | 0.9641 | 0.9045 | 0.9333 | 178 | 0.9538 | 0.9688 | 0.9612 | 128 | 0.9365 | 0.9413 | 0.9389 | 0.9854 | | 0.0007 | 60.0 | 5760 | 0.0958 | 0.8913 | 0.9535 | 0.9213 | 86 | 0.9239 | 0.9551 | 0.9392 | 178 | 0.9839 | 0.9531 | 0.9683 | 128 | 0.935 | 0.9541 | 0.9444 | 0.9881 | | 0.0002 | 61.0 | 5856 | 0.1076 | 0.8817 | 0.9535 | 0.9162 | 86 | 0.9593 | 0.9270 | 0.9429 | 178 | 0.976 | 0.9531 | 0.9644 | 128 | 0.9462 | 0.9413 | 0.9437 | 0.9879 | | 0.0023 | 62.0 | 5952 | 0.0877 | 0.9140 | 0.9884 | 0.9497 | 86 | 0.9494 | 0.9494 | 0.9494 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9497 | 0.9643 | 0.9570 | 0.9895 | | 0.0013 | 63.0 | 6048 | 0.0885 | 0.9032 | 0.9767 | 0.9385 | 86 | 0.9448 | 0.9607 | 0.9526 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9475 | 0.9668 | 0.9571 | 0.9895 | | 0.0009 | 64.0 | 6144 | 0.0825 | 0.9032 | 0.9767 | 0.9385 | 86 | 0.9605 | 0.9551 | 0.9577 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9545 | 0.9643 | 0.9594 | 0.9900 | | 0.0003 | 65.0 | 6240 | 0.0838 | 0.9222 | 0.9651 | 0.9432 | 86 | 0.96 | 0.9438 | 0.9518 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9591 | 0.9566 | 0.9579 | 0.9884 | | 0.0006 | 66.0 | 6336 | 0.0957 | 0.9032 | 0.9767 | 0.9385 | 86 | 0.96 | 0.9438 | 0.9518 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9543 | 0.9592 | 0.9567 | 0.9887 | | 0.0004 | 67.0 | 6432 | 0.1129 | 0.8925 | 0.9651 | 0.9274 | 86 | 0.9649 | 0.9270 | 0.9456 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9538 | 0.9490 | 0.9514 | 0.9879 | | 0.0003 | 68.0 | 6528 | 0.1161 | 0.8936 | 0.9767 | 0.9333 | 86 | 0.9538 | 0.9270 | 0.9402 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9467 | 0.9515 | 0.9491 | 0.9870 | | 0.0002 | 69.0 | 6624 | 0.1234 | 0.8936 | 0.9767 | 0.9333 | 86 | 0.9645 | 0.9157 | 0.9395 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9488 | 0.9464 | 0.9476 | 0.9862 | | 0.0006 | 70.0 | 6720 | 0.1162 | 0.9231 | 0.9767 | 0.9492 | 86 | 0.9651 | 0.9326 | 0.9486 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9614 | 0.9541 | 0.9577 | 0.9884 | | 0.0002 | 71.0 | 6816 | 0.1107 | 0.9333 | 0.9767 | 0.9545 | 86 | 0.96 | 0.9438 | 0.9518 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9616 | 0.9592 | 0.9604 | 0.9879 | | 0.0002 | 72.0 | 6912 | 0.1121 | 0.9231 | 0.9767 | 0.9492 | 86 | 0.9598 | 0.9382 | 0.9489 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9591 | 0.9566 | 0.9579 | 0.9879 | | 0.0002 | 73.0 | 7008 | 0.1122 | 0.9231 | 
0.9767 | 0.9492 | 86 | 0.9543 | 0.9382 | 0.9462 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9566 | 0.9566 | 0.9566 | 0.9881 | | 0.0005 | 74.0 | 7104 | 0.1127 | 0.9231 | 0.9767 | 0.9492 | 86 | 0.9543 | 0.9382 | 0.9462 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9566 | 0.9566 | 0.9566 | 0.9873 | | 0.0004 | 75.0 | 7200 | 0.1170 | 0.9130 | 0.9767 | 0.9438 | 86 | 0.9540 | 0.9326 | 0.9432 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9492 | 0.9541 | 0.9517 | 0.9862 | | 0.0003 | 76.0 | 7296 | 0.1089 | 0.9333 | 0.9767 | 0.9545 | 86 | 0.9444 | 0.9551 | 0.9497 | 178 | 0.9762 | 0.9609 | 0.9685 | 128 | 0.9520 | 0.9617 | 0.9569 | 0.9892 | | 0.001 | 77.0 | 7392 | 0.1082 | 0.9231 | 0.9767 | 0.9492 | 86 | 0.9503 | 0.9663 | 0.9582 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9524 | 0.9694 | 0.9608 | 0.9895 | | 0.0012 | 78.0 | 7488 | 0.1009 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9330 | 0.9382 | 0.9356 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9373 | 0.9541 | 0.9456 | 0.9862 | | 0.0002 | 79.0 | 7584 | 0.1051 | 0.8632 | 0.9535 | 0.9061 | 86 | 0.9489 | 0.9382 | 0.9435 | 178 | 0.976 | 0.9531 | 0.9644 | 128 | 0.9369 | 0.9464 | 0.9416 | 0.9865 | | 0.0002 | 80.0 | 7680 | 0.1108 | 0.8723 | 0.9535 | 0.9111 | 86 | 0.9540 | 0.9326 | 0.9432 | 178 | 0.976 | 0.9531 | 0.9644 | 128 | 0.9415 | 0.9439 | 0.9427 | 0.9865 | | 0.0005 | 81.0 | 7776 | 0.1037 | 0.8913 | 0.9535 | 0.9213 | 86 | 0.9543 | 0.9382 | 0.9462 | 178 | 0.9762 | 0.9609 | 0.9685 | 128 | 0.9466 | 0.9490 | 0.9478 | 0.9870 | | 0.0003 | 82.0 | 7872 | 0.1031 | 0.8710 | 0.9419 | 0.9050 | 86 | 0.9540 | 0.9326 | 0.9432 | 178 | 0.976 | 0.9531 | 0.9644 | 128 | 0.9413 | 0.9413 | 0.9413 | 0.9868 | | 0.0003 | 83.0 | 7968 | 0.0996 | 0.9121 | 0.9651 | 0.9379 | 86 | 0.9602 | 0.9494 | 0.9548 | 178 | 0.9685 | 0.9609 | 0.9647 | 128 | 0.9518 | 0.9566 | 0.9542 | 0.9887 | | 0.0002 | 84.0 | 8064 | 0.0987 | 0.9222 | 0.9651 | 0.9432 | 86 | 0.9602 | 0.9494 | 0.9548 | 178 | 0.9685 | 0.9609 | 0.9647 | 128 | 0.9542 | 0.9566 | 0.9554 | 0.9887 | | 0.0004 | 85.0 | 8160 | 0.1017 | 0.9222 | 0.9651 | 0.9432 | 86 | 0.9602 | 0.9494 | 0.9548 | 178 | 0.9685 | 0.9609 | 0.9647 | 128 | 0.9542 | 0.9566 | 0.9554 | 0.9887 | | 0.0002 | 86.0 | 8256 | 0.1018 | 0.9222 | 0.9651 | 0.9432 | 86 | 0.9602 | 0.9494 | 0.9548 | 178 | 0.9685 | 0.9609 | 0.9647 | 128 | 0.9542 | 0.9566 | 0.9554 | 0.9887 | | 0.0001 | 87.0 | 8352 | 0.1017 | 0.9222 | 0.9651 | 0.9432 | 86 | 0.9553 | 0.9607 | 0.9580 | 178 | 0.9685 | 0.9609 | 0.9647 | 128 | 0.9520 | 0.9617 | 0.9569 | 0.9889 | | 0.0002 | 88.0 | 8448 | 0.1028 | 0.9222 | 0.9651 | 0.9432 | 86 | 0.9602 | 0.9494 | 0.9548 | 178 | 0.9685 | 0.9609 | 0.9647 | 128 | 0.9542 | 0.9566 | 0.9554 | 0.9887 | | 0.0001 | 89.0 | 8544 | 0.1033 | 0.9222 | 0.9651 | 0.9432 | 86 | 0.9602 | 0.9494 | 0.9548 | 178 | 0.9685 | 0.9609 | 0.9647 | 128 | 0.9542 | 0.9566 | 0.9554 | 0.9887 | | 0.0002 | 90.0 | 8640 | 0.1026 | 0.9213 | 0.9535 | 0.9371 | 86 | 0.9545 | 0.9438 | 0.9492 | 178 | 0.9762 | 0.9609 | 0.9685 | 128 | 0.9540 | 0.9515 | 0.9527 | 0.9879 | | 0.0002 | 91.0 | 8736 | 0.1024 | 0.9213 | 0.9535 | 0.9371 | 86 | 0.9545 | 0.9438 | 0.9492 | 178 | 0.9762 | 0.9609 | 0.9685 | 128 | 0.9540 | 0.9515 | 0.9527 | 0.9879 | | 0.0002 | 92.0 | 8832 | 0.1025 | 0.9213 | 0.9535 | 0.9371 | 86 | 0.9545 | 0.9438 | 0.9492 | 178 | 0.9762 | 0.9609 | 0.9685 | 128 | 0.9540 | 0.9515 | 0.9527 | 0.9879 | | 0.0002 | 93.0 | 8928 | 0.1039 | 0.9213 | 0.9535 | 0.9371 | 86 | 0.9545 | 0.9438 | 0.9492 | 178 | 0.9762 | 0.9609 | 0.9685 | 128 | 0.9540 | 0.9515 | 0.9527 | 0.9879 | | 0.0001 | 94.0 | 9024 | 0.1034 | 0.9213 | 0.9535 | 0.9371 | 86 | 0.9545 
| 0.9438 | 0.9492 | 178 | 0.9762 | 0.9609 | 0.9685 | 128 | 0.9540 | 0.9515 | 0.9527 | 0.9879 | | 0.0001 | 95.0 | 9120 | 0.1036 | 0.9213 | 0.9535 | 0.9371 | 86 | 0.9545 | 0.9438 | 0.9492 | 178 | 0.9762 | 0.9609 | 0.9685 | 128 | 0.9540 | 0.9515 | 0.9527 | 0.9879 | | 0.0001 | 96.0 | 9216 | 0.1087 | 0.8925 | 0.9651 | 0.9274 | 86 | 0.9538 | 0.9270 | 0.9402 | 178 | 0.9762 | 0.9609 | 0.9685 | 128 | 0.9464 | 0.9464 | 0.9464 | 0.9873 | | 0.0005 | 97.0 | 9312 | 0.1056 | 0.8925 | 0.9651 | 0.9274 | 86 | 0.9538 | 0.9270 | 0.9402 | 178 | 0.9685 | 0.9609 | 0.9647 | 128 | 0.9440 | 0.9464 | 0.9452 | 0.9876 | | 0.0003 | 98.0 | 9408 | 0.1045 | 0.8925 | 0.9651 | 0.9274 | 86 | 0.9538 | 0.9270 | 0.9402 | 178 | 0.9685 | 0.9609 | 0.9647 | 128 | 0.9440 | 0.9464 | 0.9452 | 0.9876 | | 0.0001 | 99.0 | 9504 | 0.1047 | 0.8925 | 0.9651 | 0.9274 | 86 | 0.9538 | 0.9270 | 0.9402 | 178 | 0.9685 | 0.9609 | 0.9647 | 128 | 0.9440 | 0.9464 | 0.9452 | 0.9876 | | 0.0002 | 100.0 | 9600 | 0.1047 | 0.8925 | 0.9651 | 0.9274 | 86 | 0.9538 | 0.9270 | 0.9402 | 178 | 0.9685 | 0.9609 | 0.9647 | 128 | 0.9440 | 0.9464 | 0.9452 | 0.9876 | ### Framework versions - Transformers 4.39.3 - Pytorch 2.3.0+cu121 - Datasets 2.19.1 - Tokenizers 0.15.2
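The card ships no inference snippet; as a hedged sketch, the checkpoint can be queried through the standard token-classification pipeline. The example sentence is illustrative, and the entity labels are assumed to match the Location/Organization/Person set reported in the evaluation above.

```python
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="apwic/nerui-base-3",
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)
# Indonesian example sentence (illustrative)
print(ner("Presiden Joko Widodo mengunjungi kantor Pertamina di Jakarta."))
```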
harveybro/molt5-augmented-default-400-base-caption2smiles
harveybro
2024-06-04T00:49:43Z
107
0
transformers
[ "transformers", "safetensors", "t5", "text2text-generation", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text2text-generation
2024-06-04T00:49:10Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
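The "How to Get Started" section above is empty. As a hedged sketch only: the repository name suggests a MolT5-style caption-to-SMILES task, which would be driven through the standard T5 seq2seq interface. The input caption below is illustrative, not from the card.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "harveybro/molt5-augmented-default-400-base-caption2smiles"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

caption = "The molecule is a colorless liquid used as a solvent."  # illustrative input
inputs = tokenizer(caption, return_tensors="pt")
outputs = model.generate(**inputs, num_beams=5, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))  # predicted SMILES string
```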
datek/Qwen-Qwen1.5-7B-1717461786
datek
2024-06-04T00:47:06Z
5
0
transformers
[ "transformers", "safetensors", "qwen2", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-06-04T00:43:13Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
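The card's "How to Get Started" section is empty. The repo tags indicate a Qwen1.5-style conversational model, so a hedged sketch using the standard transformers chat-template flow might look like this (prompt and generation settings are illustrative):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "datek/Qwen-Qwen1.5-7B-1717461786"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [{"role": "user", "content": "Give me a short introduction to large language models."}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
# decode only the newly generated tokens, not the prompt
print(tokenizer.decode(outputs[0][inputs.input_ids.shape[-1]:], skip_special_tokens=True))
```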
RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf
RichardErkhov
2024-06-04T00:45:17Z
25
0
null
[ "gguf", "endpoints_compatible", "region:us" ]
null
2024-06-03T19:49:45Z
Quantization made by Richard Erkhov. [Github](https://github.com/RichardErkhov) [Discord](https://discord.gg/pvy7H8DZMG) [Request more models](https://github.com/RichardErkhov/quant_request) DarkForest-20B-v1.1 - GGUF - Model creator: https://huggingface.co/TeeZee/ - Original model: https://huggingface.co/TeeZee/DarkForest-20B-v1.1/ | Name | Quant method | Size | | ---- | ---- | ---- | | [DarkForest-20B-v1.1.Q2_K.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.Q2_K.gguf) | Q2_K | 6.91GB | | [DarkForest-20B-v1.1.IQ3_XS.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.IQ3_XS.gguf) | IQ3_XS | 7.63GB | | [DarkForest-20B-v1.1.IQ3_S.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.IQ3_S.gguf) | IQ3_S | 8.06GB | | [DarkForest-20B-v1.1.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.Q3_K_S.gguf) | Q3_K_S | 6.24GB | | [DarkForest-20B-v1.1.IQ3_M.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.IQ3_M.gguf) | IQ3_M | 8.53GB | | [DarkForest-20B-v1.1.Q3_K.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.Q3_K.gguf) | Q3_K | 9.04GB | | [DarkForest-20B-v1.1.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.Q3_K_M.gguf) | Q3_K_M | 9.04GB | | [DarkForest-20B-v1.1.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.Q3_K_L.gguf) | Q3_K_L | 9.9GB | | [DarkForest-20B-v1.1.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.IQ4_XS.gguf) | IQ4_XS | 10.01GB | | [DarkForest-20B-v1.1.Q4_0.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.Q4_0.gguf) | Q4_0 | 7.95GB | | [DarkForest-20B-v1.1.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.IQ4_NL.gguf) | IQ4_NL | 1.88GB | | [DarkForest-20B-v1.1.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.Q4_K_S.gguf) | Q4_K_S | 1.42GB | | [DarkForest-20B-v1.1.Q4_K.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.Q4_K.gguf) | Q4_K | 1.13GB | | [DarkForest-20B-v1.1.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.Q4_K_M.gguf) | Q4_K_M | 0.63GB | | [DarkForest-20B-v1.1.Q4_1.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.Q4_1.gguf) | Q4_1 | 0.55GB | | [DarkForest-20B-v1.1.Q5_0.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.Q5_0.gguf) | Q5_0 | 0.53GB | | [DarkForest-20B-v1.1.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.Q5_K_S.gguf) | Q5_K_S | 0.3GB | | [DarkForest-20B-v1.1.Q5_K.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.Q5_K.gguf) | Q5_K | 13.18GB | | [DarkForest-20B-v1.1.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.Q5_K_M.gguf) | 
Q5_K_M | 2.78GB | | [DarkForest-20B-v1.1.Q5_1.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.Q5_1.gguf) | Q5_1 | 13.98GB | | [DarkForest-20B-v1.1.Q6_K.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.Q6_K.gguf) | Q6_K | 9.18GB | | [DarkForest-20B-v1.1.Q8_0.gguf](https://huggingface.co/RichardErkhov/TeeZee_-_DarkForest-20B-v1.1-gguf/blob/main/DarkForest-20B-v1.1.Q8_0.gguf) | Q8_0 | 7.48GB | Original model description: --- license: other tags: - merge - not-for-all-audiences license_name: microsoft-research-license model-index: - name: DarkForest-20B-v1.2 results: - task: type: text-generation name: Text Generation dataset: name: AI2 Reasoning Challenge (25-Shot) type: ai2_arc config: ARC-Challenge split: test args: num_few_shot: 25 metrics: - type: acc_norm value: 63.57 name: normalized accuracy source: url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=TeeZee/DarkForest-20B-v1.2 name: Open LLM Leaderboard - task: type: text-generation name: Text Generation dataset: name: HellaSwag (10-Shot) type: hellaswag split: validation args: num_few_shot: 10 metrics: - type: acc_norm value: 86.42 name: normalized accuracy source: url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=TeeZee/DarkForest-20B-v1.2 name: Open LLM Leaderboard - task: type: text-generation name: Text Generation dataset: name: MMLU (5-Shot) type: cais/mmlu config: all split: test args: num_few_shot: 5 metrics: - type: acc value: 59.77 name: accuracy source: url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=TeeZee/DarkForest-20B-v1.2 name: Open LLM Leaderboard - task: type: text-generation name: Text Generation dataset: name: TruthfulQA (0-shot) type: truthful_qa config: multiple_choice split: validation args: num_few_shot: 0 metrics: - type: mc2 value: 56.31 source: url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=TeeZee/DarkForest-20B-v1.2 name: Open LLM Leaderboard - task: type: text-generation name: Text Generation dataset: name: Winogrande (5-shot) type: winogrande config: winogrande_xl split: validation args: num_few_shot: 5 metrics: - type: acc value: 77.74 name: accuracy source: url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=TeeZee/DarkForest-20B-v1.2 name: Open LLM Leaderboard - task: type: text-generation name: Text Generation dataset: name: GSM8k (5-shot) type: gsm8k config: main split: test args: num_few_shot: 5 metrics: - type: acc value: 24.94 name: accuracy source: url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=TeeZee/DarkForest-20B-v1.2 name: Open LLM Leaderboard --- # DarkForest 20B v1.1 ![image/png](https://huggingface.co/TeeZee/DarkForest-20B-v1.1/resolve/main/DarkForest_v1.1.jpg) ## Model Details - To create this model two step procedure was used. 
First a new 20B model was created using [microsoft/Orca-2-13b](https://huggingface.co/microsoft/Orca-2-13b) and [KoboldAI/LLaMA2-13B-Erebus-v3](https://huggingface.co/KoboldAI/LLaMA2-13B-Erebus-v3), details of the merge in [mergekit-config_step1.yml](https://huggingface.co/TeeZee/DarkForest-20B-v1.0/resolve/main/mergekit-config_step1.yml)
- then [jebcarter/psyonic-cetacean-20B](https://huggingface.co/jebcarter/psyonic-cetacean-20B) was used to produce the final model, merge config in [mergekit-config-step2.yml](https://huggingface.co/TeeZee/DarkForest-20B-v1.1/resolve/main/mergekit-config-step2.yml)
- instead of the linear merge method used in v1.0, this time the DARE TIES method was used for step 2
- The resulting model has approximately 20 billion parameters.

**Warning: This model can produce NSFW content!**

## Results

- produces SFW and NSFW content without issues, switches context seamlessly.
- good at following instructions.
- good at tracking multiple characters in one scene.
- very creative; scenarios produced are mature and complicated, and the model doesn't shy away from writing about PTSD, mental issues or complicated relationships.
- NSFW output is more creative and surprising than typical LimaRP output.
- definitely for mature audiences, not only because of vivid NSFW content but also because of the overall maturity of the stories it produces.
- This is NOT Harry Potter level storytelling.

All comments are greatly appreciated; download, test, and if you appreciate my work, consider buying me my fuel:

<a href="https://www.buymeacoffee.com/TeeZee" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" style="height: 60px !important;width: 217px !important;" ></a>

# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)

Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__DarkForest-20B-v1.2)

| Metric                          |Value|
|---------------------------------|----:|
|Avg.                             |61.46|
|AI2 Reasoning Challenge (25-Shot)|63.57|
|HellaSwag (10-Shot)              |86.42|
|MMLU (5-Shot)                    |59.77|
|TruthfulQA (0-shot)              |56.31|
|Winogrande (5-shot)              |77.74|
|GSM8k (5-shot)                   |24.94|
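The card lists the quantized GGUF files but no loading example. One hedged option is llama-cpp-python; the quant choice, prompt, and context size below are illustrative, and the path assumes the file from the table above has already been downloaded locally.

```python
from llama_cpp import Llama

# assumes the Q4_K_M file from the table above was downloaded next to this script
llm = Llama(model_path="DarkForest-20B-v1.1.Q4_K_M.gguf", n_ctx=4096)
out = llm("Write the opening line of a dark fantasy story:", max_tokens=128)
print(out["choices"][0]["text"])
```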
zzha6204/languagebind-mlp
zzha6204
2024-06-04T00:44:05Z
0
0
null
[ "multimodal", "classification", "content detection", "license:mit", "region:us" ]
null
2024-06-04T00:36:09Z
--- license: mit tags: - multimodal - classification - content detection ---
zzha6204/imagebind-mlp
zzha6204
2024-06-04T00:43:07Z
0
0
null
[ "multimodal", "classification", "content detection", "license:mit", "region:us" ]
null
2024-06-04T00:29:36Z
--- license: mit tags: - multimodal - classification - content detection ---
abdurrahman22224/distilbert-finetuned-emotion_output
abdurrahman22224
2024-06-04T00:42:39Z
108
0
transformers
[ "transformers", "tensorboard", "safetensors", "distilbert", "text-classification", "generated_from_trainer", "dataset:emotion", "base_model:distilbert/distilbert-base-uncased", "base_model:finetune:distilbert/distilbert-base-uncased", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2024-06-04T00:38:08Z
--- license: apache-2.0 base_model: distilbert-base-uncased tags: - generated_from_trainer datasets: - emotion metrics: - accuracy - f1 model-index: - name: distilbert-finetuned-emotion_output results: - task: name: Text Classification type: text-classification dataset: name: emotion type: emotion config: split split: validation args: split metrics: - name: Accuracy type: accuracy value: 0.9285 - name: F1 type: f1 value: 0.9285881569186282 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-finetuned-emotion_output This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset. It achieves the following results on the evaluation set: - Loss: 0.2084 - Accuracy: 0.9285 - F1: 0.9286 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | No log | 1.0 | 250 | 0.2941 | 0.911 | 0.9102 | | 0.5131 | 2.0 | 500 | 0.2084 | 0.9285 | 0.9286 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.3.0+cu121 - Datasets 2.19.2 - Tokenizers 0.19.1
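As a hedged inference sketch (not part of the generated card), the fine-tuned checkpoint can be queried through the standard text-classification pipeline; the example sentence is illustrative.

```python
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="abdurrahman22224/distilbert-finetuned-emotion_output",
)
# returns the top emotion label and its score
print(classifier("I can't wait to see you this weekend!"))
```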
h104/SN6
h104
2024-06-04T00:37:18Z
151
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-05-18T14:20:34Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
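The card's "How to Get Started" section is empty. Based only on the repo tags (llama architecture, text-generation pipeline), a hedged sketch using the generic transformers pipeline might look like this; the prompt is illustrative.

```python
from transformers import pipeline

# architecture per the repo tags; nothing else about the model is documented
generator = pipeline("text-generation", model="h104/SN6")
print(generator("The meaning of life is", max_new_tokens=64)[0]["generated_text"])
```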
fangphattha/a2c-PandaReachDense-v3
fangphattha
2024-06-04T00:31:38Z
0
0
stable-baselines3
[ "stable-baselines3", "PandaReachDense-v3", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
reinforcement-learning
2024-06-04T00:26:08Z
---
library_name: stable-baselines3
tags:
- PandaReachDense-v3
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: A2C
  results:
  - task:
      type: reinforcement-learning
      name: reinforcement-learning
    dataset:
      name: PandaReachDense-v3
      type: PandaReachDense-v3
    metrics:
    - type: mean_reward
      value: -0.17 +/- 0.11
      name: mean_reward
      verified: false
---

# **A2C** Agent playing **PandaReachDense-v3**

This is a trained model of an **A2C** agent playing **PandaReachDense-v3** using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).

## Usage (with Stable-baselines3)

A minimal loading sketch (the checkpoint filename follows the usual huggingface_sb3 naming convention and is an assumption):

```python
from stable_baselines3 import A2C
from huggingface_sb3 import load_from_hub

# filename assumed to follow the standard "<algo>-<env>.zip" Hub convention
checkpoint = load_from_hub("fangphattha/a2c-PandaReachDense-v3", "a2c-PandaReachDense-v3.zip")
model = A2C.load(checkpoint)
```
Aitrepreneur/ToonCrafter-fp16
Aitrepreneur
2024-06-04T00:28:35Z
0
7
null
[ "region:us" ]
null
2024-06-03T23:12:02Z
## ___***ToonCrafter: Generative Cartoon Interpolation***___ <!-- ![](./assets/logo_long.png#gh-light-mode-only){: width="50%"} --> <!-- ![](./assets/logo_long_dark.png#gh-dark-mode-only=100x20) --> <div align="center"> </div> ## 🔆 Introduction ⚠️ Please check our [disclaimer](#disc) first. 🤗 ToonCrafter can interpolate two cartoon images by leveraging the pre-trained image-to-video diffusion priors. Please check our project page and paper for more information. <br> ### 1.1 Showcases (512x320) <table class="center"> <tr style="font-weight: bolder;text-align:center;"> <td>Input starting frame</td> <td>Input ending frame</td> <td>Generated video</td> </tr> <tr> <td> <img src=assets/72109_125.mp4_00-00.png width="250"> </td> <td> <img src=assets/72109_125.mp4_00-01.png width="250"> </td> <td> <img src=assets/00.gif width="250"> </td> </tr> <tr> <td> <img src=assets/Japan_v2_2_062266_s2_frame1.png width="250"> </td> <td> <img src=assets/Japan_v2_2_062266_s2_frame3.png width="250"> </td> <td> <img src=assets/03.gif width="250"> </td> </tr> <tr> <td> <img src=assets/Japan_v2_1_070321_s3_frame1.png width="250"> </td> <td> <img src=assets/Japan_v2_1_070321_s3_frame3.png width="250"> </td> <td> <img src=assets/02.gif width="250"> </td> </tr> <tr> <td> <img src=assets/74302_1349_frame1.png width="250"> </td> <td> <img src=assets/74302_1349_frame3.png width="250"> </td> <td> <img src=assets/01.gif width="250"> </td> </tr> </table> ### 1.2 Sparse sketch guidance <table class="center"> <tr style="font-weight: bolder;text-align:center;"> <td>Input starting frame</td> <td>Input ending frame</td> <td>Input sketch guidance</td> <td>Generated video</td> </tr> <tr> <td> <img src=assets/72105_388.mp4_00-00.png width="200"> </td> <td> <img src=assets/72105_388.mp4_00-01.png width="200"> </td> <td> <img src=assets/06.gif width="200"> </td> <td> <img src=assets/07.gif width="200"> </td> </tr> <tr> <td> <img src=assets/72110_255.mp4_00-00.png width="200"> </td> <td> <img src=assets/72110_255.mp4_00-01.png width="200"> </td> <td> <img src=assets/12.gif width="200"> </td> <td> <img src=assets/13.gif width="200"> </td> </tr> </table> ### 2. Applications #### 2.1 Cartoon Sketch Interpolation (see project page for more details) <table class="center"> <tr style="font-weight: bolder;text-align:center;"> <td>Input starting frame</td> <td>Input ending frame</td> <td>Generated video</td> </tr> <tr> <td> <img src=assets/frame0001_10.png width="250"> </td> <td> <img src=assets/frame0016_10.png width="250"> </td> <td> <img src=assets/10.gif width="250"> </td> </tr> <tr> <td> <img src=assets/frame0001_11.png width="250"> </td> <td> <img src=assets/frame0016_11.png width="250"> </td> <td> <img src=assets/11.gif width="250"> </td> </tr> </table> #### 2.2 Reference-based Sketch Colorization <table class="center"> <tr style="font-weight: bolder;text-align:center;"> <td>Input sketch</td> <td>Input reference</td> <td>Colorization results</td> </tr> <tr> <td> <img src=assets/04.gif width="250"> </td> <td> <img src=assets/frame0001_05.png width="250"> </td> <td> <img src=assets/05.gif width="250"> </td> </tr> <tr> <td> <img src=assets/08.gif width="250"> </td> <td> <img src=assets/frame0001_09.png width="250"> </td> <td> <img src=assets/09.gif width="250"> </td> </tr> </table> ## 📝 Changelog - [ ] Add sketch control and colorization function. - __[2024.05.29]__: 🔥🔥 Release code and model weights. - __[2024.05.28]__: Launch the project page and update the arXiv preprint. <br> ## 🧰 Models |Model|Resolution|GPU Mem. 
& Inference Time (A100, DDIM 50 steps)|Checkpoint| |:---------|:---------|:--------|:--------| |ToonCrafter_512|320x512| TBD (`perframe_ae=True`)|[Hugging Face](https://huggingface.co/Doubiiu/ToonCrafter/blob/main/model.ckpt)| Currently, ToonCrafter supports generating videos of up to 16 frames at a resolution of 512x320. The inference time can be reduced by using fewer DDIM steps. ## ⚙️ Setup ### Install Environment via Anaconda (Recommended) ```bash conda create -n tooncrafter python=3.8.5 conda activate tooncrafter pip install -r requirements.txt ``` ## 💫 Inference ### 1. Command line Download the pretrained ToonCrafter_512 and put the `model.ckpt` in `checkpoints/tooncrafter_512_interp_v1/model.ckpt` (a scripted download sketch is included at the end of this card). ```bash sh scripts/run.sh ``` ### 2. Local Gradio demo Download the pretrained model and put it in the corresponding directory as described above. ```bash python gradio_app.py ``` <!-- ## 🤝 Community Support --> <a name="disc"></a> ## 📢 Disclaimer Calm down. Our framework opens up the era of generative cartoon interpolation, but given the variability of the generative video prior, the success rate is not guaranteed. ⚠️ This is an open-source research exploration, not a commercial product, and it may not meet all your expectations. This project strives to impact the domain of AI-driven video generation positively. Users are free to create videos with this tool, but they are expected to comply with local laws and use it responsibly. The developers do not assume any responsibility for potential misuse by users.
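As a convenience, the checkpoint linked in the table above can also be fetched programmatically. A minimal sketch using `huggingface_hub`; the repo ID and filename follow the Hugging Face link in the table, and the local directory mirrors the path the command-line instructions expect:

```python
from huggingface_hub import hf_hub_download

# Download model.ckpt from the ToonCrafter repo linked above and place it
# where scripts/run.sh expects it (checkpoints/tooncrafter_512_interp_v1/).
ckpt_path = hf_hub_download(
    repo_id="Doubiiu/ToonCrafter",
    filename="model.ckpt",
    local_dir="checkpoints/tooncrafter_512_interp_v1",
)
print(ckpt_path)
```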
ehottl/distilbert-base-uncased-finetuned-emotion
ehottl
2024-06-04T00:21:31Z
121
0
transformers
[ "transformers", "tensorboard", "safetensors", "distilbert", "text-classification", "generated_from_trainer", "dataset:emotion", "base_model:distilbert/distilbert-base-uncased", "base_model:finetune:distilbert/distilbert-base-uncased", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2024-06-04T00:10:46Z
--- license: apache-2.0 base_model: distilbert-base-uncased tags: - generated_from_trainer datasets: - emotion metrics: - accuracy - f1 model-index: - name: distilbert-base-uncased-finetuned-emotion results: - task: name: Text Classification type: text-classification dataset: name: emotion type: emotion config: split split: validation args: split metrics: - name: Accuracy type: accuracy value: 0.929 - name: F1 type: f1 value: 0.9290384064576098 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-emotion This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset. It achieves the following results on the evaluation set: - Loss: 0.2064 - Accuracy: 0.929 - F1: 0.9290 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.8175 | 1.0 | 250 | 0.2950 | 0.911 | 0.9108 | | 0.238 | 2.0 | 500 | 0.2064 | 0.929 | 0.9290 | ### Framework versions - Transformers 4.41.1 - Pytorch 2.3.0+cu121 - Datasets 2.19.2 - Tokenizers 0.19.1
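Since the card leaves the usage section as a placeholder, here is a minimal sketch of how a classifier like this is typically queried with the `transformers` pipeline API. The label set comes from the `emotion` dataset; the exact label names are not stated in the card, so they are simply printed from the model's own output:

```python
from transformers import pipeline

# Load the fine-tuned emotion classifier from the Hub.
classifier = pipeline(
    "text-classification",
    model="ehottl/distilbert-base-uncased-finetuned-emotion",
)

# top_k=None returns scores for every emotion label, not just the best one.
print(classifier("I can't wait to see my friends this weekend!", top_k=None))
```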
dbands/llama-3-8b-instruct-code_bagel_hermes-2-5-blender-f16
dbands
2024-06-04T00:20:29Z
0
0
transformers
[ "transformers", "text-generation-inference", "unsloth", "llama", "gguf", "en", "base_model:unsloth/llama-3-8b-bnb-4bit", "base_model:finetune:unsloth/llama-3-8b-bnb-4bit", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2024-06-04T00:20:27Z
--- language: - en license: apache-2.0 tags: - text-generation-inference - transformers - unsloth - llama - gguf base_model: unsloth/llama-3-8b-bnb-4bit --- # Uploaded model - **Developed by:** dbands - **License:** apache-2.0 - **Finetuned from model :** unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
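The card carries a `gguf` tag but no usage snippet. A hedged sketch with `llama-cpp-python`, which can pull GGUF files straight from the Hub; the `filename` glob below is an assumption, since the card does not name the GGUF file:

```python
from llama_cpp import Llama

# Load a GGUF file directly from the Hub repo. The filename pattern is an
# assumption; check the repo's file list for the actual .gguf name.
llm = Llama.from_pretrained(
    repo_id="dbands/llama-3-8b-instruct-code_bagel_hermes-2-5-blender-f16",
    filename="*.gguf",
    n_ctx=4096,
)

out = llm("Write a Python function that reverses a string.", max_tokens=128)
print(out["choices"][0]["text"])
```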
T3Zhang/mymodel
T3Zhang
2024-06-04T00:20:17Z
140
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-05-18T15:21:49Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
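The tags mark this as a conversational Llama-architecture text-generation model, though the card itself is an empty template. A generic, hedged sketch of loading it with `transformers` and a chat template; it assumes the repo ships a tokenizer with a chat template, which the card does not confirm:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "T3Zhang/mymodel"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo, torch_dtype=torch.float16, device_map="auto"
)

messages = [{"role": "user", "content": "Summarize what you can do in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=64)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```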
azmoulai/vizwiz-blip-model
azmoulai
2024-06-04T00:16:24Z
6
0
transformers
[ "transformers", "safetensors", "blip", "visual-question-answering", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
visual-question-answering
2024-05-29T04:12:54Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
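The pipeline tag identifies this as a BLIP visual-question-answering checkpoint, presumably fine-tuned on VizWiz given the repo name. A hedged sketch with the standard BLIP classes from `transformers`; the image URL is a placeholder, not a file from this repo:

```python
import requests
from PIL import Image
from transformers import BlipProcessor, BlipForQuestionAnswering

repo = "azmoulai/vizwiz-blip-model"
processor = BlipProcessor.from_pretrained(repo)
model = BlipForQuestionAnswering.from_pretrained(repo)

# Any RGB image works here; this URL is just an illustrative placeholder.
image = Image.open(
    requests.get("https://example.com/photo.jpg", stream=True).raw
).convert("RGB")
inputs = processor(image, "What is in the picture?", return_tensors="pt")

answer_ids = model.generate(**inputs)
print(processor.decode(answer_ids[0], skip_special_tokens=True))
```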
apwic/nerui-base-0
apwic
2024-06-04T00:16:00Z
8
0
transformers
[ "transformers", "tensorboard", "safetensors", "bert", "token-classification", "generated_from_trainer", "id", "base_model:indolem/indobert-base-uncased", "base_model:finetune:indolem/indobert-base-uncased", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
token-classification
2024-05-28T03:49:20Z
--- language: - id license: mit base_model: indolem/indobert-base-uncased tags: - generated_from_trainer model-index: - name: nerui-base-0 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # nerui-base-0 This model is a fine-tuned version of [indolem/indobert-base-uncased](https://huggingface.co/indolem/indobert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.1084 - Location Precision: 0.89 - Location Recall: 0.9468 - Location F1: 0.9175 - Location Number: 94 - Organization Precision: 0.9387 - Organization Recall: 0.9162 - Organization F1: 0.9273 - Organization Number: 167 - Person Precision: 1.0 - Person Recall: 0.9781 - Person F1: 0.9889 - Person Number: 137 - Overall Precision: 0.9471 - Overall Recall: 0.9447 - Overall F1: 0.9459 - Overall Accuracy: 0.9887 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 100.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy | |:-------------:|:-----:|:----:|:---------------:|:------------------:|:---------------:|:-----------:|:---------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:----------------:|:-------------:|:---------:|:-------------:|:-----------------:|:--------------:|:----------:|:----------------:| | 0.2566 | 1.0 | 96 | 0.0455 | 0.9634 | 0.8404 | 0.8977 | 94 | 0.8333 | 0.9281 | 0.8782 | 167 | 0.9708 | 0.9708 | 0.9708 | 137 | 0.9062 | 0.9221 | 0.9141 | 0.9843 | | 0.0617 | 2.0 | 192 | 0.0519 | 0.8381 | 0.9362 | 0.8844 | 94 | 0.8896 | 0.8683 | 0.8788 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9107 | 0.9221 | 0.9164 | 0.9834 | | 0.0356 | 3.0 | 288 | 0.0534 | 0.9062 | 0.9255 | 0.9158 | 94 | 0.8211 | 0.9341 | 0.8739 | 167 | 1.0 | 0.9708 | 0.9852 | 137 | 0.8974 | 0.9447 | 0.9204 | 0.9840 | | 0.0235 | 4.0 | 384 | 0.0525 | 0.8866 | 0.9149 | 0.9005 | 94 | 0.9006 | 0.9222 | 0.9112 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9303 | 0.9397 | 0.9350 | 0.9856 | | 0.0156 | 5.0 | 480 | 0.0623 | 0.9032 | 0.8936 | 0.8984 | 94 | 0.9333 | 0.9222 | 0.9277 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9466 | 0.9347 | 0.9406 | 0.9873 | | 0.0101 | 6.0 | 576 | 0.0590 | 0.9043 | 0.9043 | 0.9043 | 94 | 0.8929 | 0.8982 | 0.8955 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9295 | 0.9271 | 0.9283 | 0.9859 | | 0.0091 | 7.0 | 672 | 0.0955 | 0.8036 | 0.9574 | 0.8738 | 94 | 0.9211 | 0.8383 | 0.8777 | 167 | 0.9643 | 0.9854 | 0.9747 | 137 | 0.9035 | 0.9171 | 0.9102 | 0.9809 | | 0.0084 | 8.0 | 768 | 0.0871 | 0.8365 | 0.9255 | 0.8788 | 94 | 0.9062 | 0.8683 | 0.8869 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9196 | 0.9196 | 0.9196 | 0.9826 | | 0.007 | 9.0 | 864 | 0.0629 | 0.9565 | 0.9362 | 0.9462 | 94 | 0.8895 | 0.9162 | 0.9027 | 167 | 1.0 | 
0.9854 | 0.9926 | 137 | 0.9424 | 0.9447 | 0.9435 | 0.9881 | | 0.0047 | 10.0 | 960 | 0.0564 | 0.9167 | 0.9362 | 0.9263 | 94 | 0.9512 | 0.9341 | 0.9426 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9594 | 0.9497 | 0.9545 | 0.9901 | | 0.0043 | 11.0 | 1056 | 0.0829 | 0.9158 | 0.9255 | 0.9206 | 94 | 0.8708 | 0.9281 | 0.8986 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9216 | 0.9447 | 0.9330 | 0.9856 | | 0.0034 | 12.0 | 1152 | 0.0779 | 0.9247 | 0.9149 | 0.9198 | 94 | 0.8667 | 0.9341 | 0.8991 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9216 | 0.9447 | 0.9330 | 0.9865 | | 0.0047 | 13.0 | 1248 | 0.0781 | 0.8922 | 0.9681 | 0.9286 | 94 | 0.95 | 0.9102 | 0.9297 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.9474 | 0.9497 | 0.9486 | 0.9862 | | 0.006 | 14.0 | 1344 | 0.0682 | 0.9271 | 0.9468 | 0.9368 | 94 | 0.9236 | 0.8683 | 0.8951 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9509 | 0.9246 | 0.9376 | 0.9859 | | 0.0031 | 15.0 | 1440 | 0.0759 | 0.9149 | 0.9149 | 0.9149 | 94 | 0.8814 | 0.9341 | 0.9070 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9261 | 0.9447 | 0.9353 | 0.9878 | | 0.0049 | 16.0 | 1536 | 0.0801 | 0.9082 | 0.9468 | 0.9271 | 94 | 0.9107 | 0.9162 | 0.9134 | 167 | 0.9574 | 0.9854 | 0.9712 | 137 | 0.9263 | 0.9472 | 0.9366 | 0.9865 | | 0.0036 | 17.0 | 1632 | 0.0933 | 0.9278 | 0.9574 | 0.9424 | 94 | 0.9333 | 0.9222 | 0.9277 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9497 | 0.9497 | 0.9497 | 0.9887 | | 0.0033 | 18.0 | 1728 | 0.0828 | 0.9167 | 0.9362 | 0.9263 | 94 | 0.9167 | 0.9222 | 0.9194 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9424 | 0.9447 | 0.9435 | 0.9870 | | 0.0031 | 19.0 | 1824 | 0.0819 | 0.9149 | 0.9149 | 0.9149 | 94 | 0.9102 | 0.9102 | 0.9102 | 167 | 0.9708 | 0.9708 | 0.9708 | 137 | 0.9322 | 0.9322 | 0.9322 | 0.9873 | | 0.0025 | 20.0 | 1920 | 0.0871 | 0.8969 | 0.9255 | 0.9110 | 94 | 0.9321 | 0.9042 | 0.9179 | 167 | 0.9708 | 0.9708 | 0.9708 | 137 | 0.9369 | 0.9322 | 0.9345 | 0.9878 | | 0.0023 | 21.0 | 2016 | 0.0813 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9162 | 0.9162 | 0.9162 | 167 | 0.9706 | 0.9635 | 0.9670 | 137 | 0.9280 | 0.9397 | 0.9338 | 0.9873 | | 0.0023 | 22.0 | 2112 | 0.0885 | 0.9158 | 0.9255 | 0.9206 | 94 | 0.8814 | 0.9341 | 0.9070 | 167 | 1.0 | 0.9635 | 0.9814 | 137 | 0.9282 | 0.9422 | 0.9352 | 0.9867 | | 0.0018 | 23.0 | 2208 | 0.1209 | 0.8788 | 0.9255 | 0.9016 | 94 | 0.8947 | 0.9162 | 0.9053 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9187 | 0.9372 | 0.9279 | 0.9837 | | 0.0036 | 24.0 | 2304 | 0.0841 | 0.9175 | 0.9468 | 0.9319 | 94 | 0.9029 | 0.9461 | 0.9240 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9338 | 0.9573 | 0.9454 | 0.9878 | | 0.0034 | 25.0 | 2400 | 0.0860 | 0.9368 | 0.9468 | 0.9418 | 94 | 0.9186 | 0.9461 | 0.9322 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9478 | 0.9573 | 0.9525 | 0.9884 | | 0.0029 | 26.0 | 2496 | 0.0684 | 0.9381 | 0.9681 | 0.9529 | 94 | 0.9176 | 0.9341 | 0.9258 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9478 | 0.9573 | 0.9525 | 0.9898 | | 0.0031 | 27.0 | 2592 | 0.1158 | 0.9278 | 0.9574 | 0.9424 | 94 | 0.8933 | 0.9521 | 0.9217 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9341 | 0.9623 | 0.9480 | 0.9865 | | 0.0045 | 28.0 | 2688 | 0.0860 | 0.9263 | 0.9362 | 0.9312 | 94 | 0.8963 | 0.8802 | 0.8882 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9365 | 0.9271 | 0.9318 | 0.9854 | | 0.0018 | 29.0 | 2784 | 0.0869 | 0.9271 | 0.9468 | 0.9368 | 94 | 0.9290 | 0.9401 | 0.9345 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.95 | 0.9548 | 0.9524 | 0.9884 | | 0.0023 | 30.0 | 2880 | 0.1042 | 0.9184 | 0.9574 | 0.9375 | 94 | 0.9394 | 0.9281 | 0.9337 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9547 | 0.9523 | 
0.9535 | 0.9881 | | 0.0028 | 31.0 | 2976 | 0.1003 | 0.9020 | 0.9787 | 0.9388 | 94 | 0.9118 | 0.9281 | 0.9199 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9338 | 0.9573 | 0.9454 | 0.9862 | | 0.0015 | 32.0 | 3072 | 0.0802 | 0.91 | 0.9681 | 0.9381 | 94 | 0.9353 | 0.9521 | 0.9436 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9458 | 0.9648 | 0.9552 | 0.9890 | | 0.0025 | 33.0 | 3168 | 0.0959 | 0.8667 | 0.9681 | 0.9146 | 94 | 0.9375 | 0.8982 | 0.9174 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9398 | 0.9422 | 0.9410 | 0.9862 | | 0.0014 | 34.0 | 3264 | 0.0970 | 0.9184 | 0.9574 | 0.9375 | 94 | 0.9286 | 0.9341 | 0.9313 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.95 | 0.9548 | 0.9524 | 0.9881 | | 0.0017 | 35.0 | 3360 | 0.0790 | 0.9570 | 0.9468 | 0.9519 | 94 | 0.9123 | 0.9341 | 0.9231 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9499 | 0.9523 | 0.9511 | 0.9890 | | 0.002 | 36.0 | 3456 | 0.0912 | 0.9010 | 0.9681 | 0.9333 | 94 | 0.9317 | 0.8982 | 0.9146 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9422 | 0.9422 | 0.9422 | 0.9870 | | 0.0025 | 37.0 | 3552 | 0.1061 | 0.9271 | 0.9468 | 0.9368 | 94 | 0.9030 | 0.8922 | 0.8976 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9418 | 0.9347 | 0.9382 | 0.9865 | | 0.0028 | 38.0 | 3648 | 0.0982 | 0.9184 | 0.9574 | 0.9375 | 94 | 0.9085 | 0.8922 | 0.9003 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9419 | 0.9372 | 0.9395 | 0.9870 | | 0.0022 | 39.0 | 3744 | 0.1061 | 0.8969 | 0.9255 | 0.9110 | 94 | 0.8953 | 0.9222 | 0.9086 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9305 | 0.9422 | 0.9363 | 0.9848 | | 0.0018 | 40.0 | 3840 | 0.1077 | 0.8980 | 0.9362 | 0.9167 | 94 | 0.9202 | 0.8982 | 0.9091 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9418 | 0.9347 | 0.9382 | 0.9862 | | 0.002 | 41.0 | 3936 | 0.0923 | 0.8980 | 0.9362 | 0.9167 | 94 | 0.9325 | 0.9102 | 0.9212 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9468 | 0.9397 | 0.9433 | 0.9870 | | 0.003 | 42.0 | 4032 | 0.0899 | 0.9053 | 0.9149 | 0.9101 | 94 | 0.9112 | 0.9222 | 0.9167 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.935 | 0.9397 | 0.9373 | 0.9862 | | 0.0027 | 43.0 | 4128 | 0.0827 | 0.9355 | 0.9255 | 0.9305 | 94 | 0.9277 | 0.9222 | 0.9249 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9542 | 0.9422 | 0.9482 | 0.9878 | | 0.0015 | 44.0 | 4224 | 0.0798 | 0.9149 | 0.9149 | 0.9149 | 94 | 0.9102 | 0.9102 | 0.9102 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9418 | 0.9347 | 0.9382 | 0.9878 | | 0.0011 | 45.0 | 4320 | 0.0868 | 0.8958 | 0.9149 | 0.9053 | 94 | 0.9313 | 0.8922 | 0.9113 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9413 | 0.9271 | 0.9342 | 0.9881 | | 0.0012 | 46.0 | 4416 | 0.0743 | 0.8922 | 0.9681 | 0.9286 | 94 | 0.9679 | 0.9042 | 0.9350 | 167 | 0.9852 | 0.9708 | 0.9779 | 137 | 0.9542 | 0.9422 | 0.9482 | 0.9903 | | 0.0012 | 47.0 | 4512 | 0.0870 | 0.9072 | 0.9362 | 0.9215 | 94 | 0.9375 | 0.8982 | 0.9174 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9466 | 0.9347 | 0.9406 | 0.9884 | | 0.0019 | 48.0 | 4608 | 0.0759 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9308 | 0.8862 | 0.9080 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9367 | 0.9296 | 0.9332 | 0.9881 | | 0.0015 | 49.0 | 4704 | 0.0810 | 0.9271 | 0.9468 | 0.9368 | 94 | 0.9176 | 0.9341 | 0.9258 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9475 | 0.9523 | 0.9499 | 0.9895 | | 0.0011 | 50.0 | 4800 | 0.0890 | 0.9082 | 0.9468 | 0.9271 | 94 | 0.9506 | 0.9222 | 0.9362 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9520 | 0.9472 | 0.9496 | 0.9890 | | 0.0007 | 51.0 | 4896 | 0.0827 | 0.9167 | 0.9362 | 0.9263 | 94 | 0.9341 | 0.9341 | 0.9341 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9474 | 0.9497 | 0.9486 | 0.9895 | | 0.001 | 52.0 | 4992 | 0.0873 | 0.8980 | 0.9362 
| 0.9167 | 94 | 0.9281 | 0.9281 | 0.9281 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9425 | 0.9472 | 0.9449 | 0.9887 | | 0.001 | 53.0 | 5088 | 0.0820 | 0.8980 | 0.9362 | 0.9167 | 94 | 0.9394 | 0.9281 | 0.9337 | 167 | 0.9852 | 0.9708 | 0.9779 | 137 | 0.9447 | 0.9447 | 0.9447 | 0.9890 | | 0.0004 | 54.0 | 5184 | 0.0917 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9434 | 0.8982 | 0.9202 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9444 | 0.9397 | 0.9421 | 0.9867 | | 0.0006 | 55.0 | 5280 | 0.1053 | 0.8980 | 0.9362 | 0.9167 | 94 | 0.9333 | 0.9222 | 0.9277 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9447 | 0.9447 | 0.9447 | 0.9884 | | 0.001 | 56.0 | 5376 | 0.1040 | 0.8990 | 0.9468 | 0.9223 | 94 | 0.9333 | 0.9222 | 0.9277 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9425 | 0.9472 | 0.9449 | 0.9881 | | 0.0005 | 57.0 | 5472 | 0.1042 | 0.8990 | 0.9468 | 0.9223 | 94 | 0.9337 | 0.9281 | 0.9309 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.945 | 0.9497 | 0.9474 | 0.9884 | | 0.0009 | 58.0 | 5568 | 0.1057 | 0.9082 | 0.9468 | 0.9271 | 94 | 0.9202 | 0.8982 | 0.9091 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9395 | 0.9372 | 0.9384 | 0.9876 | | 0.001 | 59.0 | 5664 | 0.1034 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9277 | 0.9222 | 0.9249 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9426 | 0.9497 | 0.9462 | 0.9873 | | 0.0012 | 60.0 | 5760 | 0.0910 | 0.9072 | 0.9362 | 0.9215 | 94 | 0.9337 | 0.9281 | 0.9309 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9424 | 0.9447 | 0.9435 | 0.9887 | | 0.0008 | 61.0 | 5856 | 0.0987 | 0.9247 | 0.9149 | 0.9198 | 94 | 0.9102 | 0.9102 | 0.9102 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9369 | 0.9322 | 0.9345 | 0.9862 | | 0.0005 | 62.0 | 5952 | 0.1056 | 0.8889 | 0.9362 | 0.9119 | 94 | 0.9387 | 0.9162 | 0.9273 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9470 | 0.9422 | 0.9446 | 0.9876 | | 0.0006 | 63.0 | 6048 | 0.1050 | 0.8980 | 0.9362 | 0.9167 | 94 | 0.9268 | 0.9102 | 0.9184 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9421 | 0.9397 | 0.9409 | 0.9873 | | 0.0013 | 64.0 | 6144 | 0.0956 | 0.9072 | 0.9362 | 0.9215 | 94 | 0.9329 | 0.9162 | 0.9245 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9494 | 0.9422 | 0.9458 | 0.9884 | | 0.0006 | 65.0 | 6240 | 0.1061 | 0.9082 | 0.9468 | 0.9271 | 94 | 0.9313 | 0.8922 | 0.9113 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9490 | 0.9347 | 0.9418 | 0.9854 | | 0.0008 | 66.0 | 6336 | 0.1032 | 0.8980 | 0.9362 | 0.9167 | 94 | 0.9325 | 0.9102 | 0.9212 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9444 | 0.9397 | 0.9421 | 0.9881 | | 0.0004 | 67.0 | 6432 | 0.0961 | 0.8980 | 0.9362 | 0.9167 | 94 | 0.9273 | 0.9162 | 0.9217 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9446 | 0.9422 | 0.9434 | 0.9890 | | 0.0008 | 68.0 | 6528 | 0.0979 | 0.88 | 0.9362 | 0.9072 | 94 | 0.925 | 0.8862 | 0.9052 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9367 | 0.9296 | 0.9332 | 0.9870 | | 0.0013 | 69.0 | 6624 | 0.1021 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9162 | 0.9162 | 0.9162 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9377 | 0.9447 | 0.9412 | 0.9870 | | 0.0004 | 70.0 | 6720 | 0.0933 | 0.88 | 0.9362 | 0.9072 | 94 | 0.9264 | 0.9042 | 0.9152 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9395 | 0.9372 | 0.9384 | 0.9881 | | 0.001 | 71.0 | 6816 | 0.0892 | 0.8788 | 0.9255 | 0.9016 | 94 | 0.9264 | 0.9042 | 0.9152 | 167 | 0.9852 | 0.9708 | 0.9779 | 137 | 0.9345 | 0.9322 | 0.9333 | 0.9881 | | 0.0006 | 72.0 | 6912 | 0.0966 | 0.9091 | 0.9574 | 0.9326 | 94 | 0.9509 | 0.9281 | 0.9394 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9547 | 0.9523 | 0.9535 | 0.9892 | | 0.0006 | 73.0 | 7008 | 0.0997 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9441 | 0.9102 | 0.9268 | 167 | 1.0 | 
0.9781 | 0.9889 | 137 | 0.9495 | 0.9447 | 0.9471 | 0.9884 | | 0.0004 | 74.0 | 7104 | 0.1035 | 0.8824 | 0.9574 | 0.9184 | 94 | 0.9497 | 0.9042 | 0.9264 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9470 | 0.9422 | 0.9446 | 0.9881 | | 0.0005 | 75.0 | 7200 | 0.1036 | 0.8788 | 0.9255 | 0.9016 | 94 | 0.9371 | 0.8922 | 0.9141 | 167 | 0.9852 | 0.9708 | 0.9779 | 137 | 0.9389 | 0.9271 | 0.9330 | 0.9870 | | 0.0004 | 76.0 | 7296 | 0.0978 | 0.8788 | 0.9255 | 0.9016 | 94 | 0.9317 | 0.8982 | 0.9146 | 167 | 0.9638 | 0.9708 | 0.9673 | 137 | 0.9296 | 0.9296 | 0.9296 | 0.9867 | | 0.0004 | 77.0 | 7392 | 0.0896 | 0.88 | 0.9362 | 0.9072 | 94 | 0.9273 | 0.9162 | 0.9217 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9375 | 0.9422 | 0.9398 | 0.9887 | | 0.0007 | 78.0 | 7488 | 0.1034 | 0.8889 | 0.9362 | 0.9119 | 94 | 0.9308 | 0.8862 | 0.9080 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9439 | 0.9296 | 0.9367 | 0.9878 | | 0.0004 | 79.0 | 7584 | 0.1117 | 0.8812 | 0.9468 | 0.9128 | 94 | 0.9259 | 0.8982 | 0.9119 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9395 | 0.9372 | 0.9384 | 0.9873 | | 0.0006 | 80.0 | 7680 | 0.1053 | 0.8980 | 0.9362 | 0.9167 | 94 | 0.9017 | 0.9341 | 0.9176 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9333 | 0.9497 | 0.9415 | 0.9873 | | 0.0003 | 81.0 | 7776 | 0.1023 | 0.8980 | 0.9362 | 0.9167 | 94 | 0.9222 | 0.9222 | 0.9222 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9424 | 0.9447 | 0.9435 | 0.9884 | | 0.0005 | 82.0 | 7872 | 0.0998 | 0.8990 | 0.9468 | 0.9223 | 94 | 0.9281 | 0.9281 | 0.9281 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.945 | 0.9497 | 0.9474 | 0.9887 | | 0.0004 | 83.0 | 7968 | 0.1031 | 0.8980 | 0.9362 | 0.9167 | 94 | 0.9222 | 0.9222 | 0.9222 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9424 | 0.9447 | 0.9435 | 0.9884 | | 0.0002 | 84.0 | 8064 | 0.1076 | 0.9072 | 0.9362 | 0.9215 | 94 | 0.9273 | 0.9162 | 0.9217 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9470 | 0.9422 | 0.9446 | 0.9890 | | 0.0008 | 85.0 | 8160 | 0.1031 | 0.9062 | 0.9255 | 0.9158 | 94 | 0.9273 | 0.9162 | 0.9217 | 167 | 0.9925 | 0.9708 | 0.9815 | 137 | 0.9443 | 0.9372 | 0.9407 | 0.9887 | | 0.0003 | 86.0 | 8256 | 0.0967 | 0.9062 | 0.9255 | 0.9158 | 94 | 0.9383 | 0.9102 | 0.9240 | 167 | 0.9925 | 0.9708 | 0.9815 | 137 | 0.9490 | 0.9347 | 0.9418 | 0.9892 | | 0.0005 | 87.0 | 8352 | 0.0978 | 0.8889 | 0.9362 | 0.9119 | 94 | 0.9317 | 0.8982 | 0.9146 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9442 | 0.9347 | 0.9394 | 0.9884 | | 0.0003 | 88.0 | 8448 | 0.1104 | 0.8889 | 0.9362 | 0.9119 | 94 | 0.9375 | 0.8982 | 0.9174 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9466 | 0.9347 | 0.9406 | 0.9881 | | 0.0005 | 89.0 | 8544 | 0.1069 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9441 | 0.9102 | 0.9268 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9494 | 0.9422 | 0.9458 | 0.9887 | | 0.0003 | 90.0 | 8640 | 0.1071 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9441 | 0.9102 | 0.9268 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9494 | 0.9422 | 0.9458 | 0.9887 | | 0.0005 | 91.0 | 8736 | 0.1068 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9441 | 0.9102 | 0.9268 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9494 | 0.9422 | 0.9458 | 0.9887 | | 0.0004 | 92.0 | 8832 | 0.1078 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9444 | 0.9162 | 0.9301 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9495 | 0.9447 | 0.9471 | 0.9890 | | 0.0003 | 93.0 | 8928 | 0.1079 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9444 | 0.9162 | 0.9301 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9495 | 0.9447 | 0.9471 | 0.9890 | | 0.0004 | 94.0 | 9024 | 0.1082 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9387 | 0.9162 | 0.9273 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9471 | 0.9447 | 0.9459 | 0.9887 | | 0.0003 | 95.0 | 9120 | 
0.1080 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9387 | 0.9162 | 0.9273 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9471 | 0.9447 | 0.9459 | 0.9887 | | 0.0003 | 96.0 | 9216 | 0.1082 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9387 | 0.9162 | 0.9273 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9471 | 0.9447 | 0.9459 | 0.9887 | | 0.0002 | 97.0 | 9312 | 0.1080 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9387 | 0.9162 | 0.9273 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9471 | 0.9447 | 0.9459 | 0.9887 | | 0.0003 | 98.0 | 9408 | 0.1080 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9444 | 0.9162 | 0.9301 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9495 | 0.9447 | 0.9471 | 0.9890 | | 0.0003 | 99.0 | 9504 | 0.1085 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9387 | 0.9162 | 0.9273 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9471 | 0.9447 | 0.9459 | 0.9887 | | 0.0002 | 100.0 | 9600 | 0.1084 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9387 | 0.9162 | 0.9273 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9471 | 0.9447 | 0.9459 | 0.9887 | ### Framework versions - Transformers 4.39.3 - Pytorch 2.3.0+cu121 - Datasets 2.19.1 - Tokenizers 0.15.2
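Given the Location/Organization/Person breakdown in the results above, this is a token-classification (NER) model for Indonesian. A minimal sketch of running it with the `transformers` pipeline, grouping subword predictions into whole entity spans; the sample sentence is illustrative only:

```python
from transformers import pipeline

# aggregation_strategy="simple" merges word pieces into whole entity spans.
ner = pipeline(
    "token-classification",
    model="apwic/nerui-base-0",
    aggregation_strategy="simple",
)

for entity in ner("Joko Widodo mengunjungi kantor Google di Jakarta."):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```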
ThreeZ/6_1
ThreeZ
2024-06-04T00:10:17Z
140
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-05-17T12:40:33Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
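Like the other auto-generated cards in this dump, this one omits usage code; the tags indicate a Llama-architecture text-generation model. A hedged sketch with the high-level pipeline API (requires `accelerate` for `device_map="auto"`):

```python
from transformers import pipeline

# device_map="auto" places the model on available GPUs, if any.
generator = pipeline("text-generation", model="ThreeZ/6_1", device_map="auto")
print(generator("The three laws of robotics are", max_new_tokens=64)[0]["generated_text"])
```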
martinsinnona/visdecode_vega_1
martinsinnona
2024-06-04T00:09:09Z
48
0
transformers
[ "transformers", "safetensors", "pix2struct", "image-text-to-text", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
image-text-to-text
2024-05-21T18:34:35Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
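The tags mark this as a Pix2Struct image-text-to-text model, and the repo name suggests chart-to-Vega-spec derendering. A hedged sketch with the matching `transformers` classes; whether this fine-tune expects an additional text prompt is not stated in the card, so the image-only call below is an assumption:

```python
import requests
from PIL import Image
from transformers import Pix2StructProcessor, Pix2StructForConditionalGeneration

repo = "martinsinnona/visdecode_vega_1"
processor = Pix2StructProcessor.from_pretrained(repo)
model = Pix2StructForConditionalGeneration.from_pretrained(repo)

# A chart image; this URL is a placeholder for illustration.
image = Image.open(
    requests.get("https://example.com/chart.png", stream=True).raw
).convert("RGB")
inputs = processor(images=image, return_tensors="pt")

generated = model.generate(**inputs, max_new_tokens=256)
print(processor.decode(generated[0], skip_special_tokens=True))
```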
Andresckamilo/Lora-AgentCustomer
Andresckamilo
2024-06-04T00:08:22Z
0
0
transformers
[ "transformers", "safetensors", "unsloth", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-06-04T00:07:35Z
--- library_name: transformers tags: - unsloth --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
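The `unsloth` tag and repo name suggest this is a LoRA adapter rather than a full model. A hedged sketch of attaching it to its base model with `peft`; `PeftConfig` reads the base model name from the adapter's own config, so nothing is guessed beyond the repo being a valid adapter:

```python
from peft import PeftConfig, PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

adapter = "Andresckamilo/Lora-AgentCustomer"

# The adapter config records which base model it was trained on.
config = PeftConfig.from_pretrained(adapter)
base = AutoModelForCausalLM.from_pretrained(
    config.base_model_name_or_path, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(config.base_model_name_or_path)

# Attach the LoRA weights on top of the base model.
model = PeftModel.from_pretrained(base, adapter)
```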
ahmedesmail16/Paper_Compared-swinv2-base
ahmedesmail16
2024-06-04T00:07:01Z
155
0
transformers
[ "transformers", "tensorboard", "safetensors", "swinv2", "image-classification", "generated_from_trainer", "base_model:microsoft/swinv2-base-patch4-window12-192-22k", "base_model:finetune:microsoft/swinv2-base-patch4-window12-192-22k", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
image-classification
2024-06-03T17:57:27Z
--- license: apache-2.0 base_model: microsoft/swinv2-base-patch4-window12-192-22k tags: - generated_from_trainer metrics: - accuracy model-index: - name: Paper_Compared-swinv2-base results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Paper_Compared-swinv2-base This model is a fine-tuned version of [microsoft/swinv2-base-patch4-window12-192-22k](https://huggingface.co/microsoft/swinv2-base-patch4-window12-192-22k) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.7159 - Accuracy: 0.8533 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:------:|:----:|:---------------:|:--------:| | 1.6917 | 0.9492 | 14 | 0.7844 | 0.7562 | | 0.7734 | 1.9661 | 29 | 0.4380 | 0.8521 | | 0.1927 | 2.9831 | 44 | 0.4694 | 0.8544 | | 0.0956 | 4.0 | 59 | 0.6487 | 0.8251 | | 0.0638 | 4.9492 | 73 | 0.6688 | 0.8296 | | 0.0343 | 5.9661 | 88 | 0.7615 | 0.8352 | | 0.0182 | 6.9831 | 103 | 0.7470 | 0.8352 | | 0.038 | 8.0 | 118 | 0.7666 | 0.8465 | | 0.0057 | 8.9492 | 132 | 0.7086 | 0.8454 | | 0.0062 | 9.4915 | 140 | 0.7159 | 0.8533 | ### Framework versions - Transformers 4.41.1 - Pytorch 2.1.2 - Datasets 2.19.2 - Tokenizers 0.19.1
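A minimal inference sketch for this Swin-V2 classifier using the generic `transformers` auto-classes. The class labels depend on the unnamed fine-tuning dataset, so they are read from the model config rather than assumed; the image path is a placeholder:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "ahmedesmail16/Paper_Compared-swinv2-base"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

image = Image.open("example.jpg").convert("RGB")  # any local image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits
# id2label maps the argmax index back to the dataset's class name.
print(model.config.id2label[logits.argmax(-1).item()])
```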
powermove72/GK-inv-MoE-0.1
powermove72
2024-06-03T23:56:52Z
7
0
transformers
[ "transformers", "safetensors", "mixtral", "text-generation", "moe", "frankenmoe", "merge", "mergekit", "lazymergekit", "GritLM/GritLM-7B", "argilla/notus-7b-v1", "conversational", "custom_code", "base_model:GritLM/GritLM-7B", "base_model:merge:GritLM/GritLM-7B", "base_model:argilla/notus-7b-v1", "base_model:merge:argilla/notus-7b-v1", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-06-03T23:49:56Z
--- license: apache-2.0 tags: - moe - frankenmoe - merge - mergekit - lazymergekit - GritLM/GritLM-7B - argilla/notus-7b-v1 base_model: - GritLM/GritLM-7B - argilla/notus-7b-v1 --- # GK-inv-MoE-0.1 GK-inv-MoE-0.1 is a Mixture of Experts (MoE) made with the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing): * [GritLM/GritLM-7B](https://huggingface.co/GritLM/GritLM-7B) * [argilla/notus-7b-v1](https://huggingface.co/argilla/notus-7b-v1) ## 🧩 Configuration ```yaml base_model: GritLM/GritLM-7B experts: - source_model: GritLM/GritLM-7B positive_prompts: - "chat" - "assistant" - "tell me" - "explain" - "I want" - "reason" - "math" - "mathematics" - "solve" - "count" - source_model: argilla/notus-7b-v1 positive_prompts: - "code" - "VB.NET" - "vb.net" - "programming" - "algorithm" - "develop" ``` ## 💻 Usage ```python !pip install -qU transformers bitsandbytes accelerate from transformers import AutoTokenizer import transformers import torch model = "powermove72/GK-inv-MoE-0.1" tokenizer = AutoTokenizer.from_pretrained(model) pipeline = transformers.pipeline( "text-generation", model=model, model_kwargs={"torch_dtype": torch.float16, "load_in_4bit": True}, ) messages = [{"role": "user", "content": "Explain what a Mixture of Experts is in less than 100 words."}] prompt = pipeline.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True) outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95) print(outputs[0]["generated_text"]) ```
kevinvelez18/ViT_model
kevinvelez18
2024-06-03T23:46:48Z
222
0
transformers
[ "transformers", "tensorboard", "safetensors", "vit", "image-classification", "generated_from_trainer", "base_model:google/vit-base-patch16-224-in21k", "base_model:finetune:google/vit-base-patch16-224-in21k", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
image-classification
2024-06-03T23:43:33Z
--- license: apache-2.0 base_model: google/vit-base-patch16-224-in21k tags: - generated_from_trainer metrics: - accuracy model-index: - name: ViT_model results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # ViT_model This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0252 - Accuracy: 0.9925 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:------:|:----:|:---------------:|:--------:| | 0.1492 | 3.8462 | 500 | 0.0252 | 0.9925 | ### Framework versions - Transformers 4.41.1 - Pytorch 2.3.0+cu121 - Datasets 2.19.2 - Tokenizers 0.19.1
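As with the Swin-V2 card above, the label set here is determined by the unnamed fine-tuning dataset. A one-line pipeline sketch; the image path is a placeholder:

```python
from transformers import pipeline

classifier = pipeline("image-classification", model="kevinvelez18/ViT_model")
print(classifier("example.jpg"))  # accepts a local path or an image URL
```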
apwic/nerugm-unipelt-3
apwic
2024-06-03T23:43:57Z
0
0
null
[ "tensorboard", "generated_from_trainer", "id", "base_model:indolem/indobert-base-uncased", "base_model:finetune:indolem/indobert-base-uncased", "license:mit", "region:us" ]
null
2024-05-28T02:27:08Z
--- language: - id license: mit base_model: indolem/indobert-base-uncased tags: - generated_from_trainer model-index: - name: nerugm-unipelt-3 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # nerugm-unipelt-3 This model is a fine-tuned version of [indolem/indobert-base-uncased](https://huggingface.co/indolem/indobert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.2237 - Location Precision: 0.725 - Location Recall: 0.8169 - Location F1: 0.7682 - Location Number: 71 - Organization Precision: 0.6962 - Organization Recall: 0.8462 - Organization F1: 0.7639 - Organization Number: 65 - Person Precision: 0.8924 - Person Recall: 0.94 - Person F1: 0.9156 - Person Number: 150 - Quantity Precision: 0.7179 - Quantity Recall: 0.8485 - Quantity F1: 0.7778 - Quantity Number: 33 - Time Precision: 0.8 - Time Recall: 0.8571 - Time F1: 0.8276 - Time Number: 28 - Overall Precision: 0.7927 - Overall Recall: 0.8818 - Overall F1: 0.8349 - Overall Accuracy: 0.9612 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 100.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Quantity Precision | Quantity Recall | Quantity F1 | Quantity Number | Time Precision | Time Recall | Time F1 | Time Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy | |:-------------:|:-----:|:-----:|:---------------:|:------------------:|:---------------:|:-----------:|:---------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:----------------:|:-------------:|:---------:|:-------------:|:------------------:|:---------------:|:-----------:|:---------------:|:--------------:|:-----------:|:-------:|:-----------:|:-----------------:|:--------------:|:----------:|:----------------:| | 0.9512 | 1.0 | 106 | 0.6040 | 0.0 | 0.0 | 0.0 | 71 | 0.0 | 0.0 | 0.0 | 65 | 0.3333 | 0.0067 | 0.0131 | 150 | 0.0 | 0.0 | 0.0 | 33 | 0.0 | 0.0 | 0.0 | 28 | 0.3333 | 0.0029 | 0.0057 | 0.8412 | | 0.5021 | 2.0 | 212 | 0.3378 | 0.5472 | 0.4085 | 0.4677 | 71 | 0.2063 | 0.2 | 0.2031 | 65 | 0.6218 | 0.8 | 0.6997 | 150 | 0.2 | 0.1818 | 0.1905 | 33 | 0.4545 | 0.5357 | 0.4918 | 28 | 0.4919 | 0.5274 | 0.5090 | 0.9076 | | 0.2912 | 3.0 | 318 | 0.1864 | 0.5682 | 0.7042 | 0.6289 | 71 | 0.5797 | 0.6154 | 0.5970 | 65 | 0.7892 | 0.8733 | 0.8291 | 150 | 0.4878 | 0.6061 | 0.5405 | 33 | 0.88 | 0.7857 | 0.8302 | 28 | 0.6761 | 0.7579 | 0.7147 | 0.9377 | | 0.1943 | 4.0 | 424 | 0.1580 | 0.6186 | 0.8451 | 0.7143 | 71 | 0.5670 | 0.8462 | 0.6790 | 65 | 0.8059 | 0.9133 | 0.8562 | 150 | 0.5714 | 0.8485 | 0.6829 | 33 | 0.8571 | 0.8571 | 0.8571 | 28 | 0.6893 | 0.8761 | 0.7716 | 0.9479 | | 0.1653 | 5.0 | 530 | 0.1364 | 0.6316 | 0.8451 | 0.7229 | 71 | 0.6049 | 0.7538 | 0.6712 | 65 | 0.8282 | 0.9 | 
0.8626 | 150 | 0.7143 | 0.9091 | 0.8 | 33 | 0.75 | 0.8571 | 0.8000 | 28 | 0.7215 | 0.8588 | 0.7842 | 0.9525 | | 0.152 | 6.0 | 636 | 0.1579 | 0.6383 | 0.8451 | 0.7273 | 71 | 0.5567 | 0.8308 | 0.6667 | 65 | 0.7943 | 0.9267 | 0.8554 | 150 | 0.6087 | 0.8485 | 0.7089 | 33 | 0.6053 | 0.8214 | 0.6970 | 28 | 0.6756 | 0.8761 | 0.7629 | 0.9447 | | 0.1379 | 7.0 | 742 | 0.1532 | 0.6559 | 0.8592 | 0.7439 | 71 | 0.6067 | 0.8308 | 0.7013 | 65 | 0.7943 | 0.9267 | 0.8554 | 150 | 0.5306 | 0.7879 | 0.6341 | 33 | 0.6216 | 0.8214 | 0.7077 | 28 | 0.6840 | 0.8732 | 0.7671 | 0.9464 | | 0.125 | 8.0 | 848 | 0.1271 | 0.6458 | 0.8732 | 0.7425 | 71 | 0.5914 | 0.8462 | 0.6962 | 65 | 0.8571 | 0.92 | 0.8875 | 150 | 0.7105 | 0.8182 | 0.7606 | 33 | 0.75 | 0.8571 | 0.8000 | 28 | 0.7286 | 0.8818 | 0.7979 | 0.9561 | | 0.1161 | 9.0 | 954 | 0.1268 | 0.6829 | 0.7887 | 0.7320 | 71 | 0.5556 | 0.8462 | 0.6707 | 65 | 0.8519 | 0.92 | 0.8846 | 150 | 0.7179 | 0.8485 | 0.7778 | 33 | 0.8 | 0.8571 | 0.8276 | 28 | 0.7306 | 0.8674 | 0.7931 | 0.9549 | | 0.1124 | 10.0 | 1060 | 0.1231 | 0.6988 | 0.8169 | 0.7532 | 71 | 0.6667 | 0.8 | 0.7273 | 65 | 0.8434 | 0.9333 | 0.8861 | 150 | 0.6829 | 0.8485 | 0.7568 | 33 | 0.7273 | 0.8571 | 0.7869 | 28 | 0.7531 | 0.8703 | 0.8075 | 0.9593 | | 0.1059 | 11.0 | 1166 | 0.1203 | 0.6988 | 0.8169 | 0.7532 | 71 | 0.6667 | 0.7692 | 0.7143 | 65 | 0.8625 | 0.92 | 0.8903 | 150 | 0.675 | 0.8182 | 0.7397 | 33 | 0.7273 | 0.8571 | 0.7869 | 28 | 0.7596 | 0.8559 | 0.8049 | 0.9593 | | 0.0975 | 12.0 | 1272 | 0.1371 | 0.6667 | 0.8169 | 0.7342 | 71 | 0.5833 | 0.8615 | 0.6957 | 65 | 0.8176 | 0.9267 | 0.8688 | 150 | 0.7368 | 0.8485 | 0.7887 | 33 | 0.6944 | 0.8929 | 0.7812 | 28 | 0.7166 | 0.8818 | 0.7907 | 0.9527 | | 0.0915 | 13.0 | 1378 | 0.1216 | 0.7195 | 0.8310 | 0.7712 | 71 | 0.6353 | 0.8308 | 0.7200 | 65 | 0.8545 | 0.94 | 0.8952 | 150 | 0.7105 | 0.8182 | 0.7606 | 33 | 0.9259 | 0.8929 | 0.9091 | 28 | 0.7708 | 0.8818 | 0.8226 | 0.9610 | | 0.0913 | 14.0 | 1484 | 0.1168 | 0.7215 | 0.8028 | 0.76 | 71 | 0.6883 | 0.8154 | 0.7465 | 65 | 0.875 | 0.9333 | 0.9032 | 150 | 0.7632 | 0.8788 | 0.8169 | 33 | 0.9231 | 0.8571 | 0.8889 | 28 | 0.7974 | 0.8732 | 0.8336 | 0.9629 | | 0.0853 | 15.0 | 1590 | 0.1217 | 0.75 | 0.8451 | 0.7947 | 71 | 0.6548 | 0.8462 | 0.7383 | 65 | 0.8571 | 0.92 | 0.8875 | 150 | 0.7073 | 0.8788 | 0.7838 | 33 | 0.7812 | 0.8929 | 0.8333 | 28 | 0.7714 | 0.8847 | 0.8242 | 0.9605 | | 0.0809 | 16.0 | 1696 | 0.1305 | 0.7176 | 0.8592 | 0.7821 | 71 | 0.6667 | 0.8615 | 0.7517 | 65 | 0.8476 | 0.9267 | 0.8854 | 150 | 0.6429 | 0.8182 | 0.7200 | 33 | 0.8065 | 0.8929 | 0.8475 | 28 | 0.7586 | 0.8876 | 0.8181 | 0.9590 | | 0.0773 | 17.0 | 1802 | 0.1276 | 0.7059 | 0.8451 | 0.7692 | 71 | 0.6136 | 0.8308 | 0.7059 | 65 | 0.8598 | 0.94 | 0.8981 | 150 | 0.7179 | 0.8485 | 0.7778 | 33 | 0.7576 | 0.8929 | 0.8197 | 28 | 0.7531 | 0.8876 | 0.8148 | 0.9590 | | 0.0737 | 18.0 | 1908 | 0.1533 | 0.6593 | 0.8451 | 0.7407 | 71 | 0.6 | 0.8308 | 0.6968 | 65 | 0.8343 | 0.94 | 0.8840 | 150 | 0.6279 | 0.8182 | 0.7105 | 33 | 0.7419 | 0.8214 | 0.7797 | 28 | 0.7193 | 0.8790 | 0.7912 | 0.9493 | | 0.0703 | 19.0 | 2014 | 0.1193 | 0.7436 | 0.8169 | 0.7785 | 71 | 0.6790 | 0.8462 | 0.7534 | 65 | 0.8790 | 0.92 | 0.8990 | 150 | 0.7 | 0.8485 | 0.7671 | 33 | 0.8571 | 0.8571 | 0.8571 | 28 | 0.7891 | 0.8732 | 0.8290 | 0.9631 | | 0.0713 | 20.0 | 2120 | 0.1247 | 0.7349 | 0.8592 | 0.7922 | 71 | 0.6322 | 0.8462 | 0.7237 | 65 | 0.8868 | 0.94 | 0.9126 | 150 | 0.6279 | 0.8182 | 0.7105 | 33 | 0.9615 | 0.8929 | 0.9259 | 28 | 0.7764 | 0.8905 | 0.8295 | 0.9624 | | 0.0649 | 21.0 | 2226 | 
0.1380 | 0.7093 | 0.8592 | 0.7771 | 71 | 0.5895 | 0.8615 | 0.7 | 65 | 0.8688 | 0.9267 | 0.8968 | 150 | 0.6667 | 0.8485 | 0.7467 | 33 | 0.7742 | 0.8571 | 0.8136 | 28 | 0.7440 | 0.8876 | 0.8095 | 0.9564 | | 0.0645 | 22.0 | 2332 | 0.1445 | 0.7037 | 0.8028 | 0.75 | 71 | 0.6548 | 0.8462 | 0.7383 | 65 | 0.8924 | 0.94 | 0.9156 | 150 | 0.6429 | 0.8182 | 0.7200 | 33 | 0.8333 | 0.8929 | 0.8621 | 28 | 0.7722 | 0.8790 | 0.8221 | 0.9588 | | 0.0595 | 23.0 | 2438 | 0.1374 | 0.6897 | 0.8451 | 0.7595 | 71 | 0.6136 | 0.8308 | 0.7059 | 65 | 0.8924 | 0.94 | 0.9156 | 150 | 0.7436 | 0.8788 | 0.8056 | 33 | 0.9259 | 0.8929 | 0.9091 | 28 | 0.7744 | 0.8905 | 0.8284 | 0.9622 | | 0.0576 | 24.0 | 2544 | 0.1402 | 0.7059 | 0.8451 | 0.7692 | 71 | 0.65 | 0.8 | 0.7172 | 65 | 0.8696 | 0.9333 | 0.9003 | 150 | 0.7568 | 0.8485 | 0.8000 | 33 | 0.9259 | 0.8929 | 0.9091 | 28 | 0.7821 | 0.8790 | 0.8277 | 0.9590 | | 0.0562 | 25.0 | 2650 | 0.1584 | 0.7059 | 0.8451 | 0.7692 | 71 | 0.6022 | 0.8615 | 0.7089 | 65 | 0.875 | 0.9333 | 0.9032 | 150 | 0.6279 | 0.8182 | 0.7105 | 33 | 0.7576 | 0.8929 | 0.8197 | 28 | 0.7440 | 0.8876 | 0.8095 | 0.9554 | | 0.0533 | 26.0 | 2756 | 0.1501 | 0.7089 | 0.7887 | 0.7467 | 71 | 0.6429 | 0.8308 | 0.7248 | 65 | 0.8485 | 0.9333 | 0.8889 | 150 | 0.6047 | 0.7879 | 0.6842 | 33 | 0.8621 | 0.8929 | 0.8772 | 28 | 0.7525 | 0.8674 | 0.8059 | 0.9539 | | 0.0529 | 27.0 | 2862 | 0.1519 | 0.7093 | 0.8592 | 0.7771 | 71 | 0.6364 | 0.8615 | 0.7320 | 65 | 0.8688 | 0.9267 | 0.8968 | 150 | 0.6667 | 0.8485 | 0.7467 | 33 | 0.8929 | 0.8929 | 0.8929 | 28 | 0.7649 | 0.8905 | 0.8229 | 0.9576 | | 0.0531 | 28.0 | 2968 | 0.1364 | 0.7763 | 0.8310 | 0.8027 | 71 | 0.6867 | 0.8769 | 0.7703 | 65 | 0.8625 | 0.92 | 0.8903 | 150 | 0.7105 | 0.8182 | 0.7606 | 33 | 0.9259 | 0.8929 | 0.9091 | 28 | 0.7969 | 0.8818 | 0.8372 | 0.9641 | | 0.0463 | 29.0 | 3074 | 0.1396 | 0.7403 | 0.8028 | 0.7703 | 71 | 0.6552 | 0.8769 | 0.75 | 65 | 0.8704 | 0.94 | 0.9038 | 150 | 0.8056 | 0.8788 | 0.8406 | 33 | 0.8571 | 0.8571 | 0.8571 | 28 | 0.7897 | 0.8876 | 0.8358 | 0.9624 | | 0.0466 | 30.0 | 3180 | 0.1535 | 0.7564 | 0.8310 | 0.7919 | 71 | 0.6196 | 0.8769 | 0.7261 | 65 | 0.8696 | 0.9333 | 0.9003 | 150 | 0.675 | 0.8182 | 0.7397 | 33 | 0.8621 | 0.8929 | 0.8772 | 28 | 0.77 | 0.8876 | 0.8246 | 0.9585 | | 0.0449 | 31.0 | 3286 | 0.1608 | 0.7093 | 0.8592 | 0.7771 | 71 | 0.6835 | 0.8308 | 0.75 | 65 | 0.8765 | 0.9467 | 0.9103 | 150 | 0.6829 | 0.8485 | 0.7568 | 33 | 0.8276 | 0.8571 | 0.8421 | 28 | 0.7783 | 0.8905 | 0.8306 | 0.9593 | | 0.043 | 32.0 | 3392 | 0.1635 | 0.6860 | 0.8310 | 0.7516 | 71 | 0.6136 | 0.8308 | 0.7059 | 65 | 0.8765 | 0.9467 | 0.9103 | 150 | 0.6190 | 0.7879 | 0.6933 | 33 | 0.8065 | 0.8929 | 0.8475 | 28 | 0.7482 | 0.8818 | 0.8095 | 0.9556 | | 0.0394 | 33.0 | 3498 | 0.1503 | 0.75 | 0.8028 | 0.7755 | 71 | 0.6543 | 0.8154 | 0.7260 | 65 | 0.8987 | 0.9467 | 0.9221 | 150 | 0.7838 | 0.8788 | 0.8286 | 33 | 0.8889 | 0.8571 | 0.8727 | 28 | 0.8047 | 0.8790 | 0.8402 | 0.9641 | | 0.0412 | 34.0 | 3604 | 0.1466 | 0.7662 | 0.8310 | 0.7973 | 71 | 0.6951 | 0.8769 | 0.7755 | 65 | 0.8987 | 0.9467 | 0.9221 | 150 | 0.7 | 0.8485 | 0.7671 | 33 | 0.8889 | 0.8571 | 0.8727 | 28 | 0.8073 | 0.8934 | 0.8482 | 0.9646 | | 0.0403 | 35.0 | 3710 | 0.1525 | 0.7143 | 0.8451 | 0.7742 | 71 | 0.675 | 0.8308 | 0.7448 | 65 | 0.8765 | 0.9467 | 0.9103 | 150 | 0.6585 | 0.8182 | 0.7297 | 33 | 0.8065 | 0.8929 | 0.8475 | 28 | 0.7739 | 0.8876 | 0.8268 | 0.9619 | | 0.0385 | 36.0 | 3816 | 0.1817 | 0.7564 | 0.8310 | 0.7919 | 71 | 0.6333 | 0.8769 | 0.7355 | 65 | 0.8503 | 0.9467 | 0.8959 | 150 | 0.6279 | 0.8182 
| 0.7105 | 33 | 0.8929 | 0.8929 | 0.8929 | 28 | 0.7635 | 0.8934 | 0.8234 | 0.9551 | | 0.037 | 37.0 | 3922 | 0.2012 | 0.6824 | 0.8169 | 0.7436 | 71 | 0.6279 | 0.8308 | 0.7152 | 65 | 0.8512 | 0.9533 | 0.8994 | 150 | 0.5652 | 0.7879 | 0.6582 | 33 | 0.7742 | 0.8571 | 0.8136 | 28 | 0.7332 | 0.8790 | 0.7995 | 0.9517 | | 0.037 | 38.0 | 4028 | 0.1582 | 0.7532 | 0.8169 | 0.7838 | 71 | 0.7260 | 0.8154 | 0.7681 | 65 | 0.8688 | 0.9267 | 0.8968 | 150 | 0.7 | 0.8485 | 0.7671 | 33 | 0.9615 | 0.8929 | 0.9259 | 28 | 0.8059 | 0.8732 | 0.8382 | 0.9607 | | 0.0332 | 39.0 | 4134 | 0.1699 | 0.75 | 0.8451 | 0.7947 | 71 | 0.6706 | 0.8769 | 0.76 | 65 | 0.8720 | 0.9533 | 0.9108 | 150 | 0.675 | 0.8182 | 0.7397 | 33 | 0.8571 | 0.8571 | 0.8571 | 28 | 0.7834 | 0.8963 | 0.8360 | 0.9610 | | 0.0354 | 40.0 | 4240 | 0.1586 | 0.75 | 0.8451 | 0.7947 | 71 | 0.6795 | 0.8154 | 0.7413 | 65 | 0.8712 | 0.9467 | 0.9073 | 150 | 0.7 | 0.8485 | 0.7671 | 33 | 0.7059 | 0.8571 | 0.7742 | 28 | 0.7772 | 0.8847 | 0.8275 | 0.9612 | | 0.0331 | 41.0 | 4346 | 0.1633 | 0.7143 | 0.8451 | 0.7742 | 71 | 0.6829 | 0.8615 | 0.7619 | 65 | 0.8797 | 0.9267 | 0.9026 | 150 | 0.7 | 0.8485 | 0.7671 | 33 | 0.8 | 0.8571 | 0.8276 | 28 | 0.7792 | 0.8847 | 0.8286 | 0.9607 | | 0.0312 | 42.0 | 4452 | 0.1706 | 0.7436 | 0.8169 | 0.7785 | 71 | 0.675 | 0.8308 | 0.7448 | 65 | 0.8650 | 0.94 | 0.9010 | 150 | 0.65 | 0.7879 | 0.7123 | 33 | 0.8621 | 0.8929 | 0.8772 | 28 | 0.7795 | 0.8761 | 0.8250 | 0.9595 | | 0.031 | 43.0 | 4558 | 0.1645 | 0.7176 | 0.8592 | 0.7821 | 71 | 0.7027 | 0.8 | 0.7482 | 65 | 0.8712 | 0.9467 | 0.9073 | 150 | 0.75 | 0.9091 | 0.8219 | 33 | 0.8 | 0.8571 | 0.8276 | 28 | 0.7883 | 0.8905 | 0.8363 | 0.9619 | | 0.0305 | 44.0 | 4664 | 0.1853 | 0.7317 | 0.8451 | 0.7843 | 71 | 0.6628 | 0.8769 | 0.7550 | 65 | 0.8659 | 0.9467 | 0.9045 | 150 | 0.675 | 0.8182 | 0.7397 | 33 | 0.8929 | 0.8929 | 0.8929 | 28 | 0.7775 | 0.8963 | 0.8327 | 0.9610 | | 0.0284 | 45.0 | 4770 | 0.1658 | 0.7468 | 0.8310 | 0.7867 | 71 | 0.7067 | 0.8154 | 0.7571 | 65 | 0.8797 | 0.9267 | 0.9026 | 150 | 0.675 | 0.8182 | 0.7397 | 33 | 0.8571 | 0.8571 | 0.8571 | 28 | 0.7947 | 0.8703 | 0.8308 | 0.9634 | | 0.028 | 46.0 | 4876 | 0.1733 | 0.7073 | 0.8169 | 0.7582 | 71 | 0.6463 | 0.8154 | 0.7211 | 65 | 0.8868 | 0.94 | 0.9126 | 150 | 0.675 | 0.8182 | 0.7397 | 33 | 0.8929 | 0.8929 | 0.8929 | 28 | 0.7775 | 0.8761 | 0.8238 | 0.9600 | | 0.0257 | 47.0 | 4982 | 0.1833 | 0.7763 | 0.8310 | 0.8027 | 71 | 0.6437 | 0.8615 | 0.7368 | 65 | 0.8712 | 0.9467 | 0.9073 | 150 | 0.6923 | 0.8182 | 0.7500 | 33 | 0.8621 | 0.8929 | 0.8772 | 28 | 0.7843 | 0.8905 | 0.8340 | 0.9590 | | 0.0281 | 48.0 | 5088 | 0.1702 | 0.7532 | 0.8169 | 0.7838 | 71 | 0.6625 | 0.8154 | 0.7310 | 65 | 0.8580 | 0.9267 | 0.8910 | 150 | 0.7179 | 0.8485 | 0.7778 | 33 | 0.8846 | 0.8214 | 0.8519 | 28 | 0.7839 | 0.8674 | 0.8235 | 0.9607 | | 0.0245 | 49.0 | 5194 | 0.1863 | 0.7143 | 0.8451 | 0.7742 | 71 | 0.6585 | 0.8308 | 0.7347 | 65 | 0.8820 | 0.9467 | 0.9132 | 150 | 0.675 | 0.8182 | 0.7397 | 33 | 0.8333 | 0.8929 | 0.8621 | 28 | 0.7758 | 0.8876 | 0.8280 | 0.9614 | | 0.0251 | 50.0 | 5300 | 0.1628 | 0.7468 | 0.8310 | 0.7867 | 71 | 0.7123 | 0.8 | 0.7536 | 65 | 0.8910 | 0.9267 | 0.9085 | 150 | 0.7179 | 0.8485 | 0.7778 | 33 | 0.8929 | 0.8929 | 0.8929 | 28 | 0.808 | 0.8732 | 0.8393 | 0.9639 | | 0.0251 | 51.0 | 5406 | 0.1653 | 0.7284 | 0.8310 | 0.7763 | 71 | 0.7647 | 0.8 | 0.7820 | 65 | 0.8924 | 0.94 | 0.9156 | 150 | 0.7179 | 0.8485 | 0.7778 | 33 | 0.8889 | 0.8571 | 0.8727 | 28 | 0.8150 | 0.8761 | 0.8444 | 0.9663 | | 0.0245 | 52.0 | 5512 | 0.1833 | 0.7468 | 0.8310 | 
0.7867 | 71 | 0.6706 | 0.8769 | 0.76 | 65 | 0.8659 | 0.9467 | 0.9045 | 150 | 0.675 | 0.8182 | 0.7397 | 33 | 0.8276 | 0.8571 | 0.8421 | 28 | 0.7783 | 0.8905 | 0.8306 | 0.9602 | | 0.0222 | 53.0 | 5618 | 0.1887 | 0.6897 | 0.8451 | 0.7595 | 71 | 0.6585 | 0.8308 | 0.7347 | 65 | 0.8765 | 0.9467 | 0.9103 | 150 | 0.675 | 0.8182 | 0.7397 | 33 | 0.8621 | 0.8929 | 0.8772 | 28 | 0.77 | 0.8876 | 0.8246 | 0.9595 | | 0.024 | 54.0 | 5724 | 0.1765 | 0.7564 | 0.8310 | 0.7919 | 71 | 0.6744 | 0.8923 | 0.7682 | 65 | 0.8820 | 0.9467 | 0.9132 | 150 | 0.7436 | 0.8788 | 0.8056 | 33 | 0.8621 | 0.8929 | 0.8772 | 28 | 0.7964 | 0.9020 | 0.8459 | 0.9634 | | 0.0238 | 55.0 | 5830 | 0.1749 | 0.7468 | 0.8310 | 0.7867 | 71 | 0.6951 | 0.8769 | 0.7755 | 65 | 0.8931 | 0.9467 | 0.9191 | 150 | 0.7 | 0.8485 | 0.7671 | 33 | 0.9615 | 0.8929 | 0.9259 | 28 | 0.8057 | 0.8963 | 0.8486 | 0.9646 | | 0.021 | 56.0 | 5936 | 0.1799 | 0.6905 | 0.8169 | 0.7484 | 71 | 0.6667 | 0.8308 | 0.7397 | 65 | 0.8812 | 0.94 | 0.9097 | 150 | 0.7179 | 0.8485 | 0.7778 | 33 | 0.8333 | 0.8929 | 0.8621 | 28 | 0.7766 | 0.8818 | 0.8259 | 0.9602 | | 0.0199 | 57.0 | 6042 | 0.1942 | 0.7284 | 0.8310 | 0.7763 | 71 | 0.7123 | 0.8 | 0.7536 | 65 | 0.8765 | 0.9467 | 0.9103 | 150 | 0.7436 | 0.8788 | 0.8056 | 33 | 0.9259 | 0.8929 | 0.9091 | 28 | 0.8037 | 0.8847 | 0.8422 | 0.9619 | | 0.0187 | 58.0 | 6148 | 0.1905 | 0.7037 | 0.8028 | 0.75 | 71 | 0.7123 | 0.8 | 0.7536 | 65 | 0.8854 | 0.9267 | 0.9055 | 150 | 0.6923 | 0.8182 | 0.7500 | 33 | 0.8621 | 0.8929 | 0.8772 | 28 | 0.7916 | 0.8646 | 0.8264 | 0.9610 | | 0.0199 | 59.0 | 6254 | 0.1940 | 0.7073 | 0.8169 | 0.7582 | 71 | 0.6883 | 0.8154 | 0.7465 | 65 | 0.8846 | 0.92 | 0.9020 | 150 | 0.7436 | 0.8788 | 0.8056 | 33 | 0.7931 | 0.8214 | 0.8070 | 28 | 0.7859 | 0.8674 | 0.8247 | 0.9593 | | 0.0183 | 60.0 | 6360 | 0.1952 | 0.7436 | 0.8169 | 0.7785 | 71 | 0.6512 | 0.8615 | 0.7417 | 65 | 0.8805 | 0.9333 | 0.9061 | 150 | 0.6923 | 0.8182 | 0.7500 | 33 | 0.8571 | 0.8571 | 0.8571 | 28 | 0.7821 | 0.8790 | 0.8277 | 0.9600 | | 0.0178 | 61.0 | 6466 | 0.1902 | 0.7534 | 0.7746 | 0.7639 | 71 | 0.7 | 0.8615 | 0.7724 | 65 | 0.8854 | 0.9267 | 0.9055 | 150 | 0.7895 | 0.9091 | 0.8451 | 33 | 0.8 | 0.8571 | 0.8276 | 28 | 0.8042 | 0.8761 | 0.8386 | 0.9622 | | 0.0196 | 62.0 | 6572 | 0.1832 | 0.7436 | 0.8169 | 0.7785 | 71 | 0.6835 | 0.8308 | 0.75 | 65 | 0.8726 | 0.9133 | 0.8925 | 150 | 0.7436 | 0.8788 | 0.8056 | 33 | 0.8214 | 0.8214 | 0.8214 | 28 | 0.7900 | 0.8674 | 0.8269 | 0.9622 | | 0.0182 | 63.0 | 6678 | 0.1880 | 0.7308 | 0.8028 | 0.7651 | 71 | 0.6829 | 0.8615 | 0.7619 | 65 | 0.8726 | 0.9133 | 0.8925 | 150 | 0.6923 | 0.8182 | 0.7500 | 33 | 0.8276 | 0.8571 | 0.8421 | 28 | 0.7818 | 0.8674 | 0.8224 | 0.9617 | | 0.0181 | 64.0 | 6784 | 0.1929 | 0.725 | 0.8169 | 0.7682 | 71 | 0.6875 | 0.8462 | 0.7586 | 65 | 0.8625 | 0.92 | 0.8903 | 150 | 0.7179 | 0.8485 | 0.7778 | 33 | 0.8 | 0.8571 | 0.8276 | 28 | 0.7789 | 0.8732 | 0.8234 | 0.9590 | | 0.0187 | 65.0 | 6890 | 0.1914 | 0.7284 | 0.8310 | 0.7763 | 71 | 0.7260 | 0.8154 | 0.7681 | 65 | 0.8790 | 0.92 | 0.8990 | 150 | 0.6923 | 0.8182 | 0.7500 | 33 | 0.8929 | 0.8929 | 0.8929 | 28 | 0.7989 | 0.8703 | 0.8331 | 0.9614 | | 0.016 | 66.0 | 6996 | 0.2022 | 0.7195 | 0.8310 | 0.7712 | 71 | 0.6923 | 0.8308 | 0.7552 | 65 | 0.8797 | 0.9267 | 0.9026 | 150 | 0.675 | 0.8182 | 0.7397 | 33 | 0.8 | 0.8571 | 0.8276 | 28 | 0.7809 | 0.8732 | 0.8245 | 0.9602 | | 0.0153 | 67.0 | 7102 | 0.1922 | 0.7125 | 0.8028 | 0.7550 | 71 | 0.7067 | 0.8154 | 0.7571 | 65 | 0.8812 | 0.94 | 0.9097 | 150 | 0.7436 | 0.8788 | 0.8056 | 33 | 0.8 | 0.8571 | 0.8276 | 
28 | 0.7917 | 0.8761 | 0.8317 | 0.9607 | | 0.0165 | 68.0 | 7208 | 0.2077 | 0.7215 | 0.8028 | 0.76 | 71 | 0.6747 | 0.8615 | 0.7568 | 65 | 0.8854 | 0.9267 | 0.9055 | 150 | 0.675 | 0.8182 | 0.7397 | 33 | 0.8571 | 0.8571 | 0.8571 | 28 | 0.7829 | 0.8732 | 0.8256 | 0.9610 | | 0.0159 | 69.0 | 7314 | 0.2018 | 0.7125 | 0.8028 | 0.7550 | 71 | 0.6835 | 0.8308 | 0.75 | 65 | 0.8861 | 0.9333 | 0.9091 | 150 | 0.725 | 0.8788 | 0.7945 | 33 | 0.8621 | 0.8929 | 0.8772 | 28 | 0.7902 | 0.8790 | 0.8322 | 0.9619 | | 0.0151 | 70.0 | 7420 | 0.2193 | 0.6867 | 0.8028 | 0.7403 | 71 | 0.7013 | 0.8308 | 0.7606 | 65 | 0.8910 | 0.9267 | 0.9085 | 150 | 0.675 | 0.8182 | 0.7397 | 33 | 0.8 | 0.8571 | 0.8276 | 28 | 0.7798 | 0.8674 | 0.8213 | 0.9588 | | 0.0149 | 71.0 | 7526 | 0.2117 | 0.7284 | 0.8310 | 0.7763 | 71 | 0.6914 | 0.8615 | 0.7671 | 65 | 0.8861 | 0.9333 | 0.9091 | 150 | 0.7179 | 0.8485 | 0.7778 | 33 | 0.8276 | 0.8571 | 0.8421 | 28 | 0.7912 | 0.8847 | 0.8354 | 0.9614 | | 0.0152 | 72.0 | 7632 | 0.1995 | 0.7342 | 0.8169 | 0.7733 | 71 | 0.7143 | 0.8462 | 0.7746 | 65 | 0.8846 | 0.92 | 0.9020 | 150 | 0.7632 | 0.8788 | 0.8169 | 33 | 0.8519 | 0.8214 | 0.8364 | 28 | 0.8037 | 0.8732 | 0.8370 | 0.9624 | | 0.0149 | 73.0 | 7738 | 0.2208 | 0.7024 | 0.8310 | 0.7613 | 71 | 0.6829 | 0.8615 | 0.7619 | 65 | 0.8868 | 0.94 | 0.9126 | 150 | 0.7179 | 0.8485 | 0.7778 | 33 | 0.8276 | 0.8571 | 0.8421 | 28 | 0.7837 | 0.8876 | 0.8324 | 0.9610 | | 0.0141 | 74.0 | 7844 | 0.2141 | 0.7273 | 0.7887 | 0.7568 | 71 | 0.7297 | 0.8308 | 0.7770 | 65 | 0.8924 | 0.94 | 0.9156 | 150 | 0.7179 | 0.8485 | 0.7778 | 33 | 0.8276 | 0.8571 | 0.8421 | 28 | 0.8037 | 0.8732 | 0.8370 | 0.9624 | | 0.0142 | 75.0 | 7950 | 0.2097 | 0.7108 | 0.8310 | 0.7662 | 71 | 0.6582 | 0.8 | 0.7222 | 65 | 0.8854 | 0.9267 | 0.9055 | 150 | 0.7436 | 0.8788 | 0.8056 | 33 | 0.8621 | 0.8929 | 0.8772 | 28 | 0.7855 | 0.8761 | 0.8283 | 0.9610 | | 0.0132 | 76.0 | 8056 | 0.2149 | 0.7160 | 0.8169 | 0.7632 | 71 | 0.6974 | 0.8154 | 0.7518 | 65 | 0.8812 | 0.94 | 0.9097 | 150 | 0.7436 | 0.8788 | 0.8056 | 33 | 0.8276 | 0.8571 | 0.8421 | 28 | 0.7922 | 0.8790 | 0.8333 | 0.9619 | | 0.0132 | 77.0 | 8162 | 0.2158 | 0.6905 | 0.8169 | 0.7484 | 71 | 0.7105 | 0.8308 | 0.7660 | 65 | 0.8868 | 0.94 | 0.9126 | 150 | 0.6923 | 0.8182 | 0.7500 | 33 | 0.8929 | 0.8929 | 0.8929 | 28 | 0.7902 | 0.8790 | 0.8322 | 0.9607 | | 0.0141 | 78.0 | 8268 | 0.2088 | 0.7125 | 0.8028 | 0.7550 | 71 | 0.7013 | 0.8308 | 0.7606 | 65 | 0.8846 | 0.92 | 0.9020 | 150 | 0.7 | 0.8485 | 0.7671 | 33 | 0.8276 | 0.8571 | 0.8421 | 28 | 0.7880 | 0.8674 | 0.8258 | 0.9607 | | 0.0117 | 79.0 | 8374 | 0.2092 | 0.7125 | 0.8028 | 0.7550 | 71 | 0.7361 | 0.8154 | 0.7737 | 65 | 0.8861 | 0.9333 | 0.9091 | 150 | 0.7436 | 0.8788 | 0.8056 | 33 | 0.8276 | 0.8571 | 0.8421 | 28 | 0.8016 | 0.8732 | 0.8359 | 0.9614 | | 0.0127 | 80.0 | 8480 | 0.2205 | 0.7 | 0.7887 | 0.7417 | 71 | 0.6835 | 0.8308 | 0.75 | 65 | 0.8861 | 0.9333 | 0.9091 | 150 | 0.7179 | 0.8485 | 0.7778 | 33 | 0.8889 | 0.8571 | 0.8727 | 28 | 0.7885 | 0.8703 | 0.8274 | 0.9588 | | 0.0134 | 81.0 | 8586 | 0.2093 | 0.7215 | 0.8028 | 0.76 | 71 | 0.7361 | 0.8154 | 0.7737 | 65 | 0.8861 | 0.9333 | 0.9091 | 150 | 0.7436 | 0.8788 | 0.8056 | 33 | 0.8276 | 0.8571 | 0.8421 | 28 | 0.8037 | 0.8732 | 0.8370 | 0.9631 | | 0.0138 | 82.0 | 8692 | 0.2078 | 0.725 | 0.8169 | 0.7682 | 71 | 0.7105 | 0.8308 | 0.7660 | 65 | 0.8854 | 0.9267 | 0.9055 | 150 | 0.7436 | 0.8788 | 0.8056 | 33 | 0.8276 | 0.8571 | 0.8421 | 28 | 0.7979 | 0.8761 | 0.8352 | 0.9612 | | 0.0129 | 83.0 | 8798 | 0.2170 | 0.7 | 0.7887 | 0.7417 | 71 | 0.7051 | 0.8462 | 
0.7692 | 65 | 0.8742 | 0.9267 | 0.8997 | 150 | 0.7179 | 0.8485 | 0.7778 | 33 | 0.8276 | 0.8571 | 0.8421 | 28 | 0.7844 | 0.8703 | 0.8251 | 0.9600 | | 0.0119 | 84.0 | 8904 | 0.2103 | 0.7195 | 0.8310 | 0.7712 | 71 | 0.7067 | 0.8154 | 0.7571 | 65 | 0.8734 | 0.92 | 0.8961 | 150 | 0.7436 | 0.8788 | 0.8056 | 33 | 0.8276 | 0.8571 | 0.8421 | 28 | 0.7911 | 0.8732 | 0.8301 | 0.9612 | | 0.0117 | 85.0 | 9010 | 0.2209 | 0.7160 | 0.8169 | 0.7632 | 71 | 0.6951 | 0.8769 | 0.7755 | 65 | 0.8868 | 0.94 | 0.9126 | 150 | 0.725 | 0.8788 | 0.7945 | 33 | 0.7931 | 0.8214 | 0.8070 | 28 | 0.7877 | 0.8876 | 0.8347 | 0.9597 | | 0.0129 | 86.0 | 9116 | 0.2100 | 0.7160 | 0.8169 | 0.7632 | 71 | 0.7105 | 0.8308 | 0.7660 | 65 | 0.8924 | 0.94 | 0.9156 | 150 | 0.7436 | 0.8788 | 0.8056 | 33 | 0.8 | 0.8571 | 0.8276 | 28 | 0.7969 | 0.8818 | 0.8372 | 0.9619 | | 0.0107 | 87.0 | 9222 | 0.2151 | 0.7125 | 0.8028 | 0.7550 | 71 | 0.7051 | 0.8462 | 0.7692 | 65 | 0.8924 | 0.94 | 0.9156 | 150 | 0.7436 | 0.8788 | 0.8056 | 33 | 0.8276 | 0.8571 | 0.8421 | 28 | 0.7969 | 0.8818 | 0.8372 | 0.9617 | | 0.0121 | 88.0 | 9328 | 0.2126 | 0.7342 | 0.8169 | 0.7733 | 71 | 0.7105 | 0.8308 | 0.7660 | 65 | 0.8790 | 0.92 | 0.8990 | 150 | 0.7179 | 0.8485 | 0.7778 | 33 | 0.8571 | 0.8571 | 0.8571 | 28 | 0.7968 | 0.8703 | 0.8320 | 0.9614 | | 0.0104 | 89.0 | 9434 | 0.2102 | 0.7342 | 0.8169 | 0.7733 | 71 | 0.7333 | 0.8462 | 0.7857 | 65 | 0.8861 | 0.9333 | 0.9091 | 150 | 0.7368 | 0.8485 | 0.7887 | 33 | 0.8276 | 0.8571 | 0.8421 | 28 | 0.8047 | 0.8790 | 0.8402 | 0.9631 | | 0.0114 | 90.0 | 9540 | 0.2103 | 0.725 | 0.8169 | 0.7682 | 71 | 0.7397 | 0.8308 | 0.7826 | 65 | 0.8805 | 0.9333 | 0.9061 | 150 | 0.7179 | 0.8485 | 0.7778 | 33 | 0.8571 | 0.8571 | 0.8571 | 28 | 0.8021 | 0.8761 | 0.8375 | 0.9629 | | 0.0105 | 91.0 | 9646 | 0.2144 | 0.7160 | 0.8169 | 0.7632 | 71 | 0.7297 | 0.8308 | 0.7770 | 65 | 0.8805 | 0.9333 | 0.9061 | 150 | 0.7436 | 0.8788 | 0.8056 | 33 | 0.8276 | 0.8571 | 0.8421 | 28 | 0.7984 | 0.8790 | 0.8368 | 0.9629 | | 0.011 | 92.0 | 9752 | 0.2228 | 0.7037 | 0.8028 | 0.75 | 71 | 0.7051 | 0.8462 | 0.7692 | 65 | 0.8924 | 0.94 | 0.9156 | 150 | 0.7436 | 0.8788 | 0.8056 | 33 | 0.8 | 0.8571 | 0.8276 | 28 | 0.7927 | 0.8818 | 0.8349 | 0.9607 | | 0.0107 | 93.0 | 9858 | 0.2212 | 0.7160 | 0.8169 | 0.7632 | 71 | 0.6923 | 0.8308 | 0.7552 | 65 | 0.8924 | 0.94 | 0.9156 | 150 | 0.7436 | 0.8788 | 0.8056 | 33 | 0.8 | 0.8571 | 0.8276 | 28 | 0.7927 | 0.8818 | 0.8349 | 0.9610 | | 0.009 | 94.0 | 9964 | 0.2232 | 0.7160 | 0.8169 | 0.7632 | 71 | 0.6875 | 0.8462 | 0.7586 | 65 | 0.8924 | 0.94 | 0.9156 | 150 | 0.7436 | 0.8788 | 0.8056 | 33 | 0.8 | 0.8571 | 0.8276 | 28 | 0.7912 | 0.8847 | 0.8354 | 0.9619 | | 0.0109 | 95.0 | 10070 | 0.2274 | 0.6951 | 0.8028 | 0.7451 | 71 | 0.6923 | 0.8308 | 0.7552 | 65 | 0.8917 | 0.9333 | 0.9121 | 150 | 0.7179 | 0.8485 | 0.7778 | 33 | 0.8621 | 0.8929 | 0.8772 | 28 | 0.7896 | 0.8761 | 0.8306 | 0.9593 | | 0.0098 | 96.0 | 10176 | 0.2233 | 0.7160 | 0.8169 | 0.7632 | 71 | 0.7051 | 0.8462 | 0.7692 | 65 | 0.8924 | 0.94 | 0.9156 | 150 | 0.7179 | 0.8485 | 0.7778 | 33 | 0.8621 | 0.8929 | 0.8772 | 28 | 0.7974 | 0.8847 | 0.8388 | 0.9614 | | 0.0103 | 97.0 | 10282 | 0.2204 | 0.7125 | 0.8028 | 0.7550 | 71 | 0.6962 | 0.8462 | 0.7639 | 65 | 0.8805 | 0.9333 | 0.9061 | 150 | 0.7179 | 0.8485 | 0.7778 | 33 | 0.8 | 0.8571 | 0.8276 | 28 | 0.7855 | 0.8761 | 0.8283 | 0.9610 | | 0.0102 | 98.0 | 10388 | 0.2219 | 0.725 | 0.8169 | 0.7682 | 71 | 0.6962 | 0.8462 | 0.7639 | 65 | 0.8924 | 0.94 | 0.9156 | 150 | 0.7179 | 0.8485 | 0.7778 | 33 | 0.8 | 0.8571 | 0.8276 | 28 | 0.7927 | 0.8818 | 
0.8349 | 0.9612 | | 0.0094 | 99.0 | 10494 | 0.2234 | 0.7125 | 0.8028 | 0.7550 | 71 | 0.6962 | 0.8462 | 0.7639 | 65 | 0.8924 | 0.94 | 0.9156 | 150 | 0.7179 | 0.8485 | 0.7778 | 33 | 0.8 | 0.8571 | 0.8276 | 28 | 0.7902 | 0.8790 | 0.8322 | 0.9610 | | 0.0098 | 100.0 | 10600 | 0.2237 | 0.725 | 0.8169 | 0.7682 | 71 | 0.6962 | 0.8462 | 0.7639 | 65 | 0.8924 | 0.94 | 0.9156 | 150 | 0.7179 | 0.8485 | 0.7778 | 33 | 0.8 | 0.8571 | 0.8276 | 28 | 0.7927 | 0.8818 | 0.8349 | 0.9612 | ### Framework versions - Transformers 4.39.3 - Pytorch 2.3.0+cu121 - Datasets 2.19.1 - Tokenizers 0.15.2
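The card above never shows how to load the model. A hedged sketch, assuming the repo hosts a UniPELT adapter for `indolem/indobert-base-uncased` saved with the AdapterHub `adapters` library (the exact loading call depends on the library version, and the repo could instead hold a full checkpoint):

```python
from adapters import AutoAdapterModel
from transformers import AutoTokenizer

base = "indolem/indobert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoAdapterModel.from_pretrained(base)

# Load the UniPELT adapter (repo id taken from this entry) and activate it.
adapter_name = model.load_adapter("apwic/nerugm-unipelt-3")
model.set_active_adapters(adapter_name)

# Per-token logits for NER, provided the adapter ships a tagging head.
inputs = tokenizer("Budi berangkat ke Yogyakarta kemarin", return_tensors="pt")
outputs = model(**inputs)
```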
josejointriple/brand_classification_2_20240604
josejointriple
2024-06-03T23:31:50Z
183
0
transformers
[ "transformers", "safetensors", "bert", "text-classification", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2024-06-03T23:31:42Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
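The generated card leaves every section blank; a minimal quick-start sketch, assuming the checkpoint is a standard BERT sequence classifier (the label set is whatever the undocumented training data defined):

```python
from transformers import pipeline

# Repo id taken from this entry; labels come from the training data.
clf = pipeline("text-classification", model="josejointriple/brand_classification_2_20240604")
print(clf("Sample product title mentioning a brand"))
```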
apwic/nerugm-unipelt-2
apwic
2024-06-03T23:27:26Z
0
0
null
[ "tensorboard", "generated_from_trainer", "id", "base_model:indolem/indobert-base-uncased", "base_model:finetune:indolem/indobert-base-uncased", "license:mit", "region:us" ]
null
2024-05-28T02:00:38Z
--- language: - id license: mit base_model: indolem/indobert-base-uncased tags: - generated_from_trainer model-index: - name: nerugm-unipelt-2 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # nerugm-unipelt-2 This model is a fine-tuned version of [indolem/indobert-base-uncased](https://huggingface.co/indolem/indobert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.2436 - Location Precision: 0.7973 - Location Recall: 0.8194 - Location F1: 0.8082 - Location Number: 72 - Organization Precision: 0.7045 - Organization Recall: 0.8267 - Organization F1: 0.7607 - Organization Number: 75 - Person Precision: 0.8590 - Person Recall: 0.9371 - Person F1: 0.8963 - Person Number: 143 - Quantity Precision: 0.6552 - Quantity Recall: 0.8261 - Quantity F1: 0.7308 - Quantity Number: 23 - Time Precision: 0.7857 - Time Recall: 0.8462 - Time F1: 0.8148 - Time Number: 26 - Overall Precision: 0.7893 - Overall Recall: 0.8732 - Overall F1: 0.8291 - Overall Accuracy: 0.9595 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 100.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Quantity Precision | Quantity Recall | Quantity F1 | Quantity Number | Time Precision | Time Recall | Time F1 | Time Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy | |:-------------:|:-----:|:-----:|:---------------:|:------------------:|:---------------:|:-----------:|:---------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:----------------:|:-------------:|:---------:|:-------------:|:------------------:|:---------------:|:-----------:|:---------------:|:--------------:|:-----------:|:-------:|:-----------:|:-----------------:|:--------------:|:----------:|:----------------:| | 0.9413 | 1.0 | 106 | 0.5834 | 0.0 | 0.0 | 0.0 | 72 | 0.0 | 0.0 | 0.0 | 75 | 0.0 | 0.0 | 0.0 | 143 | 0.0 | 0.0 | 0.0 | 23 | 0.0 | 0.0 | 0.0 | 26 | 0.0 | 0.0 | 0.0 | 0.8451 | | 0.501 | 2.0 | 212 | 0.3319 | 0.4143 | 0.4028 | 0.4085 | 72 | 0.0930 | 0.0533 | 0.0678 | 75 | 0.5421 | 0.8112 | 0.6499 | 143 | 0.0 | 0.0 | 0.0 | 23 | 0.6667 | 0.7692 | 0.7143 | 26 | 0.4507 | 0.4985 | 0.4734 | 0.9044 | | 0.2907 | 3.0 | 318 | 0.2038 | 0.6064 | 0.7917 | 0.6867 | 72 | 0.4655 | 0.72 | 0.5654 | 75 | 0.7679 | 0.9021 | 0.8296 | 143 | 0.2973 | 0.4783 | 0.3667 | 23 | 0.7692 | 0.7692 | 0.7692 | 26 | 0.6145 | 0.7994 | 0.6949 | 0.9352 | | 0.1997 | 4.0 | 424 | 0.1949 | 0.6629 | 0.8194 | 0.7329 | 72 | 0.4790 | 0.76 | 0.5876 | 75 | 0.7733 | 0.9301 | 0.8444 | 143 | 0.4865 | 0.7826 | 0.6000 | 23 | 0.5946 | 0.8462 | 0.6984 | 26 | 0.6366 | 0.8525 | 0.7289 | 0.9340 | | 0.1638 | 5.0 | 530 | 0.1405 | 0.7125 | 0.7917 | 0.75 | 72 | 0.6105 | 0.7733 | 0.6824 | 75 | 0.8447 | 0.9510 | 0.8947 | 143 | 
0.6333 | 0.8261 | 0.7170 | 23 | 0.8462 | 0.8462 | 0.8462 | 26 | 0.7449 | 0.8614 | 0.7989 | 0.9542 | | 0.1475 | 6.0 | 636 | 0.1376 | 0.6813 | 0.8611 | 0.7607 | 72 | 0.6038 | 0.8533 | 0.7072 | 75 | 0.8654 | 0.9441 | 0.9030 | 143 | 0.5862 | 0.7391 | 0.6538 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.7317 | 0.8850 | 0.8011 | 0.9542 | | 0.1352 | 7.0 | 742 | 0.1379 | 0.7342 | 0.8056 | 0.7682 | 72 | 0.6364 | 0.84 | 0.7241 | 75 | 0.8535 | 0.9371 | 0.8933 | 143 | 0.4595 | 0.7391 | 0.5667 | 23 | 0.8148 | 0.8462 | 0.8302 | 26 | 0.7368 | 0.8673 | 0.7967 | 0.9550 | | 0.1229 | 8.0 | 848 | 0.1327 | 0.7284 | 0.8194 | 0.7712 | 72 | 0.7143 | 0.8 | 0.7547 | 75 | 0.8385 | 0.9441 | 0.8882 | 143 | 0.6296 | 0.7391 | 0.68 | 23 | 0.6774 | 0.8077 | 0.7368 | 26 | 0.7604 | 0.8614 | 0.8077 | 0.9577 | | 0.119 | 9.0 | 954 | 0.1360 | 0.75 | 0.8333 | 0.7895 | 72 | 0.6809 | 0.8533 | 0.7574 | 75 | 0.8535 | 0.9371 | 0.8933 | 143 | 0.6429 | 0.7826 | 0.7059 | 23 | 0.6897 | 0.7692 | 0.7273 | 26 | 0.7629 | 0.8732 | 0.8143 | 0.9572 | | 0.1125 | 10.0 | 1060 | 0.1273 | 0.7662 | 0.8194 | 0.7919 | 72 | 0.6591 | 0.7733 | 0.7117 | 75 | 0.875 | 0.9301 | 0.9017 | 143 | 0.5667 | 0.7391 | 0.6415 | 23 | 0.7143 | 0.7692 | 0.7407 | 26 | 0.7653 | 0.8466 | 0.8039 | 0.9567 | | 0.1058 | 11.0 | 1166 | 0.1296 | 0.7439 | 0.8472 | 0.7922 | 72 | 0.6702 | 0.84 | 0.7456 | 75 | 0.8506 | 0.9161 | 0.8822 | 143 | 0.6 | 0.7826 | 0.6792 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.7603 | 0.8702 | 0.8116 | 0.9582 | | 0.0983 | 12.0 | 1272 | 0.1320 | 0.7564 | 0.8194 | 0.7867 | 72 | 0.7 | 0.84 | 0.7636 | 75 | 0.8645 | 0.9371 | 0.8993 | 143 | 0.6207 | 0.7826 | 0.6923 | 23 | 0.8462 | 0.8462 | 0.8462 | 26 | 0.7831 | 0.8732 | 0.8257 | 0.9597 | | 0.0927 | 13.0 | 1378 | 0.1346 | 0.7595 | 0.8333 | 0.7947 | 72 | 0.6882 | 0.8533 | 0.7619 | 75 | 0.8428 | 0.9371 | 0.8874 | 143 | 0.72 | 0.7826 | 0.7500 | 23 | 0.8462 | 0.8462 | 0.8462 | 26 | 0.7801 | 0.8791 | 0.8266 | 0.9602 | | 0.0903 | 14.0 | 1484 | 0.1368 | 0.7662 | 0.8194 | 0.7919 | 72 | 0.7326 | 0.84 | 0.7826 | 75 | 0.8438 | 0.9441 | 0.8911 | 143 | 0.6 | 0.7826 | 0.6792 | 23 | 0.8077 | 0.8077 | 0.8077 | 26 | 0.7810 | 0.8732 | 0.8245 | 0.9600 | | 0.0838 | 15.0 | 1590 | 0.1399 | 0.8026 | 0.8472 | 0.8243 | 72 | 0.6915 | 0.8667 | 0.7692 | 75 | 0.8397 | 0.9161 | 0.8763 | 143 | 0.6 | 0.7826 | 0.6792 | 23 | 0.8077 | 0.8077 | 0.8077 | 26 | 0.7749 | 0.8732 | 0.8211 | 0.9577 | | 0.0821 | 16.0 | 1696 | 0.1585 | 0.7439 | 0.8472 | 0.7922 | 72 | 0.6442 | 0.8933 | 0.7486 | 75 | 0.8481 | 0.9371 | 0.8904 | 143 | 0.5484 | 0.7391 | 0.6296 | 23 | 0.75 | 0.8077 | 0.7778 | 26 | 0.7444 | 0.8850 | 0.8086 | 0.9530 | | 0.0793 | 17.0 | 1802 | 0.1424 | 0.7468 | 0.8194 | 0.7815 | 72 | 0.7126 | 0.8267 | 0.7654 | 75 | 0.8408 | 0.9231 | 0.8800 | 143 | 0.5152 | 0.7391 | 0.6071 | 23 | 0.8462 | 0.8462 | 0.8462 | 26 | 0.7644 | 0.8614 | 0.8100 | 0.9570 | | 0.0762 | 18.0 | 1908 | 0.1426 | 0.8116 | 0.7778 | 0.7943 | 72 | 0.7262 | 0.8133 | 0.7673 | 75 | 0.8618 | 0.9161 | 0.8881 | 143 | 0.5625 | 0.7826 | 0.6545 | 23 | 0.75 | 0.8077 | 0.7778 | 26 | 0.7863 | 0.8466 | 0.8153 | 0.9577 | | 0.0727 | 19.0 | 2014 | 0.1413 | 0.7973 | 0.8194 | 0.8082 | 72 | 0.7174 | 0.88 | 0.7904 | 75 | 0.8627 | 0.9231 | 0.8919 | 143 | 0.6 | 0.7826 | 0.6792 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.7878 | 0.8761 | 0.8296 | 0.9602 | | 0.0696 | 20.0 | 2120 | 0.1565 | 0.8082 | 0.8194 | 0.8138 | 72 | 0.6809 | 0.8533 | 0.7574 | 75 | 0.8491 | 0.9441 | 0.8940 | 143 | 0.5806 | 0.7826 | 0.6667 | 23 | 0.7692 | 0.7692 | 0.7692 | 26 | 0.7728 | 0.8732 | 0.8199 | 0.9565 | | 0.0663 | 21.0 | 2226 | 0.1552 | 
0.75 | 0.8333 | 0.7895 | 72 | 0.7126 | 0.8267 | 0.7654 | 75 | 0.8438 | 0.9441 | 0.8911 | 143 | 0.5 | 0.6957 | 0.5818 | 23 | 0.8519 | 0.8846 | 0.8679 | 26 | 0.7668 | 0.8732 | 0.8166 | 0.9570 | | 0.0648 | 22.0 | 2332 | 0.1632 | 0.7595 | 0.8333 | 0.7947 | 72 | 0.6321 | 0.8933 | 0.7403 | 75 | 0.8627 | 0.9231 | 0.8919 | 143 | 0.5143 | 0.7826 | 0.6207 | 23 | 0.7 | 0.8077 | 0.75 | 26 | 0.7395 | 0.8791 | 0.8032 | 0.9517 | | 0.0636 | 23.0 | 2438 | 0.1434 | 0.7808 | 0.7917 | 0.7862 | 72 | 0.6771 | 0.8667 | 0.7602 | 75 | 0.8733 | 0.9161 | 0.8942 | 143 | 0.6207 | 0.7826 | 0.6923 | 23 | 0.8846 | 0.8846 | 0.8846 | 26 | 0.7861 | 0.8673 | 0.8247 | 0.9587 | | 0.0591 | 24.0 | 2544 | 0.1626 | 0.7703 | 0.7917 | 0.7808 | 72 | 0.7209 | 0.8267 | 0.7702 | 75 | 0.8581 | 0.9301 | 0.8926 | 143 | 0.4848 | 0.6957 | 0.5714 | 23 | 0.7143 | 0.7692 | 0.7407 | 26 | 0.7660 | 0.8496 | 0.8056 | 0.9547 | | 0.058 | 25.0 | 2650 | 0.1489 | 0.7945 | 0.8056 | 0.8 | 72 | 0.6837 | 0.8933 | 0.7746 | 75 | 0.8693 | 0.9301 | 0.8986 | 143 | 0.5667 | 0.7391 | 0.6415 | 23 | 0.7586 | 0.8462 | 0.8 | 26 | 0.7755 | 0.8761 | 0.8227 | 0.9597 | | 0.055 | 26.0 | 2756 | 0.1739 | 0.8194 | 0.8194 | 0.8194 | 72 | 0.66 | 0.88 | 0.7543 | 75 | 0.8636 | 0.9301 | 0.8956 | 143 | 0.5152 | 0.7391 | 0.6071 | 23 | 0.7143 | 0.7692 | 0.7407 | 26 | 0.7623 | 0.8702 | 0.8127 | 0.9545 | | 0.0548 | 27.0 | 2862 | 0.1683 | 0.76 | 0.7917 | 0.7755 | 72 | 0.6957 | 0.8533 | 0.7665 | 75 | 0.8365 | 0.9301 | 0.8808 | 143 | 0.5484 | 0.7391 | 0.6296 | 23 | 0.7407 | 0.7692 | 0.7547 | 26 | 0.7578 | 0.8584 | 0.8050 | 0.9550 | | 0.0508 | 28.0 | 2968 | 0.1731 | 0.7895 | 0.8333 | 0.8108 | 72 | 0.6842 | 0.8667 | 0.7647 | 75 | 0.8481 | 0.9371 | 0.8904 | 143 | 0.72 | 0.7826 | 0.7500 | 23 | 0.625 | 0.7692 | 0.6897 | 26 | 0.7694 | 0.8761 | 0.8193 | 0.9547 | | 0.0508 | 29.0 | 3074 | 0.1688 | 0.7662 | 0.8194 | 0.7919 | 72 | 0.7222 | 0.8667 | 0.7879 | 75 | 0.8535 | 0.9371 | 0.8933 | 143 | 0.5152 | 0.7391 | 0.6071 | 23 | 0.7143 | 0.7692 | 0.7407 | 26 | 0.7662 | 0.8702 | 0.8149 | 0.9560 | | 0.0504 | 30.0 | 3180 | 0.1635 | 0.7639 | 0.7639 | 0.7639 | 72 | 0.7 | 0.84 | 0.7636 | 75 | 0.875 | 0.9301 | 0.9017 | 143 | 0.7308 | 0.8261 | 0.7755 | 23 | 0.7692 | 0.7692 | 0.7692 | 26 | 0.7923 | 0.8555 | 0.8227 | 0.9585 | | 0.0453 | 31.0 | 3286 | 0.1818 | 0.7763 | 0.8194 | 0.7973 | 72 | 0.6915 | 0.8667 | 0.7692 | 75 | 0.8471 | 0.9301 | 0.8867 | 143 | 0.4848 | 0.6957 | 0.5714 | 23 | 0.7241 | 0.8077 | 0.7636 | 26 | 0.7558 | 0.8673 | 0.8077 | 0.9535 | | 0.0452 | 32.0 | 3392 | 0.1650 | 0.7973 | 0.8194 | 0.8082 | 72 | 0.6889 | 0.8267 | 0.7515 | 75 | 0.88 | 0.9231 | 0.9010 | 143 | 0.4848 | 0.6957 | 0.5714 | 23 | 0.7586 | 0.8462 | 0.8 | 26 | 0.7739 | 0.8584 | 0.8140 | 0.9565 | | 0.044 | 33.0 | 3498 | 0.1832 | 0.8 | 0.8333 | 0.8163 | 72 | 0.6373 | 0.8667 | 0.7345 | 75 | 0.8618 | 0.9161 | 0.8881 | 143 | 0.5 | 0.6957 | 0.5818 | 23 | 0.75 | 0.8077 | 0.7778 | 26 | 0.7532 | 0.8643 | 0.8049 | 0.9522 | | 0.0407 | 34.0 | 3604 | 0.1828 | 0.7662 | 0.8194 | 0.7919 | 72 | 0.6667 | 0.88 | 0.7586 | 75 | 0.8599 | 0.9441 | 0.9 | 143 | 0.5484 | 0.7391 | 0.6296 | 23 | 0.7143 | 0.7692 | 0.7407 | 26 | 0.7577 | 0.8761 | 0.8126 | 0.9550 | | 0.0401 | 35.0 | 3710 | 0.1778 | 0.8082 | 0.8194 | 0.8138 | 72 | 0.6774 | 0.84 | 0.75 | 75 | 0.8859 | 0.9231 | 0.9041 | 143 | 0.5484 | 0.7391 | 0.6296 | 23 | 0.8 | 0.7692 | 0.7843 | 26 | 0.7844 | 0.8584 | 0.8197 | 0.9552 | | 0.0398 | 36.0 | 3816 | 0.1749 | 0.8108 | 0.8333 | 0.8219 | 72 | 0.6813 | 0.8267 | 0.7470 | 75 | 0.8618 | 0.9161 | 0.8881 | 143 | 0.5625 | 0.7826 | 0.6545 | 23 | 0.7407 | 0.7692 | 
0.7547 | 26 | 0.7739 | 0.8584 | 0.8140 | 0.9567 | | 0.0402 | 37.0 | 3922 | 0.1733 | 0.7536 | 0.7222 | 0.7376 | 72 | 0.7011 | 0.8133 | 0.7531 | 75 | 0.8792 | 0.9161 | 0.8973 | 143 | 0.5862 | 0.7391 | 0.6538 | 23 | 0.8462 | 0.8462 | 0.8462 | 26 | 0.7861 | 0.8348 | 0.8097 | 0.9572 | | 0.0365 | 38.0 | 4028 | 0.1821 | 0.7576 | 0.6944 | 0.7246 | 72 | 0.7241 | 0.84 | 0.7778 | 75 | 0.8571 | 0.9231 | 0.8889 | 143 | 0.6 | 0.7826 | 0.6792 | 23 | 0.7407 | 0.7692 | 0.7547 | 26 | 0.7775 | 0.8348 | 0.8051 | 0.9565 | | 0.0366 | 39.0 | 4134 | 0.1903 | 0.7342 | 0.8056 | 0.7682 | 72 | 0.7126 | 0.8267 | 0.7654 | 75 | 0.8590 | 0.9371 | 0.8963 | 143 | 0.6552 | 0.8261 | 0.7308 | 23 | 0.7143 | 0.7692 | 0.7407 | 26 | 0.7731 | 0.8643 | 0.8162 | 0.9557 | | 0.0364 | 40.0 | 4240 | 0.2068 | 0.7160 | 0.8056 | 0.7582 | 72 | 0.6562 | 0.84 | 0.7368 | 75 | 0.8261 | 0.9301 | 0.8750 | 143 | 0.5758 | 0.8261 | 0.6786 | 23 | 0.6667 | 0.8462 | 0.7458 | 26 | 0.7302 | 0.8702 | 0.7941 | 0.9490 | | 0.033 | 41.0 | 4346 | 0.1836 | 0.7838 | 0.8056 | 0.7945 | 72 | 0.7159 | 0.84 | 0.7730 | 75 | 0.8636 | 0.9301 | 0.8956 | 143 | 0.6429 | 0.7826 | 0.7059 | 23 | 0.8148 | 0.8462 | 0.8302 | 26 | 0.7925 | 0.8673 | 0.8282 | 0.9582 | | 0.0317 | 42.0 | 4452 | 0.1928 | 0.8 | 0.8333 | 0.8163 | 72 | 0.7176 | 0.8133 | 0.7625 | 75 | 0.8627 | 0.9231 | 0.8919 | 143 | 0.5161 | 0.6957 | 0.5926 | 23 | 0.7 | 0.8077 | 0.75 | 26 | 0.7754 | 0.8555 | 0.8135 | 0.9560 | | 0.0309 | 43.0 | 4558 | 0.1895 | 0.7722 | 0.8472 | 0.8079 | 72 | 0.7333 | 0.88 | 0.8 | 75 | 0.8693 | 0.9301 | 0.8986 | 143 | 0.5312 | 0.7391 | 0.6182 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.7827 | 0.8820 | 0.8294 | 0.9572 | | 0.0309 | 44.0 | 4664 | 0.1849 | 0.7867 | 0.8194 | 0.8027 | 72 | 0.7065 | 0.8667 | 0.7784 | 75 | 0.8824 | 0.9441 | 0.9122 | 143 | 0.5806 | 0.7826 | 0.6667 | 23 | 0.7333 | 0.8462 | 0.7857 | 26 | 0.7848 | 0.8820 | 0.8306 | 0.9587 | | 0.0292 | 45.0 | 4770 | 0.1803 | 0.8056 | 0.8056 | 0.8056 | 72 | 0.6818 | 0.8 | 0.7362 | 75 | 0.8710 | 0.9441 | 0.9060 | 143 | 0.5806 | 0.7826 | 0.6667 | 23 | 0.75 | 0.8077 | 0.7778 | 26 | 0.7807 | 0.8614 | 0.8191 | 0.9590 | | 0.0284 | 46.0 | 4876 | 0.1999 | 0.7973 | 0.8194 | 0.8082 | 72 | 0.7143 | 0.8667 | 0.7831 | 75 | 0.8654 | 0.9441 | 0.9030 | 143 | 0.4857 | 0.7391 | 0.5862 | 23 | 0.625 | 0.7692 | 0.6897 | 26 | 0.7629 | 0.8732 | 0.8143 | 0.9547 | | 0.0283 | 47.0 | 4982 | 0.1977 | 0.8056 | 0.8056 | 0.8056 | 72 | 0.7209 | 0.8267 | 0.7702 | 75 | 0.8774 | 0.9510 | 0.9128 | 143 | 0.5758 | 0.8261 | 0.6786 | 23 | 0.6897 | 0.7692 | 0.7273 | 26 | 0.7867 | 0.8702 | 0.8263 | 0.9587 | | 0.0262 | 48.0 | 5088 | 0.1941 | 0.7703 | 0.7917 | 0.7808 | 72 | 0.7412 | 0.84 | 0.7875 | 75 | 0.8874 | 0.9371 | 0.9116 | 143 | 0.6333 | 0.8261 | 0.7170 | 23 | 0.7692 | 0.7692 | 0.7692 | 26 | 0.8005 | 0.8643 | 0.8312 | 0.9597 | | 0.0273 | 49.0 | 5194 | 0.2066 | 0.8108 | 0.8333 | 0.8219 | 72 | 0.6966 | 0.8267 | 0.7561 | 75 | 0.8645 | 0.9371 | 0.8993 | 143 | 0.5625 | 0.7826 | 0.6545 | 23 | 0.7143 | 0.7692 | 0.7407 | 26 | 0.7778 | 0.8673 | 0.8201 | 0.9570 | | 0.0259 | 50.0 | 5300 | 0.2049 | 0.8108 | 0.8333 | 0.8219 | 72 | 0.7159 | 0.84 | 0.7730 | 75 | 0.8581 | 0.9301 | 0.8926 | 143 | 0.5625 | 0.7826 | 0.6545 | 23 | 0.7241 | 0.8077 | 0.7636 | 26 | 0.7804 | 0.8702 | 0.8229 | 0.9575 | | 0.0244 | 51.0 | 5406 | 0.2135 | 0.7703 | 0.7917 | 0.7808 | 72 | 0.6989 | 0.8667 | 0.7738 | 75 | 0.8581 | 0.9301 | 0.8926 | 143 | 0.5625 | 0.7826 | 0.6545 | 23 | 0.7143 | 0.7692 | 0.7407 | 26 | 0.7670 | 0.8643 | 0.8128 | 0.9555 | | 0.0251 | 52.0 | 5512 | 0.2014 | 0.7945 | 0.8056 | 0.8 | 72 | 0.7111 | 
0.8533 | 0.7758 | 75 | 0.8808 | 0.9301 | 0.9048 | 143 | 0.6552 | 0.8261 | 0.7308 | 23 | 0.6774 | 0.8077 | 0.7368 | 26 | 0.7888 | 0.8702 | 0.8275 | 0.9580 | | 0.0236 | 53.0 | 5618 | 0.2271 | 0.7838 | 0.8056 | 0.7945 | 72 | 0.68 | 0.9067 | 0.7771 | 75 | 0.8471 | 0.9301 | 0.8867 | 143 | 0.6333 | 0.8261 | 0.7170 | 23 | 0.6562 | 0.8077 | 0.7241 | 26 | 0.7608 | 0.8820 | 0.8169 | 0.9537 | | 0.0238 | 54.0 | 5724 | 0.2099 | 0.7945 | 0.8056 | 0.8 | 72 | 0.7191 | 0.8533 | 0.7805 | 75 | 0.8645 | 0.9371 | 0.8993 | 143 | 0.6552 | 0.8261 | 0.7308 | 23 | 0.6897 | 0.7692 | 0.7273 | 26 | 0.7867 | 0.8702 | 0.8263 | 0.9585 | | 0.0211 | 55.0 | 5830 | 0.2168 | 0.7895 | 0.8333 | 0.8108 | 72 | 0.6907 | 0.8933 | 0.7791 | 75 | 0.8535 | 0.9371 | 0.8933 | 143 | 0.5625 | 0.7826 | 0.6545 | 23 | 0.7333 | 0.8462 | 0.7857 | 26 | 0.7679 | 0.8879 | 0.8235 | 0.9550 | | 0.0204 | 56.0 | 5936 | 0.2145 | 0.8056 | 0.8056 | 0.8056 | 72 | 0.7097 | 0.88 | 0.7857 | 75 | 0.8506 | 0.9161 | 0.8822 | 143 | 0.6333 | 0.8261 | 0.7170 | 23 | 0.6452 | 0.7692 | 0.7018 | 26 | 0.7737 | 0.8673 | 0.8178 | 0.9565 | | 0.0212 | 57.0 | 6042 | 0.2186 | 0.8028 | 0.7917 | 0.7972 | 72 | 0.65 | 0.8667 | 0.7429 | 75 | 0.8718 | 0.9510 | 0.9097 | 143 | 0.6552 | 0.8261 | 0.7308 | 23 | 0.6875 | 0.8462 | 0.7586 | 26 | 0.7706 | 0.8820 | 0.8226 | 0.9560 | | 0.0203 | 58.0 | 6148 | 0.2017 | 0.7945 | 0.8056 | 0.8 | 72 | 0.7073 | 0.7733 | 0.7389 | 75 | 0.875 | 0.9301 | 0.9017 | 143 | 0.5625 | 0.7826 | 0.6545 | 23 | 0.7586 | 0.8462 | 0.8 | 26 | 0.7853 | 0.8525 | 0.8175 | 0.9580 | | 0.0186 | 59.0 | 6254 | 0.2211 | 0.8 | 0.8333 | 0.8163 | 72 | 0.6923 | 0.84 | 0.7590 | 75 | 0.8710 | 0.9441 | 0.9060 | 143 | 0.5455 | 0.7826 | 0.6429 | 23 | 0.6129 | 0.7308 | 0.6667 | 26 | 0.7662 | 0.8702 | 0.8149 | 0.9555 | | 0.0203 | 60.0 | 6360 | 0.2336 | 0.8158 | 0.8611 | 0.8378 | 72 | 0.6875 | 0.88 | 0.7719 | 75 | 0.8375 | 0.9371 | 0.8845 | 143 | 0.5625 | 0.7826 | 0.6545 | 23 | 0.625 | 0.7692 | 0.6897 | 26 | 0.7576 | 0.8850 | 0.8163 | 0.9547 | | 0.0192 | 61.0 | 6466 | 0.2162 | 0.8169 | 0.8056 | 0.8112 | 72 | 0.7262 | 0.8133 | 0.7673 | 75 | 0.8428 | 0.9371 | 0.8874 | 143 | 0.6552 | 0.8261 | 0.7308 | 23 | 0.7333 | 0.8462 | 0.7857 | 26 | 0.7882 | 0.8673 | 0.8258 | 0.9582 | | 0.0185 | 62.0 | 6572 | 0.2180 | 0.7945 | 0.8056 | 0.8 | 72 | 0.7174 | 0.88 | 0.7904 | 75 | 0.8581 | 0.9301 | 0.8926 | 143 | 0.5667 | 0.7391 | 0.6415 | 23 | 0.6897 | 0.7692 | 0.7273 | 26 | 0.7757 | 0.8673 | 0.8189 | 0.9557 | | 0.0174 | 63.0 | 6678 | 0.2137 | 0.7917 | 0.7917 | 0.7917 | 72 | 0.7262 | 0.8133 | 0.7673 | 75 | 0.8693 | 0.9301 | 0.8986 | 143 | 0.6786 | 0.8261 | 0.7451 | 23 | 0.75 | 0.8077 | 0.7778 | 26 | 0.7973 | 0.8584 | 0.8267 | 0.9587 | | 0.0162 | 64.0 | 6784 | 0.2324 | 0.7945 | 0.8056 | 0.8 | 72 | 0.6923 | 0.84 | 0.7590 | 75 | 0.8526 | 0.9301 | 0.8896 | 143 | 0.6786 | 0.8261 | 0.7451 | 23 | 0.7241 | 0.8077 | 0.7636 | 26 | 0.7798 | 0.8673 | 0.8212 | 0.9562 | | 0.0174 | 65.0 | 6890 | 0.2370 | 0.775 | 0.8611 | 0.8158 | 72 | 0.7033 | 0.8533 | 0.7711 | 75 | 0.8471 | 0.9301 | 0.8867 | 143 | 0.5625 | 0.7826 | 0.6545 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.7706 | 0.8820 | 0.8226 | 0.9555 | | 0.0177 | 66.0 | 6996 | 0.2165 | 0.8082 | 0.8194 | 0.8138 | 72 | 0.6966 | 0.8267 | 0.7561 | 75 | 0.8816 | 0.9371 | 0.9085 | 143 | 0.7037 | 0.8261 | 0.76 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.8022 | 0.8732 | 0.8362 | 0.9600 | | 0.0151 | 67.0 | 7102 | 0.2298 | 0.8082 | 0.8194 | 0.8138 | 72 | 0.7108 | 0.7867 | 0.7468 | 75 | 0.8408 | 0.9231 | 0.8800 | 143 | 0.6 | 0.7826 | 0.6792 | 23 | 0.7931 | 0.8846 | 0.8364 | 26 | 0.7823 | 
0.8584 | 0.8186 | 0.9565 | | 0.0188 | 68.0 | 7208 | 0.2234 | 0.7532 | 0.8056 | 0.7785 | 72 | 0.6739 | 0.8267 | 0.7425 | 75 | 0.875 | 0.9301 | 0.9017 | 143 | 0.6 | 0.7826 | 0.6792 | 23 | 0.7586 | 0.8462 | 0.8 | 26 | 0.7711 | 0.8643 | 0.8150 | 0.9565 | | 0.0151 | 69.0 | 7314 | 0.2399 | 0.7532 | 0.8056 | 0.7785 | 72 | 0.6989 | 0.8667 | 0.7738 | 75 | 0.8428 | 0.9371 | 0.8874 | 143 | 0.5484 | 0.7391 | 0.6296 | 23 | 0.7667 | 0.8846 | 0.8214 | 26 | 0.7615 | 0.8761 | 0.8148 | 0.9552 | | 0.0171 | 70.0 | 7420 | 0.2272 | 0.75 | 0.7917 | 0.7703 | 72 | 0.6860 | 0.7867 | 0.7329 | 75 | 0.8758 | 0.9371 | 0.9054 | 143 | 0.6333 | 0.8261 | 0.7170 | 23 | 0.7 | 0.8077 | 0.75 | 26 | 0.7733 | 0.8555 | 0.8123 | 0.9567 | | 0.0154 | 71.0 | 7526 | 0.2361 | 0.75 | 0.7917 | 0.7703 | 72 | 0.6848 | 0.84 | 0.7545 | 75 | 0.8481 | 0.9371 | 0.8904 | 143 | 0.5484 | 0.7391 | 0.6296 | 23 | 0.7586 | 0.8462 | 0.8 | 26 | 0.7591 | 0.8643 | 0.8083 | 0.9552 | | 0.0147 | 72.0 | 7632 | 0.2276 | 0.7838 | 0.8056 | 0.7945 | 72 | 0.6848 | 0.84 | 0.7545 | 75 | 0.8816 | 0.9371 | 0.9085 | 143 | 0.6552 | 0.8261 | 0.7308 | 23 | 0.7333 | 0.8462 | 0.7857 | 26 | 0.7851 | 0.8732 | 0.8268 | 0.9585 | | 0.0134 | 73.0 | 7738 | 0.2310 | 0.76 | 0.7917 | 0.7755 | 72 | 0.7241 | 0.84 | 0.7778 | 75 | 0.8590 | 0.9371 | 0.8963 | 143 | 0.6552 | 0.8261 | 0.7308 | 23 | 0.7333 | 0.8462 | 0.7857 | 26 | 0.7825 | 0.8702 | 0.8240 | 0.9582 | | 0.0125 | 74.0 | 7844 | 0.2296 | 0.7763 | 0.8194 | 0.7973 | 72 | 0.7111 | 0.8533 | 0.7758 | 75 | 0.8816 | 0.9371 | 0.9085 | 143 | 0.6552 | 0.8261 | 0.7308 | 23 | 0.8 | 0.9231 | 0.8571 | 26 | 0.7958 | 0.8850 | 0.8380 | 0.9592 | | 0.0138 | 75.0 | 7950 | 0.2448 | 0.7922 | 0.8472 | 0.8188 | 72 | 0.7033 | 0.8533 | 0.7711 | 75 | 0.8758 | 0.9371 | 0.9054 | 143 | 0.5484 | 0.7391 | 0.6296 | 23 | 0.7586 | 0.8462 | 0.8 | 26 | 0.7822 | 0.8791 | 0.8278 | 0.9567 | | 0.0138 | 76.0 | 8056 | 0.2369 | 0.75 | 0.8333 | 0.7895 | 72 | 0.7033 | 0.8533 | 0.7711 | 75 | 0.8710 | 0.9441 | 0.9060 | 143 | 0.6552 | 0.8261 | 0.7308 | 23 | 0.75 | 0.8077 | 0.7778 | 26 | 0.7807 | 0.8820 | 0.8283 | 0.9587 | | 0.0155 | 77.0 | 8162 | 0.2269 | 0.7703 | 0.7917 | 0.7808 | 72 | 0.7045 | 0.8267 | 0.7607 | 75 | 0.8701 | 0.9371 | 0.9024 | 143 | 0.6333 | 0.8261 | 0.7170 | 23 | 0.7586 | 0.8462 | 0.8 | 26 | 0.784 | 0.8673 | 0.8235 | 0.9587 | | 0.0121 | 78.0 | 8268 | 0.2355 | 0.7763 | 0.8194 | 0.7973 | 72 | 0.7079 | 0.84 | 0.7683 | 75 | 0.8766 | 0.9441 | 0.9091 | 143 | 0.5667 | 0.7391 | 0.6415 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.7851 | 0.8732 | 0.8268 | 0.9585 | | 0.0134 | 79.0 | 8374 | 0.2424 | 0.7703 | 0.7917 | 0.7808 | 72 | 0.6923 | 0.84 | 0.7590 | 75 | 0.8758 | 0.9371 | 0.9054 | 143 | 0.6207 | 0.7826 | 0.6923 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.784 | 0.8673 | 0.8235 | 0.9580 | | 0.0132 | 80.0 | 8480 | 0.2254 | 0.7532 | 0.8056 | 0.7785 | 72 | 0.7209 | 0.8267 | 0.7702 | 75 | 0.8874 | 0.9371 | 0.9116 | 143 | 0.7308 | 0.8261 | 0.7755 | 23 | 0.8214 | 0.8846 | 0.8519 | 26 | 0.8043 | 0.8732 | 0.8373 | 0.9610 | | 0.0128 | 81.0 | 8586 | 0.2412 | 0.7692 | 0.8333 | 0.8 | 72 | 0.7045 | 0.8267 | 0.7607 | 75 | 0.8581 | 0.9301 | 0.8926 | 143 | 0.6 | 0.7826 | 0.6792 | 23 | 0.8276 | 0.9231 | 0.8727 | 26 | 0.7816 | 0.8761 | 0.8261 | 0.9582 | | 0.0131 | 82.0 | 8692 | 0.2461 | 0.7564 | 0.8194 | 0.7867 | 72 | 0.7045 | 0.8267 | 0.7607 | 75 | 0.8581 | 0.9301 | 0.8926 | 143 | 0.5806 | 0.7826 | 0.6667 | 23 | 0.7333 | 0.8462 | 0.7857 | 26 | 0.7696 | 0.8673 | 0.8155 | 0.9560 | | 0.0127 | 83.0 | 8798 | 0.2331 | 0.7532 | 0.8056 | 0.7785 | 72 | 0.6932 | 0.8133 | 0.7485 | 75 | 0.8701 | 0.9371 
| 0.9024 | 143 | 0.5806 | 0.7826 | 0.6667 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.7751 | 0.8643 | 0.8173 | 0.9575 | | 0.012 | 84.0 | 8904 | 0.2384 | 0.7671 | 0.7778 | 0.7724 | 72 | 0.6966 | 0.8267 | 0.7561 | 75 | 0.8636 | 0.9301 | 0.8956 | 143 | 0.6552 | 0.8261 | 0.7308 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.7828 | 0.8614 | 0.8202 | 0.9577 | | 0.0123 | 85.0 | 9010 | 0.2343 | 0.76 | 0.7917 | 0.7755 | 72 | 0.7209 | 0.8267 | 0.7702 | 75 | 0.8808 | 0.9301 | 0.9048 | 143 | 0.6786 | 0.8261 | 0.7451 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.7962 | 0.8643 | 0.8289 | 0.9592 | | 0.0117 | 86.0 | 9116 | 0.2410 | 0.7568 | 0.7778 | 0.7671 | 72 | 0.6966 | 0.8267 | 0.7561 | 75 | 0.8645 | 0.9371 | 0.8993 | 143 | 0.5312 | 0.7391 | 0.6182 | 23 | 0.7586 | 0.8462 | 0.8 | 26 | 0.7678 | 0.8584 | 0.8106 | 0.9555 | | 0.0126 | 87.0 | 9222 | 0.2339 | 0.7808 | 0.7917 | 0.7862 | 72 | 0.6818 | 0.8 | 0.7362 | 75 | 0.875 | 0.9301 | 0.9017 | 143 | 0.6429 | 0.7826 | 0.7059 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.7859 | 0.8555 | 0.8192 | 0.9582 | | 0.0119 | 88.0 | 9328 | 0.2375 | 0.7838 | 0.8056 | 0.7945 | 72 | 0.7126 | 0.8267 | 0.7654 | 75 | 0.8645 | 0.9371 | 0.8993 | 143 | 0.6429 | 0.7826 | 0.7059 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.7903 | 0.8673 | 0.8270 | 0.9592 | | 0.0116 | 89.0 | 9434 | 0.2396 | 0.7973 | 0.8194 | 0.8082 | 72 | 0.7045 | 0.8267 | 0.7607 | 75 | 0.8758 | 0.9371 | 0.9054 | 143 | 0.6552 | 0.8261 | 0.7308 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.7957 | 0.8732 | 0.8326 | 0.9595 | | 0.0108 | 90.0 | 9540 | 0.2376 | 0.7867 | 0.8194 | 0.8027 | 72 | 0.7143 | 0.8 | 0.7547 | 75 | 0.8590 | 0.9371 | 0.8963 | 143 | 0.6552 | 0.8261 | 0.7308 | 23 | 0.7241 | 0.8077 | 0.7636 | 26 | 0.7855 | 0.8643 | 0.8230 | 0.9597 | | 0.0102 | 91.0 | 9646 | 0.2361 | 0.7945 | 0.8056 | 0.8 | 72 | 0.7176 | 0.8133 | 0.7625 | 75 | 0.8581 | 0.9301 | 0.8926 | 143 | 0.6207 | 0.7826 | 0.6923 | 23 | 0.7586 | 0.8462 | 0.8 | 26 | 0.7871 | 0.8614 | 0.8225 | 0.9587 | | 0.0102 | 92.0 | 9752 | 0.2403 | 0.7838 | 0.8056 | 0.7945 | 72 | 0.7126 | 0.8267 | 0.7654 | 75 | 0.8581 | 0.9301 | 0.8926 | 143 | 0.6786 | 0.8261 | 0.7451 | 23 | 0.7931 | 0.8846 | 0.8364 | 26 | 0.7909 | 0.8702 | 0.8287 | 0.9580 | | 0.0108 | 93.0 | 9858 | 0.2382 | 0.8056 | 0.8056 | 0.8056 | 72 | 0.7126 | 0.8267 | 0.7654 | 75 | 0.8701 | 0.9371 | 0.9024 | 143 | 0.6552 | 0.8261 | 0.7308 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.7973 | 0.8702 | 0.8322 | 0.9600 | | 0.01 | 94.0 | 9964 | 0.2422 | 0.7945 | 0.8056 | 0.8 | 72 | 0.7126 | 0.8267 | 0.7654 | 75 | 0.8535 | 0.9371 | 0.8933 | 143 | 0.6552 | 0.8261 | 0.7308 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.7888 | 0.8702 | 0.8275 | 0.9592 | | 0.0093 | 95.0 | 10070 | 0.2458 | 0.7867 | 0.8194 | 0.8027 | 72 | 0.7126 | 0.8267 | 0.7654 | 75 | 0.8535 | 0.9371 | 0.8933 | 143 | 0.6552 | 0.8261 | 0.7308 | 23 | 0.7333 | 0.8462 | 0.7857 | 26 | 0.7831 | 0.8732 | 0.8257 | 0.9585 | | 0.01 | 96.0 | 10176 | 0.2405 | 0.7808 | 0.7917 | 0.7862 | 72 | 0.7229 | 0.8 | 0.7595 | 75 | 0.8581 | 0.9301 | 0.8926 | 143 | 0.6207 | 0.7826 | 0.6923 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.7880 | 0.8555 | 0.8204 | 0.9587 | | 0.0092 | 97.0 | 10282 | 0.2446 | 0.7973 | 0.8194 | 0.8082 | 72 | 0.7045 | 0.8267 | 0.7607 | 75 | 0.8645 | 0.9371 | 0.8993 | 143 | 0.6207 | 0.7826 | 0.6923 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.7888 | 0.8702 | 0.8275 | 0.9592 | | 0.0102 | 98.0 | 10388 | 0.2452 | 0.7973 | 0.8194 | 0.8082 | 72 | 0.7045 | 0.8267 | 0.7607 | 75 | 0.8590 | 0.9371 | 0.8963 | 143 | 0.6552 | 0.8261 | 0.7308 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.7893 | 0.8732 | 0.8291 | 0.9595 
| | 0.0102 | 99.0 | 10494 | 0.2437 | 0.7973 | 0.8194 | 0.8082 | 72 | 0.7045 | 0.8267 | 0.7607 | 75 | 0.8590 | 0.9371 | 0.8963 | 143 | 0.6552 | 0.8261 | 0.7308 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.7893 | 0.8732 | 0.8291 | 0.9595 | | 0.0098 | 100.0 | 10600 | 0.2436 | 0.7973 | 0.8194 | 0.8082 | 72 | 0.7045 | 0.8267 | 0.7607 | 75 | 0.8590 | 0.9371 | 0.8963 | 143 | 0.6552 | 0.8261 | 0.7308 | 23 | 0.7857 | 0.8462 | 0.8148 | 26 | 0.7893 | 0.8732 | 0.8291 | 0.9595 | ### Framework versions - Transformers 4.39.3 - Pytorch 2.3.0+cu121 - Datasets 2.19.1 - Tokenizers 0.15.2
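The per-entity Precision/Recall/F1 columns above are entity-level metrics of the kind seqeval computes; a minimal sketch with hypothetical BIO tag sequences showing how such numbers arise (the model's actual label names are assumptions):

```python
from seqeval.metrics import classification_report

# Hypothetical gold and predicted tags for one sentence: the PERSON span matches,
# the LOCATION span is missed, so PERSON scores 1.0 and LOCATION scores 0.0.
y_true = [["B-PERSON", "I-PERSON", "O", "B-LOCATION", "O"]]
y_pred = [["B-PERSON", "I-PERSON", "O", "O", "O"]]

print(classification_report(y_true, y_pred))
```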
sj21867/ai_art_exp3_mobilenetv2
sj21867
2024-06-03T23:25:00Z
193
0
transformers
[ "transformers", "tensorboard", "safetensors", "mobilenet_v2", "image-classification", "generated_from_trainer", "base_model:google/mobilenet_v2_1.0_224", "base_model:finetune:google/mobilenet_v2_1.0_224", "license:other", "autotrain_compatible", "endpoints_compatible", "region:us" ]
image-classification
2024-06-03T23:23:07Z
---
license: other
base_model: google/mobilenet_v2_1.0_224
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: ai_art_exp3_mobilenetv2
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# ai_art_exp3_mobilenetv2

This model is a fine-tuned version of [google/mobilenet_v2_1.0_224](https://huggingface.co/google/mobilenet_v2_1.0_224) on an unknown dataset. It achieves the following results on the evaluation set:
- Accuracy: 0.65
- Loss: 0.8813
- Overall Accuracy: 0.65
- Human Accuracy: 0.34
- Ld Accuracy: 0.84
- Sd Accuracy: 0.77

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1

### Training results

| Training Loss | Epoch | Step | Accuracy | Validation Loss | Overall Accuracy | Human Accuracy | Ld Accuracy | Sd Accuracy |
|:-------------:|:-----:|:----:|:--------:|:---------------:|:----------------:|:--------------:|:-----------:|:-----------:|
| 1.0707 | 0.96 | 18 | 0.6333 | 0.8947 | 0.6333 | 0.3426 | 0.8485 | 0.7419 |

### Framework versions

- Transformers 4.41.1
- Pytorch 2.3.0+cu121
- Datasets 2.19.2
- Tokenizers 0.19.1
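Not part of the original card: an inference sketch, assuming the three reported class accuracies (Human/Ld/Sd) correspond to human-made, latent-diffusion, and stable-diffusion classes; the label names and image path are placeholders.

```python
from transformers import pipeline

# MobileNetV2 fine-tuned to flag AI-generated art (repo id from this entry).
detector = pipeline("image-classification", model="sj21867/ai_art_exp3_mobilenetv2")
print(detector("example.png"))  # placeholder path; returns label/score pairs
```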