azdin shr committed
Commit f65cb3b · verified · Parent(s): cc03aa6

Add README

Files changed (1)
  1. README.md +35 -47
README.md CHANGED
@@ -1,62 +1,50 @@
  ---
  base_model: Qwen/Qwen2-VL-7B-Instruct
- library_name: peft
- model_name: adalora_weather_model
  tags:
- - base_model:adapter:Qwen/Qwen2-VL-7B-Instruct
- - lora
- - sft
- - transformers
- - trl
- licence: license
- pipeline_tag: text-generation
  ---

- # Model Card for adalora_weather_model

- This model is a fine-tuned version of [Qwen/Qwen2-VL-7B-Instruct](https://huggingface.co/Qwen/Qwen2-VL-7B-Instruct).
- It has been trained using [TRL](https://github.com/huggingface/trl).

- ## Quick start

- ```python
- from transformers import pipeline
-
- question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
- generator = pipeline("text-generation", model="None", device="cuda")
- output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
- print(output["generated_text"])
- ```
-
- ## Training procedure

- [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/azdinsahir11-university-mohamed-v/qwen2vl-adalora-weather/runs/6q7w54jc)

- This model was trained with SFT.
-
- ### Framework versions
-
- - PEFT 0.16.0
- - TRL: 0.19.1
- - Transformers: 4.53.2
- - Pytorch: 2.6.0+cu124
- - Datasets: 4.0.0
- - Tokenizers: 0.21.2

- ## Citations

- Cite TRL as:
-
- ```bibtex
- @misc{vonwerra2022trl,
-     title        = {{TRL: Transformer Reinforcement Learning}},
-     author       = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallou{\'e}dec},
-     year         = 2020,
-     journal      = {GitHub repository},
-     publisher    = {GitHub},
-     howpublished = {\url{https://github.com/huggingface/trl}}
- }
- ```

  ---
+ license: apache-2.0
  base_model: Qwen/Qwen2-VL-7B-Instruct
  tags:
+ - qwen2-vl
+ - weather
+ - satellite
+ - morocco
+ - meteorology
+ - adalora
+ - fine-tuned
  ---

+ # Qwen2-VL Weather Analysis - AdaLoRA

+ Fine-tuned from Qwen/Qwen2-VL-7B-Instruct with the **AdaLoRA** technique for weather satellite imagery analysis.

+ ## Model Details

+ - **Base Model:** Qwen/Qwen2-VL-7B-Instruct
+ - **Technique:** AdaLoRA
+ - **Domain:** Weather satellite imagery analysis
+ - **Dataset:** Weather satellite images with meteorological metadata

+ ## Usage

+ ```python
+ from transformers import Qwen2VLForConditionalGeneration, Qwen2VLProcessor
+ import torch
+
+ # Load base model
+ model = Qwen2VLForConditionalGeneration.from_pretrained(
+     "Qwen/Qwen2-VL-7B-Instruct",
+     torch_dtype=torch.bfloat16,
+     device_map="auto"
+ )
+ processor = Qwen2VLProcessor.from_pretrained("Qwen/Qwen2-VL-7B-Instruct")
+
+ # Load fine-tuned adapter
+ model.load_adapter("azdin/qwen2-vl-weather-adalora")
+
+ # Use for weather analysis...
+ ```
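
For reference, a minimal inference sketch continuing from the `model` and `processor` loaded above; the image path `satellite.png` and the question text are placeholders, and the message layout follows the standard Qwen2-VL chat-template usage:

```python
from PIL import Image

# Placeholder satellite image; replace with a real file
image = Image.open("satellite.png")

# One user turn containing an image plus a text question
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image"},
            {"type": "text", "text": "Describe the cloud cover and likely weather conditions."},
        ],
    }
]

# Render the chat template, then tokenize text and image together
prompt = processor.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = processor(text=[prompt], images=[image], return_tensors="pt").to(model.device)

# Generate, then decode only the newly produced tokens
output_ids = model.generate(**inputs, max_new_tokens=256)
answer = processor.batch_decode(
    output_ids[:, inputs["input_ids"].shape[1]:], skip_special_tokens=True
)[0]
print(answer)
```

Slicing off the prompt tokens before decoding keeps the printed answer free of the echoed question and image placeholders.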

+ ## Training Details

+ - **Technique:** AdaLoRA
+ - **Quantization:** 4-bit NF4
+ - **Training Data:** Weather satellite imagery with metadata
+ - **Target Modules:** Attention layers
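
A rough sketch of how the settings listed above fit together with `peft` and `bitsandbytes`, assuming the usual Qwen2 attention projection names (`q_proj`, `k_proj`, `v_proj`, `o_proj`); the ranks, step counts, and other hyperparameters are illustrative placeholders, not the values used for this checkpoint:

```python
import torch
from transformers import BitsAndBytesConfig, Qwen2VLForConditionalGeneration
from peft import AdaLoraConfig, get_peft_model, prepare_model_for_kbit_training

# 4-bit NF4 quantization for the frozen base model
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

base = Qwen2VLForConditionalGeneration.from_pretrained(
    "Qwen/Qwen2-VL-7B-Instruct",
    quantization_config=bnb_config,
    device_map="auto",
)
base = prepare_model_for_kbit_training(base)

# AdaLoRA adapter on the attention projections; all numbers are placeholders
adalora_config = AdaLoraConfig(
    init_r=12,          # starting rank per target module
    target_r=4,         # rank budget AdaLoRA prunes down to
    tinit=200,          # warmup steps before rank pruning begins
    tfinal=1000,        # final steps with the budget fixed at target_r
    total_step=2000,    # total optimizer steps planned for training
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, adalora_config)
model.print_trainable_parameters()
```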