Dataset schema (one row per model card):

- `pipeline_tag`: string, 48 classes
- `library_name`: string, 198 classes
- `text`: string, length 1–900k
- `metadata`: string, length 2–438k
- `id`: string, length 5–122
- `last_modified`: null
- `tags`: list, length 1–1.84k
- `sha`: null
- `created_at`: string, length 25
- `arxiv`: list, length 0–201
- `languages`: list, length 0–1.83k
- `tags_str`: string, length 17–9.34k
- `text_str`: string, length 0–389k
- `text_lists`: list, length 0–722
- `processed_texts`: list, length 1–723
text-generation
|
transformers
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/2fa03267661cbc8112b4ef31685e2721.220x220x1.png')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">ABBA</div>
<a href="https://genius.com/artists/abba">
<div style="text-align: center; font-size: 14px;">@abba</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from ABBA.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/abba) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/abba")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3pc6wfre/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on ABBA's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3b7wqd1w) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3b7wqd1w/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/abba')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/abba")
model = AutoModelWithLMHead.from_pretrained("huggingartists/abba")
```
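Equivalently, generation can be driven directly through `generate()`. `AutoModelForCausalLM` is the current replacement for the deprecated `AutoModelWithLMHead`; the sampling settings below are illustrative:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# AutoModelForCausalLM is the non-deprecated equivalent of AutoModelWithLMHead
# for GPT-2-style checkpoints like this one.
tokenizer = AutoTokenizer.from_pretrained("huggingartists/abba")
model = AutoModelForCausalLM.from_pretrained("huggingartists/abba")

inputs = tokenizer("I am", return_tensors="pt")
outputs = model.generate(
    **inputs,
    do_sample=True,          # sample rather than greedy-decode
    max_new_tokens=40,       # illustrative length cap
    top_p=0.95,              # illustrative nucleus-sampling threshold
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```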
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/abba"], "widget": [{"text": "I am"}]}
|
huggingartists/abba
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/abba",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
|
text-generation
|
transformers
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Adele</div>
<a href="https://genius.com/artists/adele">
<div style="text-align: center; font-size: 14px;">@adele</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Adele.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/adele) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/adele")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1yyqw6ss/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Adele's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3qruwjpr) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3qruwjpr/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/adele')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/adele")
model = AutoModelWithLMHead.from_pretrained("huggingartists/adele")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/adele"], "widget": [{"text": "I am"}]}
|
huggingartists/adele
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/adele",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
|
text-generation
|
transformers
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Агата Кристи (Agata Christie)</div>
<a href="https://genius.com/artists/agata-christie">
<div style="text-align: center; font-size: 14px;">@agata-christie</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Агата Кристи (Agata Christie).
The dataset is available [here](https://huggingface.co/datasets/huggingartists/agata-christie) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/agata-christie")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1dtf6ia5/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Агата Кристи (Agata Christie)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/q27fvz1h) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/q27fvz1h/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/agata-christie')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/agata-christie")
model = AutoModelWithLMHead.from_pretrained("huggingartists/agata-christie")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/agata-christie"], "widget": [{"text": "I am"}]}
|
huggingartists/agata-christie
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/agata-christie",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
|
text-generation
|
transformers
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">aikko</div>
<a href="https://genius.com/artists/aikko">
<div style="text-align: center; font-size: 14px;">@aikko</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from aikko.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/aikko) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/aikko")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1cfdpsrg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on aikko's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/oesyn53g) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/oesyn53g/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/aikko')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/aikko")
model = AutoModelWithLMHead.from_pretrained("huggingartists/aikko")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/aikko"], "widget": [{"text": "I am"}]}
|
huggingartists/aikko
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/aikko",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
|
text-generation
|
transformers
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Aimer</div>
<a href="https://genius.com/artists/aimer">
<div style="text-align: center; font-size: 14px;">@aimer</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Aimer.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/aimer) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/aimer")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1rtjxc8q/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Aimer's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2rguugmg) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2rguugmg/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/aimer')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/aimer")
model = AutoModelWithLMHead.from_pretrained("huggingartists/aimer")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/aimer"], "widget": [{"text": "I am"}]}
|
huggingartists/aimer
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/aimer",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
|
text-generation
|
transformers
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Alan Walker</div>
<a href="https://genius.com/artists/alan-walker">
<div style="text-align: center; font-size: 14px;">@alan-walker</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Alan Walker.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/alan-walker) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/alan-walker")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3oyxxcos/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Alan Walker's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/huoxll6m) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/huoxll6m/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/alan-walker')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/alan-walker")
model = AutoModelWithLMHead.from_pretrained("huggingartists/alan-walker")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/alan-walker"], "widget": [{"text": "I am"}]}
|
huggingartists/alan-walker
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/alan-walker",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/alan-walker #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Alan Walker</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@alan-walker</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Alan Walker.
The dataset is available here and can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Alan Walker's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the lyrics present in the training data further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">André 3000</div>
<a href="https://genius.com/artists/andre-3000">
<div style="text-align: center; font-size: 14px;">@andre-3000</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from André 3000.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/andre-3000) and can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/andre-3000")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2hnhboqf/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on André 3000's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1mydp6nh) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1mydp6nh/artifacts) is logged and versioned.
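Fine-tuning a causal language model like GPT-2 minimizes the standard language-modeling objective over the lyrics corpus, i.e. the negative log-likelihood of each token given its prefix:

$$
\mathcal{L}(\theta) = -\sum_{t=1}^{T} \log p_\theta\left(x_t \mid x_{<t}\right)
$$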
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/andre-3000')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated

tokenizer = AutoTokenizer.from_pretrained("huggingartists/andre-3000")
model = AutoModelForCausalLM.from_pretrained("huggingartists/andre-3000")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the lyrics present in the training data further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/andre-3000"], "widget": [{"text": "I am"}]}
|
huggingartists/andre-3000
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/andre-3000",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/andre-3000 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">André 3000</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@andre-3000</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from André 3000.
The dataset is available here and can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on André 3000's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the lyrics present in the training data further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Arash</div>
<a href="https://genius.com/artists/arash">
<div style="text-align: center; font-size: 14px;">@arash</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Arash.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/arash) and can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/arash")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/27u6df87/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Arash's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3eav8xpf) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3eav8xpf/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/arash')
generator("I am", num_return_sequences=5)
```
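Generating several candidate lyrics (`num_return_sequences=5` above) relies on sampling from the model's next-token distribution, typically after nucleus (top-p) filtering. As a toy, framework-free illustration of the idea (the token names and probabilities below are made up; transformers implements this internally):

```python
def top_p_filter(probs, p=0.8):
    """Keep the smallest set of highest-probability tokens whose cumulative
    mass reaches p, then renormalize. Toy sketch for illustration only."""
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    kept, total = {}, 0.0
    for token, prob in ranked:
        kept[token] = prob
        total += prob
        if total >= p:
            break
    norm = sum(kept.values())
    return {token: prob / norm for token, prob in kept.items()}

# Hypothetical next-token distribution after the prompt "I am"
probs = {"free": 0.5, "alone": 0.25, "here": 0.125, "xylophone": 0.125}
# 'here' and 'xylophone' fall outside the nucleus and are never sampled
print(top_p_filter(probs, p=0.75))
```

Lower `p` keeps only the most likely continuations; higher `p` admits rarer tokens and more varied lyrics.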
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated

tokenizer = AutoTokenizer.from_pretrained("huggingartists/arash")
model = AutoModelForCausalLM.from_pretrained("huggingartists/arash")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the lyrics present in the training data further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/arash"], "widget": [{"text": "I am"}]}
|
huggingartists/arash
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/arash",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/arash #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Arash</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@arash</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Arash.
The dataset is available here and can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Arash's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the lyrics present in the training data further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Architects</div>
<a href="https://genius.com/artists/architects">
<div style="text-align: center; font-size: 14px;">@architects</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Architects.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/architects) and can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/architects")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/licizuue/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Architects' lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1a9mrzf8) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1a9mrzf8/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/architects')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated

tokenizer = AutoTokenizer.from_pretrained("huggingartists/architects")
model = AutoModelForCausalLM.from_pretrained("huggingartists/architects")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the lyrics present in the training data further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/architects"], "widget": [{"text": "I am"}]}
|
huggingartists/architects
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/architects",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/architects #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Architects</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@architects</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Architects.
The dataset is available here and can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Architects' lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the lyrics present in the training data further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Arctic Monkeys</div>
<a href="https://genius.com/artists/arctic-monkeys">
<div style="text-align: center; font-size: 14px;">@arctic-monkeys</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Arctic Monkeys.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/arctic-monkeys) and can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/arctic-monkeys")
```
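The id passed to `load_dataset` matches both the model repo name and the Genius handle shown above: roughly, an ASCII slug of the artist name. A small illustrative helper (not part of the huggingartists library) that mimics the convention:

```python
import re
import unicodedata

def artist_slug(name: str) -> str:
    """Approximate the handle used in huggingartists model/dataset ids."""
    # Drop accents so e.g. 'André 3000' becomes 'Andre 3000'
    ascii_name = unicodedata.normalize("NFKD", name).encode("ascii", "ignore").decode("ascii")
    # Lowercase; collapse runs of non-alphanumerics into single hyphens
    return re.sub(r"[^a-z0-9]+", "-", ascii_name.lower()).strip("-")

print(artist_slug("Arctic Monkeys"))  # arctic-monkeys
print(artist_slug("André 3000"))      # andre-3000
```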
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1x4ii6qz/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Arctic Monkeys' lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/bmnqvn53) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/bmnqvn53/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/arctic-monkeys')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated

tokenizer = AutoTokenizer.from_pretrained("huggingartists/arctic-monkeys")
model = AutoModelForCausalLM.from_pretrained("huggingartists/arctic-monkeys")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the lyrics present in the training data further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/arctic-monkeys"], "widget": [{"text": "I am"}]}
|
huggingartists/arctic-monkeys
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/arctic-monkeys",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/arctic-monkeys #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Arctic Monkeys</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@arctic-monkeys</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Arctic Monkeys.
The dataset is available here and can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Arctic Monkeys' lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the lyrics present in the training data further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Ariana Grande</div>
<a href="https://genius.com/artists/ariana-grande">
<div style="text-align: center; font-size: 14px;">@ariana-grande</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Ariana Grande.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/ariana-grande) and can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/ariana-grande")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2nfg7v7i/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Ariana Grande's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3u3sn1bx) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3u3sn1bx/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/ariana-grande')
generator("I am", num_return_sequences=5)
```
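A `temperature` argument can also be passed to control how adventurous the sampling above is; the mechanics can be sketched without transformers (logit values here are illustrative):

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    """Scale logits by 1/temperature before softmax: low T sharpens the
    distribution, high T flattens it."""
    scaled = [l / temperature for l in logits]
    peak = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(s - peak) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.0]                    # hypothetical next-token logits
sharp = softmax_with_temperature(logits, temperature=0.5)
flat = softmax_with_temperature(logits, temperature=2.0)
print(sharp[0] > flat[0])  # True: lower temperature puts more mass on the top token
```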
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated

tokenizer = AutoTokenizer.from_pretrained("huggingartists/ariana-grande")
model = AutoModelForCausalLM.from_pretrained("huggingartists/ariana-grande")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the lyrics present in the training data further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/ariana-grande"], "widget": [{"text": "I am"}]}
|
huggingartists/ariana-grande
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/ariana-grande",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/ariana-grande #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Ariana Grande</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@ariana-grande</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Ariana Grande.
The dataset is available here and can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Ariana Grande's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the lyrics present in the training data further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Ария (Ariya)</div>
<a href="https://genius.com/artists/ariya">
<div style="text-align: center; font-size: 14px;">@ariya</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Ария (Ariya).
The dataset is available [here](https://huggingface.co/datasets/huggingartists/ariya) and can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/ariya")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/uo73s5z1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Ария (Ariya)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/69c1r7ea) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/69c1r7ea/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/ariya')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated

tokenizer = AutoTokenizer.from_pretrained("huggingartists/ariya")
model = AutoModelForCausalLM.from_pretrained("huggingartists/ariya")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the lyrics present in the training data further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/ariya"], "widget": [{"text": "I am"}]}
|
huggingartists/ariya
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/ariya",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/ariya #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Ария (Ariya)</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@ariya</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Ария (Ariya).
The dataset is available here and can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Ария (Ariya)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the lyrics present in the training data further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Armin van Buuren</div>
<a href="https://genius.com/artists/armin-van-buuren">
<div style="text-align: center; font-size: 14px;">@armin-van-buuren</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Armin van Buuren.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/armin-van-buuren).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/armin-van-buuren")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/hrrfc55y/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Armin van Buuren's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3q93rwo8) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3q93rwo8/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/armin-van-buuren')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/armin-van-buuren")
model = AutoModelWithLMHead.from_pretrained("huggingartists/armin-van-buuren")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/armin-van-buuren"], "widget": [{"text": "I am"}]}
|
huggingartists/armin-van-buuren
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/armin-van-buuren",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/armin-van-buuren #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Armin van Buuren</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@armin-van-buuren</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Armin van Buuren.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Armin van Buuren's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
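Both usage code blocks were stripped from this copy of the card; a sketch restoring them from the full card above, with `AutoModelForCausalLM` substituted for the deprecated `AutoModelWithLMHead`:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

# Generate text with the fine-tuned checkpoint
generator = pipeline("text-generation", model="huggingartists/armin-van-buuren")
outputs = generator("I am", num_return_sequences=5)

# Or load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/armin-van-buuren")
model = AutoModelForCausalLM.from_pretrained("huggingartists/armin-van-buuren")
```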
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">As I Lay Dying</div>
<a href="https://genius.com/artists/as-i-lay-dying">
<div style="text-align: center; font-size: 14px;">@as-i-lay-dying</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from As I Lay Dying.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/as-i-lay-dying).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/as-i-lay-dying")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2zq9ub8b/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on As I Lay Dying's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/cjg5ac7f) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/cjg5ac7f/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/as-i-lay-dying')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/as-i-lay-dying")
model = AutoModelWithLMHead.from_pretrained("huggingartists/as-i-lay-dying")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/as-i-lay-dying"], "widget": [{"text": "I am"}]}
|
huggingartists/as-i-lay-dying
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/as-i-lay-dying",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/as-i-lay-dying #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">As I Lay Dying</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@as-i-lay-dying</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from As I Lay Dying.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on As I Lay Dying's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
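Both usage code blocks were stripped from this copy of the card; a sketch restoring them from the full card above, with `AutoModelForCausalLM` substituted for the deprecated `AutoModelWithLMHead`:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

# Generate text with the fine-tuned checkpoint
generator = pipeline("text-generation", model="huggingartists/as-i-lay-dying")
outputs = generator("I am", num_return_sequences=5)

# Or load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/as-i-lay-dying")
model = AutoModelForCausalLM.from_pretrained("huggingartists/as-i-lay-dying")
```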
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">BAKLAN</div>
<a href="https://genius.com/artists/baklan">
<div style="text-align: center; font-size: 14px;">@baklan</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from BAKLAN.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/baklan).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/baklan")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2k5w5yhe/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on BAKLAN's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/28fvfef4) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/28fvfef4/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/baklan')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/baklan")
model = AutoModelWithLMHead.from_pretrained("huggingartists/baklan")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/baklan"], "widget": [{"text": "I am"}]}
|
huggingartists/baklan
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/baklan",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/baklan #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">BAKLAN</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@baklan</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from BAKLAN.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on BAKLAN's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
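Both usage code blocks were stripped from this copy of the card; a sketch restoring them from the full card above, with `AutoModelForCausalLM` substituted for the deprecated `AutoModelWithLMHead`:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

# Generate text with the fine-tuned checkpoint
generator = pipeline("text-generation", model="huggingartists/baklan")
outputs = generator("I am", num_return_sequences=5)

# Or load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/baklan")
model = AutoModelForCausalLM.from_pretrained("huggingartists/baklan")
```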
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Big Baby Tape</div>
<a href="https://genius.com/artists/big-baby-tape">
<div style="text-align: center; font-size: 14px;">@big-baby-tape</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Big Baby Tape.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/big-baby-tape).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/big-baby-tape")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1mu9ki6z/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Big Baby Tape's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/30qklxvh) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/30qklxvh/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/big-baby-tape')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/big-baby-tape")
model = AutoModelWithLMHead.from_pretrained("huggingartists/big-baby-tape")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/big-baby-tape"], "widget": [{"text": "I am"}]}
|
huggingartists/big-baby-tape
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/big-baby-tape",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/big-baby-tape #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Big Baby Tape</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@big-baby-tape</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Big Baby Tape.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Big Baby Tape's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
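Both usage code blocks were stripped from this copy of the card; a sketch restoring them from the full card above, with `AutoModelForCausalLM` substituted for the deprecated `AutoModelWithLMHead`:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

# Generate text with the fine-tuned checkpoint
generator = pipeline("text-generation", model="huggingartists/big-baby-tape")
outputs = generator("I am", num_return_sequences=5)

# Or load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/big-baby-tape")
model = AutoModelForCausalLM.from_pretrained("huggingartists/big-baby-tape")
```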
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Big Russian Boss</div>
<a href="https://genius.com/artists/big-russian-boss">
<div style="text-align: center; font-size: 14px;">@big-russian-boss</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Big Russian Boss.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/big-russian-boss).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/big-russian-boss")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1ju9bqqi/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Big Russian Boss's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3820n7qx) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3820n7qx/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/big-russian-boss')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/big-russian-boss")
model = AutoModelWithLMHead.from_pretrained("huggingartists/big-russian-boss")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/big-russian-boss"], "widget": [{"text": "I am"}]}
|
huggingartists/big-russian-boss
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/big-russian-boss",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/big-russian-boss #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Big Russian Boss</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@big-russian-boss</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Big Russian Boss.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Big Russian Boss's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
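Both usage code blocks were stripped from this copy of the card; a sketch restoring them from the full card above, with `AutoModelForCausalLM` substituted for the deprecated `AutoModelWithLMHead`:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

# Generate text with the fine-tuned checkpoint
generator = pipeline("text-generation", model="huggingartists/big-russian-boss")
outputs = generator("I am", num_return_sequences=5)

# Or load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/big-russian-boss")
model = AutoModelForCausalLM.from_pretrained("huggingartists/big-russian-boss")
```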
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Bill Wurtz</div>
<a href="https://genius.com/artists/bill-wurtz">
<div style="text-align: center; font-size: 14px;">@bill-wurtz</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Bill Wurtz.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/bill-wurtz).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/bill-wurtz")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/27ysbe74/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Bill Wurtz's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2f8oa51l) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2f8oa51l/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/bill-wurtz')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/bill-wurtz")
model = AutoModelWithLMHead.from_pretrained("huggingartists/bill-wurtz")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/bill-wurtz"], "widget": [{"text": "I am"}]}
|
huggingartists/bill-wurtz
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/bill-wurtz",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/bill-wurtz #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Bill Wurtz</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@bill-wurtz</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Bill Wurtz.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Bill Wurtz's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
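Both usage code blocks were stripped from this copy of the card; a sketch restoring them from the full card above, with `AutoModelForCausalLM` substituted for the deprecated `AutoModelWithLMHead`:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

# Generate text with the fine-tuned checkpoint
generator = pipeline("text-generation", model="huggingartists/bill-wurtz")
outputs = generator("I am", num_return_sequences=5)

# Or load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/bill-wurtz")
model = AutoModelForCausalLM.from_pretrained("huggingartists/bill-wurtz")
```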
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Billie Eilish</div>
<a href="https://genius.com/artists/billie-eilish">
<div style="text-align: center; font-size: 14px;">@billie-eilish</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Billie Eilish.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/billie-eilish).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/billie-eilish")
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/billie-eilish")
model = AutoModelWithLMHead.from_pretrained("huggingartists/billie-eilish")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3l1r2mnu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Billie Eilish's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/209kskmi) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/209kskmi/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/billie-eilish')
generator("I am", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/billie-eilish"], "widget": [{"text": "I am"}]}
|
huggingartists/billie-eilish
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/billie-eilish",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/billie-eilish #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Billie Eilish</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@billie-eilish</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Billie Eilish.
Dataset is available here.
And can be used with:
Or with Transformers library:
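For example (`AutoModelWithLMHead` still works but is deprecated in recent `transformers` releases in favor of `AutoModelForCausalLM`):

```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/billie-eilish")
model = AutoModelWithLMHead.from_pretrained("huggingartists/billie-eilish")
```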
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Billie Eilish's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
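For example (assuming the `transformers` library is installed):

```python
from transformers import pipeline

# Build a text-generation pipeline from this fine-tuned checkpoint.
generator = pipeline('text-generation',
                     model='huggingartists/billie-eilish')
generator("I am", num_return_sequences=5)
```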
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Billy Talent</div>
<a href="https://genius.com/artists/billy-talent">
<div style="text-align: center; font-size: 14px;">@billy-talent</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Billy Talent.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/billy-talent).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/billy-talent")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/37amfbe8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Billy Talent's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/pyw6tj9v) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/pyw6tj9v/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/billy-talent')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/billy-talent")
model = AutoModelWithLMHead.from_pretrained("huggingartists/billy-talent")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/billy-talent"], "widget": [{"text": "I am"}]}
|
huggingartists/billy-talent
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/billy-talent",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/billy-talent #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Billy Talent</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@billy-talent</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Billy Talent.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Billy Talent's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
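For example (assuming the `transformers` library is installed):

```python
from transformers import pipeline

# Build a text-generation pipeline from this fine-tuned checkpoint.
generator = pipeline('text-generation',
                     model='huggingartists/billy-talent')
generator("I am", num_return_sequences=5)
```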
Or with Transformers library:
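For example (`AutoModelWithLMHead` still works but is deprecated in recent `transformers` releases in favor of `AutoModelForCausalLM`):

```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/billy-talent")
model = AutoModelWithLMHead.from_pretrained("huggingartists/billy-talent")
```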
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Bladee</div>
<a href="https://genius.com/artists/bladee">
<div style="text-align: center; font-size: 14px;">@bladee</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Bladee.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/bladee).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/bladee")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/326nmhkf/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Bladee's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/28bmutxl) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/28bmutxl/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/bladee')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/bladee")
model = AutoModelWithLMHead.from_pretrained("huggingartists/bladee")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/bladee"], "widget": [{"text": "I am"}]}
|
huggingartists/bladee
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/bladee",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/bladee #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Bladee</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@bladee</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Bladee.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Bladee's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
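For example (assuming the `transformers` library is installed):

```python
from transformers import pipeline

# Build a text-generation pipeline from this fine-tuned checkpoint.
generator = pipeline('text-generation',
                     model='huggingartists/bladee')
generator("I am", num_return_sequences=5)
```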
Or with Transformers library:
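For example (`AutoModelWithLMHead` still works but is deprecated in recent `transformers` releases in favor of `AutoModelForCausalLM`):

```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/bladee")
model = AutoModelWithLMHead.from_pretrained("huggingartists/bladee")
```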
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Bob Dylan</div>
<a href="https://genius.com/artists/bob-dylan">
<div style="text-align: center; font-size: 14px;">@bob-dylan</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Bob Dylan.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/bob-dylan).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/bob-dylan")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3mj0lvel/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Bob Dylan's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2rt8ywgd) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2rt8ywgd/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/bob-dylan')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/bob-dylan")
model = AutoModelWithLMHead.from_pretrained("huggingartists/bob-dylan")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/bob-dylan"], "widget": [{"text": "I am"}]}
|
huggingartists/bob-dylan
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/bob-dylan",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/bob-dylan #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Bob Dylan</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@bob-dylan</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Bob Dylan.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Bob Dylan's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
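For example (assuming the `transformers` library is installed):

```python
from transformers import pipeline

# Build a text-generation pipeline from this fine-tuned checkpoint.
generator = pipeline('text-generation',
                     model='huggingartists/bob-dylan')
generator("I am", num_return_sequences=5)
```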
Or with Transformers library:
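For example (`AutoModelWithLMHead` still works but is deprecated in recent `transformers` releases in favor of `AutoModelForCausalLM`):

```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/bob-dylan")
model = AutoModelWithLMHead.from_pretrained("huggingartists/bob-dylan")
```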
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">BONES</div>
<a href="https://genius.com/artists/bones">
<div style="text-align: center; font-size: 14px;">@bones</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from BONES.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/bones).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/bones")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/26h7sojw/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on BONES's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1yr1mvc2) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1yr1mvc2/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/bones')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/bones")
model = AutoModelWithLMHead.from_pretrained("huggingartists/bones")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/bones"], "widget": [{"text": "I am"}]}
|
huggingartists/bones
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/bones",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/bones #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">BONES</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@bones</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from BONES.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on BONES's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
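For example (assuming the `transformers` library is installed):

```python
from transformers import pipeline

# Build a text-generation pipeline from this fine-tuned checkpoint.
generator = pipeline('text-generation',
                     model='huggingartists/bones')
generator("I am", num_return_sequences=5)
```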
Or with Transformers library:
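For example (`AutoModelWithLMHead` still works but is deprecated in recent `transformers` releases in favor of `AutoModelForCausalLM`):

```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/bones")
model = AutoModelWithLMHead.from_pretrained("huggingartists/bones")
```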
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Борис Гребенщиков (Boris Grebenshikov)</div>
<a href="https://genius.com/artists/boris-grebenshikov">
<div style="text-align: center; font-size: 14px;">@boris-grebenshikov</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Борис Гребенщиков (Boris Grebenshikov).
Dataset is available [here](https://huggingface.co/datasets/huggingartists/boris-grebenshikov).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/boris-grebenshikov")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3nb43gls/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Борис Гребенщиков (Boris Grebenshikov)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/34p8ye7k) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/34p8ye7k/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/boris-grebenshikov')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/boris-grebenshikov")
model = AutoModelWithLMHead.from_pretrained("huggingartists/boris-grebenshikov")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/boris-grebenshikov"], "widget": [{"text": "I am"}]}
|
huggingartists/boris-grebenshikov
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/boris-grebenshikov",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/boris-grebenshikov #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Борис Гребенщиков (Boris Grebenshikov)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@boris-grebenshikov</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Борис Гребенщиков (Boris Grebenshikov).
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Борис Гребенщиков (Boris Grebenshikov)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
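For example (assuming the `transformers` library is installed):

```python
from transformers import pipeline

# Build a text-generation pipeline from this fine-tuned checkpoint.
generator = pipeline('text-generation',
                     model='huggingartists/boris-grebenshikov')
generator("I am", num_return_sequences=5)
```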
Or with Transformers library:
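For example (`AutoModelWithLMHead` still works but is deprecated in recent `transformers` releases in favor of `AutoModelForCausalLM`):

```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/boris-grebenshikov")
model = AutoModelWithLMHead.from_pretrained("huggingartists/boris-grebenshikov")
```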
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Bring Me The Horizon</div>
<a href="https://genius.com/artists/bring-me-the-horizon">
<div style="text-align: center; font-size: 14px;">@bring-me-the-horizon</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Bring Me The Horizon.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/bring-me-the-horizon).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/bring-me-the-horizon")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1e9181i6/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Bring Me The Horizon's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3p7pncir) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3p7pncir/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/bring-me-the-horizon')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/bring-me-the-horizon")
model = AutoModelWithLMHead.from_pretrained("huggingartists/bring-me-the-horizon")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/bring-me-the-horizon"], "widget": [{"text": "I am"}]}
|
huggingartists/bring-me-the-horizon
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/bring-me-the-horizon",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/bring-me-the-horizon #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Bring Me The Horizon</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@bring-me-the-horizon</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Bring Me The Horizon.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Bring Me The Horizon's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Bruce Springsteen</div>
<a href="https://genius.com/artists/bruce-springsteen">
<div style="text-align: center; font-size: 14px;">@bruce-springsteen</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Bruce Springsteen.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/bruce-springsteen).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/bruce-springsteen")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/28yd4w57/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Bruce Springsteen's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/6qq7wbab) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/6qq7wbab/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/bruce-springsteen')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/bruce-springsteen")
model = AutoModelWithLMHead.from_pretrained("huggingartists/bruce-springsteen")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/bruce-springsteen"], "widget": [{"text": "I am"}]}
|
huggingartists/bruce-springsteen
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/bruce-springsteen",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/bruce-springsteen #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Bruce Springsteen</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@bruce-springsteen</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Bruce Springsteen.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Bruce Springsteen's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Bryan Adams</div>
<a href="https://genius.com/artists/bryan-adams">
<div style="text-align: center; font-size: 14px;">@bryan-adams</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Bryan Adams.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/bryan-adams).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/bryan-adams")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/22ksbpsz/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Bryan Adams's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3b0c22fu) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3b0c22fu/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/bryan-adams')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/bryan-adams")
model = AutoModelWithLMHead.from_pretrained("huggingartists/bryan-adams")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/bryan-adams"], "widget": [{"text": "I am"}]}
|
huggingartists/bryan-adams
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/bryan-adams",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/bryan-adams #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Bryan Adams</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@bryan-adams</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Bryan Adams.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Bryan Adams's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Burzum</div>
<a href="https://genius.com/artists/burzum">
<div style="text-align: center; font-size: 14px;">@burzum</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Burzum.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/burzum).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/burzum")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/j34qgww2/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Burzum's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3579mrib) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3579mrib/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/burzum')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/burzum")
model = AutoModelWithLMHead.from_pretrained("huggingartists/burzum")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/burzum"], "widget": [{"text": "I am"}]}
|
huggingartists/burzum
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/burzum",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/burzum #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Burzum</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@burzum</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Burzum.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Burzum's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">BUSHIDO ZHO</div>
<a href="https://genius.com/artists/bushido-zho">
<div style="text-align: center; font-size: 14px;">@bushido-zho</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from BUSHIDO ZHO.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/bushido-zho).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/bushido-zho")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/vtfjc0qi/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on BUSHIDO ZHO's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/iwclgqsj) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/iwclgqsj/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/bushido-zho')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/bushido-zho")
model = AutoModelWithLMHead.from_pretrained("huggingartists/bushido-zho")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/bushido-zho"], "widget": [{"text": "I am"}]}
|
huggingartists/bushido-zho
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/bushido-zho",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/bushido-zho #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">BUSHIDO ZHO</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@bushido-zho</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from BUSHIDO ZHO.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on BUSHIDO ZHO's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Cardi B</div>
<a href="https://genius.com/artists/cardi-b">
<div style="text-align: center; font-size: 14px;">@cardi-b</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Cardi B.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/cardi-b).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/cardi-b")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2794795e/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Cardi B's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1buiv5nf) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1buiv5nf/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/cardi-b')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/cardi-b")
model = AutoModelWithLMHead.from_pretrained("huggingartists/cardi-b")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/cardi-b"], "widget": [{"text": "I am"}]}
|
huggingartists/cardi-b
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/cardi-b",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/cardi-b #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Cardi B</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@cardi-b</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Cardi B.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Cardi B's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Chester Bennington</div>
<a href="https://genius.com/artists/chester-bennington">
<div style="text-align: center; font-size: 14px;">@chester-bennington</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Chester Bennington.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/chester-bennington).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/chester-bennington")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3pq3bd6d/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Chester Bennington's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1sxpshrc) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1sxpshrc/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/chester-bennington')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/chester-bennington")
model = AutoModelWithLMHead.from_pretrained("huggingartists/chester-bennington")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/chester-bennington"], "widget": [{"text": "I am"}]}
|
huggingartists/chester-bennington
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/chester-bennington",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/chester-bennington #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Chester Bennington</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@chester-bennington</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Chester Bennington.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Chester Bennington's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
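The code snippets were stripped here; restoring them from the full card for this artist that appears elsewhere in this file:

```python
from transformers import AutoModelWithLMHead, AutoTokenizer, pipeline

model_id = "huggingartists/chester-bennington"

# Pipeline for text generation
generator = pipeline("text-generation", model=model_id)
generator("I am", num_return_sequences=5)

# Or load the tokenizer and model directly (the original card uses the
# deprecated AutoModelWithLMHead; AutoModelForCausalLM is the modern equivalent)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelWithLMHead.from_pretrained(model_id)
```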
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Cocomelon</div>
<a href="https://genius.com/artists/cocomelon">
<div style="text-align: center; font-size: 14px;">@cocomelon</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Cocomelon.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/cocomelon).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/cocomelon")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1avk18yc/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Cocomelon's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3s0b2uix) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3s0b2uix/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/cocomelon')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/cocomelon")
model = AutoModelWithLMHead.from_pretrained("huggingartists/cocomelon")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/cocomelon"], "widget": [{"text": "I am"}]}
|
huggingartists/cocomelon
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/cocomelon",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/cocomelon #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Cocomelon</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@cocomelon</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Cocomelon.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Cocomelon's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
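The code snippets were stripped here; restoring them from the full card for this artist that appears elsewhere in this file:

```python
from transformers import AutoModelWithLMHead, AutoTokenizer, pipeline

model_id = "huggingartists/cocomelon"

# Pipeline for text generation
generator = pipeline("text-generation", model=model_id)
generator("I am", num_return_sequences=5)

# Or load the tokenizer and model directly (the original card uses the
# deprecated AutoModelWithLMHead; AutoModelForCausalLM is the modern equivalent)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelWithLMHead.from_pretrained(model_id)
```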
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Coldplay</div>
<a href="https://genius.com/artists/coldplay">
<div style="text-align: center; font-size: 14px;">@coldplay</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Coldplay.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/coldplay).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/coldplay")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/34tqcy7u/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Coldplay's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/23h7o09h) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/23h7o09h/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/coldplay')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/coldplay")
model = AutoModelWithLMHead.from_pretrained("huggingartists/coldplay")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/coldplay"], "widget": [{"text": "I am"}]}
|
huggingartists/coldplay
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/coldplay",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/coldplay #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Coldplay</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@coldplay</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Coldplay.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Coldplay's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
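The code snippets were stripped here; restoring them from the full card for this artist that appears elsewhere in this file:

```python
from transformers import AutoModelWithLMHead, AutoTokenizer, pipeline

model_id = "huggingartists/coldplay"

# Pipeline for text generation
generator = pipeline("text-generation", model=model_id)
generator("I am", num_return_sequences=5)

# Or load the tokenizer and model directly (the original card uses the
# deprecated AutoModelWithLMHead; AutoModelForCausalLM is the modern equivalent)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelWithLMHead.from_pretrained(model_id)
```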
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">DaBaby</div>
<a href="https://genius.com/artists/dababy">
<div style="text-align: center; font-size: 14px;">@dababy</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from DaBaby.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/dababy).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/dababy")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/qnkumvdw/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on DaBaby's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/24o367up) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/24o367up/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/dababy')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/dababy")
model = AutoModelWithLMHead.from_pretrained("huggingartists/dababy")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/dababy"], "widget": [{"text": "I am"}]}
|
huggingartists/dababy
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/dababy",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/dababy #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">DaBaby</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@dababy</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from DaBaby.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on DaBaby's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
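The code snippets were stripped here; restoring them from the full card for this artist that appears elsewhere in this file:

```python
from transformers import AutoModelWithLMHead, AutoTokenizer, pipeline

model_id = "huggingartists/dababy"

# Pipeline for text generation
generator = pipeline("text-generation", model=model_id)
generator("I am", num_return_sequences=5)

# Or load the tokenizer and model directly (the original card uses the
# deprecated AutoModelWithLMHead; AutoModelForCausalLM is the modern equivalent)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelWithLMHead.from_pretrained(model_id)
```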
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">DDT</div>
<a href="https://genius.com/artists/ddt">
<div style="text-align: center; font-size: 14px;">@ddt</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from DDT.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/ddt).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/ddt")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2t9xnx5c/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on DDT's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/33zphjtk) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/33zphjtk/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/ddt')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/ddt")
model = AutoModelWithLMHead.from_pretrained("huggingartists/ddt")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/ddt"], "widget": [{"text": "I am"}]}
|
huggingartists/ddt
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/ddt",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/ddt #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">DDT</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@ddt</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from DDT.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on DDT's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
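The code snippets were stripped here; restoring them from the full card for this artist that appears elsewhere in this file:

```python
from transformers import AutoModelWithLMHead, AutoTokenizer, pipeline

model_id = "huggingartists/ddt"

# Pipeline for text generation
generator = pipeline("text-generation", model=model_id)
generator("I am", num_return_sequences=5)

# Or load the tokenizer and model directly (the original card uses the
# deprecated AutoModelWithLMHead; AutoModelForCausalLM is the modern equivalent)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelWithLMHead.from_pretrained(model_id)
```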
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Death Grips</div>
<a href="https://genius.com/artists/death-grips">
<div style="text-align: center; font-size: 14px;">@death-grips</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Death Grips.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/death-grips).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/death-grips")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2hmeenl7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Death Grips's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/226ak5bw) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/226ak5bw/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/death-grips')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/death-grips")
model = AutoModelWithLMHead.from_pretrained("huggingartists/death-grips")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/death-grips"], "widget": [{"text": "I am"}]}
|
huggingartists/death-grips
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/death-grips",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/death-grips #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Death Grips</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@death-grips</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Death Grips.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Death Grips's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
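The code snippets were stripped here; restoring them from the full card for this artist that appears elsewhere in this file:

```python
from transformers import AutoModelWithLMHead, AutoTokenizer, pipeline

model_id = "huggingartists/death-grips"

# Pipeline for text generation
generator = pipeline("text-generation", model=model_id)
generator("I am", num_return_sequences=5)

# Or load the tokenizer and model directly (the original card uses the
# deprecated AutoModelWithLMHead; AutoModelForCausalLM is the modern equivalent)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelWithLMHead.from_pretrained(model_id)
```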
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Deep Purple</div>
<a href="https://genius.com/artists/deep-purple">
<div style="text-align: center; font-size: 14px;">@deep-purple</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Deep Purple.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/deep-purple).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/deep-purple")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2sybcajo/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Deep Purple's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3evu15qv) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3evu15qv/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/deep-purple')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/deep-purple")
model = AutoModelWithLMHead.from_pretrained("huggingartists/deep-purple")
```
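The `text-generation` pipeline samples each next token from the model's output distribution. The sampling step itself can be sketched in pure Python (temperature scaling plus top-k filtering over raw logits; the function name and defaults are illustrative, not the pipeline's actual internals):

```python
import math
import random

def sample_next_token(logits, temperature=0.8, top_k=5, rng=None):
    """Sample an index from `logits` after temperature scaling and top-k filtering."""
    rng = rng or random.Random()
    scaled = [l / temperature for l in logits]
    # Keep only the k highest-scoring candidate tokens.
    top = sorted(range(len(scaled)), key=lambda i: scaled[i], reverse=True)[:top_k]
    # Softmax over the survivors (shift by the max for numerical stability).
    m = max(scaled[i] for i in top)
    weights = [math.exp(scaled[i] - m) for i in top]
    return rng.choices(top, weights=weights, k=1)[0]
```

Lower temperatures and smaller `top_k` make generations more repetitive but more coherent; the pipeline exposes the same trade-off through its `temperature` and `top_k` arguments.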
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/deep-purple"], "widget": [{"text": "I am"}]}
|
huggingartists/deep-purple
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/deep-purple",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/deep-purple #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Deep Purple</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@deep-purple</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Deep Purple.
The dataset is available here and can be loaded with the `datasets` library.
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Deep Purple's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">DenDerty</div>
<a href="https://genius.com/artists/denderty">
<div style="text-align: center; font-size: 14px;">@denderty</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from DenDerty.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/denderty) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/denderty")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/gu1nyrga/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on DenDerty's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2hx5b1gk) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2hx5b1gk/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/denderty')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library directly (`AutoModelWithLMHead` is deprecated in recent Transformers releases; `AutoModelForCausalLM` is the current equivalent):
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("huggingartists/denderty")
model = AutoModelForCausalLM.from_pretrained("huggingartists/denderty")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/denderty"], "widget": [{"text": "I am"}]}
|
huggingartists/denderty
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/denderty",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/denderty #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">DenDerty</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@denderty</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from DenDerty.
The dataset is available here and can be loaded with the `datasets` library.
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on DenDerty's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">DJ Artem Artemov</div>
<a href="https://genius.com/artists/dj-artem-artemov">
<div style="text-align: center; font-size: 14px;">@dj-artem-artemov</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from DJ Artem Artemov.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/dj-artem-artemov) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/dj-artem-artemov")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2yaf9hon/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on DJ Artem Artemov's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/crwya5am) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/crwya5am/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/dj-artem-artemov')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library directly (`AutoModelWithLMHead` is deprecated in recent Transformers releases; `AutoModelForCausalLM` is the current equivalent):
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("huggingartists/dj-artem-artemov")
model = AutoModelForCausalLM.from_pretrained("huggingartists/dj-artem-artemov")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/dj-artem-artemov"], "widget": [{"text": "I am"}]}
|
huggingartists/dj-artem-artemov
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/dj-artem-artemov",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/dj-artem-artemov #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">DJ Artem Artemov</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@dj-artem-artemov</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from DJ Artem Artemov.
The dataset is available here and can be loaded with the `datasets` library.
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on DJ Artem Artemov's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Doja Cat</div>
<a href="https://genius.com/artists/doja-cat">
<div style="text-align: center; font-size: 14px;">@doja-cat</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Doja Cat.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/doja-cat) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/doja-cat")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1qxclk1g/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Doja Cat's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2lqvdntl) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2lqvdntl/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/doja-cat')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library directly (`AutoModelWithLMHead` is deprecated in recent Transformers releases; `AutoModelForCausalLM` is the current equivalent):
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("huggingartists/doja-cat")
model = AutoModelForCausalLM.from_pretrained("huggingartists/doja-cat")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/doja-cat"], "widget": [{"text": "I am"}]}
|
huggingartists/doja-cat
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/doja-cat",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/doja-cat #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Doja Cat</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@doja-cat</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Doja Cat.
The dataset is available here and can be loaded with the `datasets` library.
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Doja Cat's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Drake</div>
<a href="https://genius.com/artists/drake">
<div style="text-align: center; font-size: 14px;">@drake</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Drake.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/drake) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/drake")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/l3lz2q80/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Drake's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/033yz8al) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/033yz8al/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/drake')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library directly (`AutoModelWithLMHead` is deprecated in recent Transformers releases; `AutoModelForCausalLM` is the current equivalent):
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("huggingartists/drake")
model = AutoModelForCausalLM.from_pretrained("huggingartists/drake")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/drake"], "widget": [{"text": "I am"}]}
|
huggingartists/drake
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/drake",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/drake #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Drake</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@drake</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Drake.
The dataset is available here and can be loaded with the `datasets` library.
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Drake's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Dua Lipa</div>
<a href="https://genius.com/artists/dua-lipa">
<div style="text-align: center; font-size: 14px;">@dua-lipa</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Dua Lipa.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/dua-lipa) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/dua-lipa")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2wxz1liw/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Dua Lipa's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3uj930yj) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3uj930yj/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/dua-lipa')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library directly (`AutoModelWithLMHead` is deprecated in recent Transformers releases; `AutoModelForCausalLM` is the current equivalent):
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("huggingartists/dua-lipa")
model = AutoModelForCausalLM.from_pretrained("huggingartists/dua-lipa")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/dua-lipa"], "widget": [{"text": "I am"}]}
|
huggingartists/dua-lipa
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/dua-lipa",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/dua-lipa #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Dua Lipa</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@dua-lipa</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Dua Lipa.
The dataset is available here and can be loaded with the `datasets` library.
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Dua Lipa's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Duran Duran</div>
<a href="https://genius.com/artists/duran-duran">
<div style="text-align: center; font-size: 14px;">@duran-duran</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Duran Duran.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/duran-duran) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/duran-duran")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/dy133fuf/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Duran Duran's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/386u7cc3) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/386u7cc3/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/duran-duran')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library directly (`AutoModelWithLMHead` is deprecated in recent Transformers releases; `AutoModelForCausalLM` is the current equivalent):
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("huggingartists/duran-duran")
model = AutoModelForCausalLM.from_pretrained("huggingartists/duran-duran")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/duran-duran"], "widget": [{"text": "I am"}]}
|
huggingartists/duran-duran
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/duran-duran",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/duran-duran #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Duran Duran</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@duran-duran</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Duran Duran.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Duran Duran's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Джизус (Dzhizus)</div>
<a href="https://genius.com/artists/dzhizus">
<div style="text-align: center; font-size: 14px;">@dzhizus</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Джизус (Dzhizus).
Dataset is available [here](https://huggingface.co/datasets/huggingartists/dzhizus).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/dzhizus")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/35paacn1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Джизус (Dzhizus)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1ug3yebo) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1ug3yebo/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/dzhizus')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("huggingartists/dzhizus")
model = AutoModelForCausalLM.from_pretrained("huggingartists/dzhizus")
```
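When requesting several candidates (e.g. `num_return_sequences=5` in the pipeline call above), it helps to line them up for comparison. A minimal, purely illustrative helper (not part of any library):

```python
def format_candidates(texts):
    """Number generated candidates for quick side-by-side reading."""
    return "\n".join(f"[{i + 1}] {t}" for i, t in enumerate(texts))

print(format_candidates(["first take", "second take"]))
```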
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/dzhizus"], "widget": [{"text": "I am"}]}
|
huggingartists/dzhizus
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/dzhizus",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/dzhizus #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Джизус (Dzhizus)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@dzhizus</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Джизус (Dzhizus).
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Джизус (Dzhizus)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Ed Sheeran</div>
<a href="https://genius.com/artists/ed-sheeran">
<div style="text-align: center; font-size: 14px;">@ed-sheeran</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Ed Sheeran.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/ed-sheeran).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/ed-sheeran")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3nju68bo/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Ed Sheeran's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3hu7zc76) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3hu7zc76/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/ed-sheeran')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("huggingartists/ed-sheeran")
model = AutoModelForCausalLM.from_pretrained("huggingartists/ed-sheeran")
```
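Sampled lyrics often stop mid-line when the token budget runs out. A hedged post-processing sketch (the helper is illustrative, not from the library) that trims generated text back to the last complete line:

```python
def trim_to_last_line(text):
    """Cut generated lyrics at the last newline so a sample does not
    end mid-phrase; keep single-line outputs unchanged."""
    cut = text.rfind("\n")
    return text[:cut] if cut != -1 else text

sample = "Castle on the hill\nDriving at ninety\nhalf a wo"
print(repr(trim_to_last_line(sample)))  # 'Castle on the hill\nDriving at ninety'
```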
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/ed-sheeran"], "widget": [{"text": "I am"}]}
|
huggingartists/ed-sheeran
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/ed-sheeran",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/ed-sheeran #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Ed Sheeran</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@ed-sheeran</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Ed Sheeran.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Ed Sheeran's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">ЕГОР КРИД (EGOR KREED)</div>
<a href="https://genius.com/artists/egor-kreed">
<div style="text-align: center; font-size: 14px;">@egor-kreed</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from ЕГОР КРИД (EGOR KREED).
Dataset is available [here](https://huggingface.co/datasets/huggingartists/egor-kreed).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/egor-kreed")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3l7nf6hj/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on ЕГОР КРИД (EGOR KREED)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1mtfkshl) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1mtfkshl/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/egor-kreed')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("huggingartists/egor-kreed")
model = AutoModelForCausalLM.from_pretrained("huggingartists/egor-kreed")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/egor-kreed"], "widget": [{"text": "I am"}]}
|
huggingartists/egor-kreed
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/egor-kreed",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/egor-kreed #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">ЕГОР КРИД (EGOR KREED)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@egor-kreed</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from ЕГОР КРИД (EGOR KREED).
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on ЕГОР КРИД (EGOR KREED)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Егор Летов (Egor Letov)</div>
<a href="https://genius.com/artists/egor-letov">
<div style="text-align: center; font-size: 14px;">@egor-letov</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Егор Летов (Egor Letov).
Dataset is available [here](https://huggingface.co/datasets/huggingartists/egor-letov).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/egor-letov")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1omrcegx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Егор Летов (Egor Letov)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3lk60u9h) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3lk60u9h/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/egor-letov')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("huggingartists/egor-letov")
model = AutoModelForCausalLM.from_pretrained("huggingartists/egor-letov")
```
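GPT-2-based models such as this one have a fixed 1024-token context window, so a long prompt leaves less room for newly generated lyrics. A hedged pre-flight check (names and defaults are illustrative, not part of Transformers):

```python
MAX_CONTEXT = 1024  # context window of the underlying GPT-2

def room_for_generation(prompt_token_ids, max_new_tokens=200):
    """Return True if the prompt plus the requested number of new
    tokens still fits in the model's context window."""
    return len(prompt_token_ids) + max_new_tokens <= MAX_CONTEXT

# With a real tokenizer you would pass tokenizer.encode(prompt);
# here a dummy list of token ids stands in:
print(room_for_generation(list(range(800)), max_new_tokens=200))  # True
print(room_for_generation(list(range(900)), max_new_tokens=200))  # False
```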
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/egor-letov"], "widget": [{"text": "I am"}]}
|
huggingartists/egor-letov
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/egor-letov",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/egor-letov #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Егор Летов (Egor Letov)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@egor-letov</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Егор Летов (Egor Letov).
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Егор Летов (Egor Letov)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Elton John</div>
<a href="https://genius.com/artists/elton-john">
<div style="text-align: center; font-size: 14px;">@elton-john</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Elton John.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/elton-john).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/elton-john")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/188xpm2n/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Elton John's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1rgstntu) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1rgstntu/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/elton-john')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("huggingartists/elton-john")
model = AutoModelForCausalLM.from_pretrained("huggingartists/elton-john")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/elton-john"], "widget": [{"text": "I am"}]}
|
huggingartists/elton-john
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/elton-john",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/elton-john #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Elton John</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@elton-john</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Elton John.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Elton John's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Eminem</div>
<a href="https://genius.com/artists/eminem">
<div style="text-align: center; font-size: 14px;">@eminem</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Eminem.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/eminem).
It can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/eminem")
```
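Once loaded, it can help to sanity-check the corpus before training. The helper below is a small sketch (not part of the huggingartists pipeline) that computes basic statistics for a list of lyric strings:

```python
def corpus_stats(texts):
    """Return (songs, non-empty lines, unique lowercase words) for a lyric corpus."""
    lines = [line for t in texts for line in t.splitlines() if line.strip()]
    words = {w.lower() for line in lines for w in line.split()}
    return len(texts), len(lines), len(words)

# Toy example so the sketch runs without downloading the dataset:
print(corpus_stats(["Hello world\nHello again", "Second song"]))  # -> (2, 3, 5)
```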
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/391kfg7f/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Eminem's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1361uz9o) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1361uz9o/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/eminem')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/eminem")
model = AutoModelWithLMHead.from_pretrained("huggingartists/eminem")
```
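Both `pipeline` and `model.generate` accept a `top_p` argument for nucleus sampling. As a rough illustration of what that filter does (a pure-Python sketch, not the library's implementation):

```python
def top_p_filter(probs, p=0.9):
    """Keep the smallest set of highest-probability tokens whose cumulative
    probability reaches p, then renormalize (the nucleus-sampling filter)."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cumulative = [], 0.0
    for i in order:
        kept.append(i)
        cumulative += probs[i]
        if cumulative >= p:
            break
    mass = sum(probs[i] for i in kept)
    return {i: probs[i] / mass for i in kept}

# With p=0.8, only the two most likely tokens survive:
print(top_p_filter([0.5, 0.3, 0.15, 0.05], p=0.8))
```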
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/eminem"], "widget": [{"text": "I am"}]}
|
huggingartists/eminem
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/eminem",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/eminem #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Eminem</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@eminem</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Eminem.
The dataset is available here.
It can be loaded with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Eminem's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Enigma</div>
<a href="https://genius.com/artists/enigma">
<div style="text-align: center; font-size: 14px;">@enigma</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Enigma.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/enigma).
It can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/enigma")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/8bx90lw6/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Enigma's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1c1t20ji) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1c1t20ji/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/enigma')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/enigma")
model = AutoModelWithLMHead.from_pretrained("huggingartists/enigma")
```
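Generation quality also depends on the `temperature` setting: lower values sharpen the next-token distribution, higher values flatten it. A minimal sketch of the underlying math:

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    """Convert logits to probabilities at a given sampling temperature."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

cool = softmax_with_temperature([2.0, 1.0, 0.1], temperature=0.5)
warm = softmax_with_temperature([2.0, 1.0, 0.1], temperature=2.0)
print(cool[0] > warm[0])  # lower temperature concentrates mass on the top token
```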
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/enigma"], "widget": [{"text": "I am"}]}
|
huggingartists/enigma
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/enigma",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/enigma #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Enigma</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@enigma</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Enigma.
The dataset is available here.
It can be loaded with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Enigma's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Enya</div>
<a href="https://genius.com/artists/enya">
<div style="text-align: center; font-size: 14px;">@enya</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Enya.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/enya).
It can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/enya")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/16cuy8yb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Enya's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/il8ldqo8) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/il8ldqo8/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/enya')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/enya")
model = AutoModelWithLMHead.from_pretrained("huggingartists/enya")
```
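Sampled lyrics often loop on a single line, and chorus-heavy training data makes that failure mode especially likely. A small post-processing sketch (not part of the original huggingartists pipeline) that collapses consecutive repeats:

```python
def collapse_repeats(generated):
    """Drop consecutive duplicate lines and blank lines from generated lyrics."""
    cleaned, prev = [], None
    for line in generated.splitlines():
        if line.strip() and line != prev:
            cleaned.append(line)
        prev = line
    return "\n".join(cleaned)

print(collapse_repeats("sail away\nsail away\nsail away\nonly time"))
```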
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/enya"], "widget": [{"text": "I am"}]}
|
huggingartists/enya
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/enya",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/enya #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Enya</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@enya</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Enya.
The dataset is available here.
It can be loaded with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Enya's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Epic Rap Battles of History</div>
<a href="https://genius.com/artists/epic-rap-battles-of-history">
<div style="text-align: center; font-size: 14px;">@epic-rap-battles-of-history</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Epic Rap Battles of History.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/epic-rap-battles-of-history).
It can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/epic-rap-battles-of-history")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/ujomrrjb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Epic Rap Battles of History's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1s03lfls) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1s03lfls/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/epic-rap-battles-of-history')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/epic-rap-battles-of-history")
model = AutoModelWithLMHead.from_pretrained("huggingartists/epic-rap-battles-of-history")
```
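GPT-2 has a 1024-token context window, and battle-style prompts can get long. A rough sketch that keeps only the most recent words (the real limit counts BPE tokens, so whitespace words are only an approximation):

```python
MAX_TOKENS = 1024  # GPT-2's context window, in BPE tokens

def truncate_prompt(prompt, limit=MAX_TOKENS):
    """Keep roughly the last `limit` tokens of a prompt, approximating
    tokens by whitespace-separated words."""
    words = prompt.split()
    return " ".join(words[-limit:])

print(truncate_prompt("begin middle end", limit=2))  # -> "middle end"
```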
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/epic-rap-battles-of-history"], "widget": [{"text": "I am"}]}
|
huggingartists/epic-rap-battles-of-history
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/epic-rap-battles-of-history",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/epic-rap-battles-of-history #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Epic Rap Battles of History</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@epic-rap-battles-of-history</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Epic Rap Battles of History.
The dataset is available here.
It can be loaded with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Epic Rap Battles of History's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">FACE</div>
<a href="https://genius.com/artists/face">
<div style="text-align: center; font-size: 14px;">@face</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from FACE.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/face).
It can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/face")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/xtozoqtm/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on FACE's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/knkqp5iy) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/knkqp5iy/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/face')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/face")
model = AutoModelWithLMHead.from_pretrained("huggingartists/face")
```
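The text-generation pipeline shown above returns a list with one dict per sequence, each holding a `generated_text` key. A mocked example of unpacking that shape (mocked so the sketch runs without downloading the model):

```python
# Shape of a text-generation pipeline result for num_return_sequences=2
# (the lyric strings here are invented placeholders):
result = [
    {"generated_text": "I am awake in the dark"},
    {"generated_text": "I am a shadow on the wall"},
]

texts = [r["generated_text"] for r in result]
print(len(texts))  # -> 2
```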
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/face"], "widget": [{"text": "I am"}]}
|
huggingartists/face
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/face",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/face #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">FACE</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@face</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from FACE.
The dataset is available here.
It can be loaded with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on FACE's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Fascinoma</div>
<a href="https://genius.com/artists/fascinoma">
<div style="text-align: center; font-size: 14px;">@fascinoma</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Fascinoma.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/fascinoma).
It can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/fascinoma")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/za989b3u/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Fascinoma's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/kklye04t) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/kklye04t/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/fascinoma')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/fascinoma")
model = AutoModelWithLMHead.from_pretrained("huggingartists/fascinoma")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/fascinoma"], "widget": [{"text": "I am"}]}
|
huggingartists/fascinoma
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/fascinoma",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/fascinoma #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Fascinoma</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@fascinoma</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Fascinoma.
The dataset is available here.
It can be loaded with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Fascinoma's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Fear Factory</div>
<a href="https://genius.com/artists/fear-factory">
<div style="text-align: center; font-size: 14px;">@fear-factory</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Fear Factory.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/fear-factory).
It can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/fear-factory")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/24xjxpf5/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Fear Factory's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3gju7udi) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3gju7udi/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/fear-factory')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated
tokenizer = AutoTokenizer.from_pretrained("huggingartists/fear-factory")
model = AutoModelForCausalLM.from_pretrained("huggingartists/fear-factory")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/fear-factory"], "widget": [{"text": "I am"}]}
|
huggingartists/fear-factory
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/fear-factory",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/fear-factory #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Fear Factory</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@fear-factory</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Fear Factory.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Fear Factory's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
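Both usage paths above can be sketched as follows (a minimal example; assumes the `transformers` library is installed and downloads the model from the Hub on first use):

```python
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# High-level pipeline API for text generation
generator = pipeline('text-generation', model='huggingartists/fear-factory')
# do_sample=True is required when requesting multiple return sequences
outputs = generator("I am", num_return_sequences=5, do_sample=True)

# Or load the tokenizer and model directly
# (AutoModelForCausalLM replaces the deprecated AutoModelWithLMHead)
tokenizer = AutoTokenizer.from_pretrained("huggingartists/fear-factory")
model = AutoModelForCausalLM.from_pretrained("huggingartists/fear-factory")
```

Each entry in `outputs` is a dict with a `generated_text` field.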
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Florence + The Machine</div>
<a href="https://genius.com/artists/florence-the-machine">
<div style="text-align: center; font-size: 14px;">@florence-the-machine</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Florence + The Machine.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/florence-the-machine).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/florence-the-machine")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/icjt5evm/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Florence + The Machine's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1zfb9y24) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1zfb9y24/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/florence-the-machine')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated
tokenizer = AutoTokenizer.from_pretrained("huggingartists/florence-the-machine")
model = AutoModelForCausalLM.from_pretrained("huggingartists/florence-the-machine")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/florence-the-machine"], "widget": [{"text": "I am"}]}
|
huggingartists/florence-the-machine
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/florence-the-machine",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/florence-the-machine #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Florence + The Machine</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@florence-the-machine</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Florence + The Machine.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Florence + The Machine's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
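Both usage paths above can be sketched as follows (a minimal example; assumes the `transformers` library is installed and downloads the model from the Hub on first use):

```python
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# High-level pipeline API for text generation
generator = pipeline('text-generation', model='huggingartists/florence-the-machine')
# do_sample=True is required when requesting multiple return sequences
outputs = generator("I am", num_return_sequences=5, do_sample=True)

# Or load the tokenizer and model directly
# (AutoModelForCausalLM replaces the deprecated AutoModelWithLMHead)
tokenizer = AutoTokenizer.from_pretrained("huggingartists/florence-the-machine")
model = AutoModelForCausalLM.from_pretrained("huggingartists/florence-the-machine")
```

Each entry in `outputs` is a dict with a `generated_text` field.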
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Ghost</div>
<a href="https://genius.com/artists/ghost">
<div style="text-align: center; font-size: 14px;">@ghost</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Ghost.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/ghost).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/ghost")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1n8515nl/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Ghost's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2qimq3aa) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2qimq3aa/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/ghost')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated
tokenizer = AutoTokenizer.from_pretrained("huggingartists/ghost")
model = AutoModelForCausalLM.from_pretrained("huggingartists/ghost")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/ghost"], "widget": [{"text": "I am"}]}
|
huggingartists/ghost
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/ghost",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/ghost #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Ghost</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@ghost</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Ghost.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Ghost's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
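Both usage paths above can be sketched as follows (a minimal example; assumes the `transformers` library is installed and downloads the model from the Hub on first use):

```python
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# High-level pipeline API for text generation
generator = pipeline('text-generation', model='huggingartists/ghost')
# do_sample=True is required when requesting multiple return sequences
outputs = generator("I am", num_return_sequences=5, do_sample=True)

# Or load the tokenizer and model directly
# (AutoModelForCausalLM replaces the deprecated AutoModelWithLMHead)
tokenizer = AutoTokenizer.from_pretrained("huggingartists/ghost")
model = AutoModelForCausalLM.from_pretrained("huggingartists/ghost")
```

Each entry in `outputs` is a dict with a `generated_text` field.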
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Ghostemane</div>
<a href="https://genius.com/artists/ghostemane">
<div style="text-align: center; font-size: 14px;">@ghostemane</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Ghostemane.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/ghostemane).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/ghostemane")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1ou29taa/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Ghostemane's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/futdflju) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/futdflju/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/ghostemane')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated
tokenizer = AutoTokenizer.from_pretrained("huggingartists/ghostemane")
model = AutoModelForCausalLM.from_pretrained("huggingartists/ghostemane")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/ghostemane"], "widget": [{"text": "I am"}]}
|
huggingartists/ghostemane
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/ghostemane",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/ghostemane #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Ghostemane</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@ghostemane</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Ghostemane.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Ghostemane's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
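Both usage paths above can be sketched as follows (a minimal example; assumes the `transformers` library is installed and downloads the model from the Hub on first use):

```python
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# High-level pipeline API for text generation
generator = pipeline('text-generation', model='huggingartists/ghostemane')
# do_sample=True is required when requesting multiple return sequences
outputs = generator("I am", num_return_sequences=5, do_sample=True)

# Or load the tokenizer and model directly
# (AutoModelForCausalLM replaces the deprecated AutoModelWithLMHead)
tokenizer = AutoTokenizer.from_pretrained("huggingartists/ghostemane")
model = AutoModelForCausalLM.from_pretrained("huggingartists/ghostemane")
```

Each entry in `outputs` is a dict with a `generated_text` field.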
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">gizmo</div>
<a href="https://genius.com/artists/gizmo">
<div style="text-align: center; font-size: 14px;">@gizmo</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from gizmo.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/gizmo).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/gizmo")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3lolgugy/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on gizmo's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/31nxia6i) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/31nxia6i/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/gizmo')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated
tokenizer = AutoTokenizer.from_pretrained("huggingartists/gizmo")
model = AutoModelForCausalLM.from_pretrained("huggingartists/gizmo")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/gizmo"], "widget": [{"text": "I am"}]}
|
huggingartists/gizmo
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/gizmo",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/gizmo #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">gizmo</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@gizmo</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from gizmo.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on gizmo's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
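Both usage paths above can be sketched as follows (a minimal example; assumes the `transformers` library is installed and downloads the model from the Hub on first use):

```python
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# High-level pipeline API for text generation
generator = pipeline('text-generation', model='huggingartists/gizmo')
# do_sample=True is required when requesting multiple return sequences
outputs = generator("I am", num_return_sequences=5, do_sample=True)

# Or load the tokenizer and model directly
# (AutoModelForCausalLM replaces the deprecated AutoModelWithLMHead)
tokenizer = AutoTokenizer.from_pretrained("huggingartists/gizmo")
model = AutoModelForCausalLM.from_pretrained("huggingartists/gizmo")
```

Each entry in `outputs` is a dict with a `generated_text` field.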
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Gorillaz</div>
<a href="https://genius.com/artists/gorillaz">
<div style="text-align: center; font-size: 14px;">@gorillaz</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Gorillaz.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/gorillaz).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/gorillaz")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3tuzza9u/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Gorillaz's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/12uilegj) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/12uilegj/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/gorillaz')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated
tokenizer = AutoTokenizer.from_pretrained("huggingartists/gorillaz")
model = AutoModelForCausalLM.from_pretrained("huggingartists/gorillaz")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/gorillaz"], "widget": [{"text": "I am"}]}
|
huggingartists/gorillaz
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/gorillaz",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/gorillaz #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Gorillaz</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@gorillaz</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Gorillaz.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Gorillaz's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
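Both usage paths above can be sketched as follows (a minimal example; assumes the `transformers` library is installed and downloads the model from the Hub on first use):

```python
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# High-level pipeline API for text generation
generator = pipeline('text-generation', model='huggingartists/gorillaz')
# do_sample=True is required when requesting multiple return sequences
outputs = generator("I am", num_return_sequences=5, do_sample=True)

# Or load the tokenizer and model directly
# (AutoModelForCausalLM replaces the deprecated AutoModelWithLMHead)
tokenizer = AutoTokenizer.from_pretrained("huggingartists/gorillaz")
model = AutoModelForCausalLM.from_pretrained("huggingartists/gorillaz")
```

Each entry in `outputs` is a dict with a `generated_text` field.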
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Green Day</div>
<a href="https://genius.com/artists/green-day">
<div style="text-align: center; font-size: 14px;">@green-day</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Green Day.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/green-day).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/green-day")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/22eap04b/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Green Day's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/183da0m9) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/183da0m9/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/green-day')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated
tokenizer = AutoTokenizer.from_pretrained("huggingartists/green-day")
model = AutoModelForCausalLM.from_pretrained("huggingartists/green-day")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/green-day"], "widget": [{"text": "I am"}]}
|
huggingartists/green-day
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/green-day",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/green-day #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Green Day</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@green-day</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Green Day.
The dataset is available here and can be loaded with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Green Day's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Григорий Лепс (Grigory Leps)</div>
<a href="https://genius.com/artists/grigory-leps">
<div style="text-align: center; font-size: 14px;">@grigory-leps</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Григорий Лепс (Grigory Leps).
The dataset is available [here](https://huggingface.co/datasets/huggingartists/grigory-leps) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/grigory-leps")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/32wqexib/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Григорий Лепс (Grigory Leps)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1j0f6nwb) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1j0f6nwb/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/grigory-leps')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/grigory-leps")
model = AutoModelWithLMHead.from_pretrained("huggingartists/grigory-leps")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/grigory-leps"], "widget": [{"text": "I am"}]}
|
huggingartists/grigory-leps
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/grigory-leps",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/grigory-leps #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Григорий Лепс (Grigory Leps)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@grigory-leps</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Григорий Лепс (Grigory Leps).
The dataset is available here and can be loaded with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Григорий Лепс (Grigory Leps)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Grimes</div>
<a href="https://genius.com/artists/grimes">
<div style="text-align: center; font-size: 14px;">@grimes</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Grimes.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/grimes) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/grimes")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3796ng30/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Grimes's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/ourv0tjj) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/ourv0tjj/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/grimes')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/grimes")
model = AutoModelWithLMHead.from_pretrained("huggingartists/grimes")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/grimes"], "widget": [{"text": "I am"}]}
|
huggingartists/grimes
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/grimes",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/grimes #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Grimes</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@grimes</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Grimes.
The dataset is available here and can be loaded with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Grimes's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">GSPD</div>
<a href="https://genius.com/artists/gspd">
<div style="text-align: center; font-size: 14px;">@gspd</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from GSPD.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/gspd) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/gspd")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3jof0sex/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on GSPD's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2nxhrny4) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2nxhrny4/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/gspd')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/gspd")
model = AutoModelWithLMHead.from_pretrained("huggingartists/gspd")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/gspd"], "widget": [{"text": "I am"}]}
|
huggingartists/gspd
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/gspd",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/gspd #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">GSPD</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@gspd</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from GSPD.
The dataset is available here and can be loaded with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on GSPD's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Gunna</div>
<a href="https://genius.com/artists/gunna">
<div style="text-align: center; font-size: 14px;">@gunna</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Gunna.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/gunna) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/gunna")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/vcyblers/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Gunna's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3c1xymw6) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3c1xymw6/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/gunna')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/gunna")
model = AutoModelWithLMHead.from_pretrained("huggingartists/gunna")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/gunna"], "widget": [{"text": "I am"}]}
|
huggingartists/gunna
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/gunna",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/gunna #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Gunna</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@gunna</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Gunna.
The dataset is available here and can be loaded with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Gunna's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">HyunA (현아)</div>
<a href="https://genius.com/artists/hyuna">
<div style="text-align: center; font-size: 14px;">@hyuna</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from HyunA (현아).
The dataset is available [here](https://huggingface.co/datasets/huggingartists/hyuna) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/hyuna")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3uo94mxd/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on HyunA (현아)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1o8t0mq0) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1o8t0mq0/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/hyuna')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/hyuna")
model = AutoModelWithLMHead.from_pretrained("huggingartists/hyuna")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/hyuna"], "widget": [{"text": "I am"}]}
|
huggingartists/hyuna
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/hyuna",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/hyuna #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">HyunA (현아)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@hyuna</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from HyunA (현아).
The dataset is available here and can be loaded with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on HyunA (현아)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">I DONT KNOW HOW BUT THEY FOUND ME</div>
<a href="https://genius.com/artists/i-dont-know-how-but-they-found-me">
<div style="text-align: center; font-size: 14px;">@i-dont-know-how-but-they-found-me</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from I DONT KNOW HOW BUT THEY FOUND ME.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/i-dont-know-how-but-they-found-me).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/i-dont-know-how-but-they-found-me")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1j7uofwh/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on I DONT KNOW HOW BUT THEY FOUND ME's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1abhthz2) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1abhthz2/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/i-dont-know-how-but-they-found-me')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/i-dont-know-how-but-they-found-me")
model = AutoModelWithLMHead.from_pretrained("huggingartists/i-dont-know-how-but-they-found-me")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/i-dont-know-how-but-they-found-me"], "widget": [{"text": "I am"}]}
|
huggingartists/i-dont-know-how-but-they-found-me
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/i-dont-know-how-but-they-found-me",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/i-dont-know-how-but-they-found-me #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">I DONT KNOW HOW BUT THEY FOUND ME</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@i-dont-know-how-but-they-found-me</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from I DONT KNOW HOW BUT THEY FOUND ME.
Dataset is available here.
And can be used with:
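The loading snippet was stripped from this processed copy of the card; a minimal sketch of the same call (see the unprocessed card earlier in this file), with the import deferred so the block can be defined without `datasets` installed. The helper name is illustrative:

```python
def load_lyrics(repo_id: str = "huggingartists/i-dont-know-how-but-they-found-me"):
    # Deferred import: load_dataset downloads the lyrics dataset from the Hub.
    from datasets import load_dataset
    return load_dataset(repo_id)
```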
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on I DONT KNOW HOW BUT THEY FOUND ME's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
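The pipeline snippet was stripped from this processed copy; a sketch of the same call from the unprocessed card above, wrapped in a helper (name illustrative) so nothing is downloaded at definition time:

```python
def generate(prompt: str = "I am", n: int = 5):
    # Deferred import: constructing the pipeline fetches the checkpoint.
    from transformers import pipeline
    generator = pipeline("text-generation",
                         model="huggingartists/i-dont-know-how-but-they-found-me")
    return generator(prompt, num_return_sequences=n)
```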
Or with the Transformers library:
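A sketch of the direct tokenizer/model loading from the unprocessed card above; the original card uses `AutoModelWithLMHead`, for which `AutoModelForCausalLM` is the current replacement. The helper name is illustrative:

```python
def load_model(repo_id: str = "huggingartists/i-dont-know-how-but-they-found-me"):
    # Deferred import keeps the sketch definable offline.
    from transformers import AutoTokenizer, AutoModelForCausalLM
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id)
    return tokenizer, model
```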
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Imagine Dragons</div>
<a href="https://genius.com/artists/imagine-dragons">
<div style="text-align: center; font-size: 14px;">@imagine-dragons</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Imagine Dragons.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/imagine-dragons).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/imagine-dragons")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/dln6ixis/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Imagine Dragons's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3cj3c8z1) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3cj3c8z1/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/imagine-dragons')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/imagine-dragons")
model = AutoModelWithLMHead.from_pretrained("huggingartists/imagine-dragons")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/imagine-dragons"], "widget": [{"text": "I am"}]}
|
huggingartists/imagine-dragons
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/imagine-dragons",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/imagine-dragons #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Imagine Dragons</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@imagine-dragons</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Imagine Dragons.
Dataset is available here.
And can be used with:
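The loading snippet was stripped from this processed copy of the card; a minimal sketch of the same call (see the unprocessed card earlier in this file), with the import deferred so the block can be defined without `datasets` installed. The helper name is illustrative:

```python
def load_lyrics(repo_id: str = "huggingartists/imagine-dragons"):
    # Deferred import: load_dataset downloads the lyrics dataset from the Hub.
    from datasets import load_dataset
    return load_dataset(repo_id)
```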
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Imagine Dragons's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
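The pipeline snippet was stripped from this processed copy; a sketch of the same call from the unprocessed card above, wrapped in a helper (name illustrative) so nothing is downloaded at definition time:

```python
def generate(prompt: str = "I am", n: int = 5):
    # Deferred import: constructing the pipeline fetches the checkpoint.
    from transformers import pipeline
    generator = pipeline("text-generation", model="huggingartists/imagine-dragons")
    return generator(prompt, num_return_sequences=n)
```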
Or with the Transformers library:
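A sketch of the direct tokenizer/model loading from the unprocessed card above; the original card uses `AutoModelWithLMHead`, for which `AutoModelForCausalLM` is the current replacement. The helper name is illustrative:

```python
def load_model(repo_id: str = "huggingartists/imagine-dragons"):
    # Deferred import keeps the sketch definable offline.
    from transformers import AutoTokenizer, AutoModelForCausalLM
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id)
    return tokenizer, model
```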
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">John K. Samson</div>
<a href="https://genius.com/artists/john-k-samson">
<div style="text-align: center; font-size: 14px;">@john-k-samson</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from John K. Samson.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/john-k-samson).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/john-k-samson")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2s15m338/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on John K. Samson's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/18ill893) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/18ill893/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/john-k-samson')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/john-k-samson")
model = AutoModelWithLMHead.from_pretrained("huggingartists/john-k-samson")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/john-k-samson"], "widget": [{"text": "I am"}]}
|
huggingartists/john-k-samson
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/john-k-samson",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/john-k-samson #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">John K. Samson</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@john-k-samson</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from John K. Samson.
Dataset is available here.
And can be used with:
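The loading snippet was stripped from this processed copy of the card; a minimal sketch of the same call (see the unprocessed card earlier in this file), with the import deferred so the block can be defined without `datasets` installed. The helper name is illustrative:

```python
def load_lyrics(repo_id: str = "huggingartists/john-k-samson"):
    # Deferred import: load_dataset downloads the lyrics dataset from the Hub.
    from datasets import load_dataset
    return load_dataset(repo_id)
```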
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on John K. Samson's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
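The pipeline snippet was stripped from this processed copy; a sketch of the same call from the unprocessed card above, wrapped in a helper (name illustrative) so nothing is downloaded at definition time:

```python
def generate(prompt: str = "I am", n: int = 5):
    # Deferred import: constructing the pipeline fetches the checkpoint.
    from transformers import pipeline
    generator = pipeline("text-generation", model="huggingartists/john-k-samson")
    return generator(prompt, num_return_sequences=n)
```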
Or with the Transformers library:
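A sketch of the direct tokenizer/model loading from the unprocessed card above; the original card uses `AutoModelWithLMHead`, for which `AutoModelForCausalLM` is the current replacement. The helper name is illustrative:

```python
def load_model(repo_id: str = "huggingartists/john-k-samson"):
    # Deferred import keeps the sketch definable offline.
    from transformers import AutoTokenizer, AutoModelForCausalLM
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id)
    return tokenizer, model
```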
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">John Lennon</div>
<a href="https://genius.com/artists/john-lennon">
<div style="text-align: center; font-size: 14px;">@john-lennon</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from John Lennon.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/john-lennon).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/john-lennon")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/f3d8fseh/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on John Lennon's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/36mtogkg) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/36mtogkg/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/john-lennon')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/john-lennon")
model = AutoModelWithLMHead.from_pretrained("huggingartists/john-lennon")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/john-lennon"], "widget": [{"text": "I am"}]}
|
huggingartists/john-lennon
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/john-lennon",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/john-lennon #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">John Lennon</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@john-lennon</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from John Lennon.
Dataset is available here.
And can be used with:
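The loading snippet was stripped from this processed copy of the card; a minimal sketch of the same call (see the unprocessed card earlier in this file), with the import deferred so the block can be defined without `datasets` installed. The helper name is illustrative:

```python
def load_lyrics(repo_id: str = "huggingartists/john-lennon"):
    # Deferred import: load_dataset downloads the lyrics dataset from the Hub.
    from datasets import load_dataset
    return load_dataset(repo_id)
```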
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on John Lennon's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
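The pipeline snippet was stripped from this processed copy; a sketch of the same call from the unprocessed card above, wrapped in a helper (name illustrative) so nothing is downloaded at definition time:

```python
def generate(prompt: str = "I am", n: int = 5):
    # Deferred import: constructing the pipeline fetches the checkpoint.
    from transformers import pipeline
    generator = pipeline("text-generation", model="huggingartists/john-lennon")
    return generator(prompt, num_return_sequences=n)
```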
Or with the Transformers library:
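A sketch of the direct tokenizer/model loading from the unprocessed card above; the original card uses `AutoModelWithLMHead`, for which `AutoModelForCausalLM` is the current replacement. The helper name is illustrative:

```python
def load_model(repo_id: str = "huggingartists/john-lennon"):
    # Deferred import keeps the sketch definable offline.
    from transformers import AutoTokenizer, AutoModelForCausalLM
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id)
    return tokenizer, model
```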
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Joji</div>
<a href="https://genius.com/artists/joji">
<div style="text-align: center; font-size: 14px;">@joji</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Joji.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/joji).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/joji")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/ns61e8zi/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Joji's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/jz3ft48t) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/jz3ft48t/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/joji')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/joji")
model = AutoModelWithLMHead.from_pretrained("huggingartists/joji")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/joji"], "widget": [{"text": "I am"}]}
|
huggingartists/joji
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/joji",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/joji #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Joji</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@joji</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Joji.
Dataset is available here.
And can be used with:
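The loading snippet was stripped from this processed copy of the card; a minimal sketch of the same call (see the unprocessed card earlier in this file), with the import deferred so the block can be defined without `datasets` installed. The helper name is illustrative:

```python
def load_lyrics(repo_id: str = "huggingartists/joji"):
    # Deferred import: load_dataset downloads the lyrics dataset from the Hub.
    from datasets import load_dataset
    return load_dataset(repo_id)
```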
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Joji's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
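The pipeline snippet was stripped from this processed copy; a sketch of the same call from the unprocessed card above, wrapped in a helper (name illustrative) so nothing is downloaded at definition time:

```python
def generate(prompt: str = "I am", n: int = 5):
    # Deferred import: constructing the pipeline fetches the checkpoint.
    from transformers import pipeline
    generator = pipeline("text-generation", model="huggingartists/joji")
    return generator(prompt, num_return_sequences=n)
```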
Or with the Transformers library:
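A sketch of the direct tokenizer/model loading from the unprocessed card above; the original card uses `AutoModelWithLMHead`, for which `AutoModelForCausalLM` is the current replacement. The helper name is illustrative:

```python
def load_model(repo_id: str = "huggingartists/joji"):
    # Deferred import keeps the sketch definable offline.
    from transformers import AutoTokenizer, AutoModelForCausalLM
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id)
    return tokenizer, model
```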
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Joni Mitchell</div>
<a href="https://genius.com/artists/joni-mitchell">
<div style="text-align: center; font-size: 14px;">@joni-mitchell</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Joni Mitchell.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/joni-mitchell).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/joni-mitchell")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1m5n59kk/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Joni Mitchell's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/34saoh5x) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/34saoh5x/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/joni-mitchell')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/joni-mitchell")
model = AutoModelWithLMHead.from_pretrained("huggingartists/joni-mitchell")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/joni-mitchell"], "widget": [{"text": "I am"}]}
|
huggingartists/joni-mitchell
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/joni-mitchell",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/joni-mitchell #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Joni Mitchell</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@joni-mitchell</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Joni Mitchell.
Dataset is available here.
And can be used with:
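The loading snippet was stripped from this processed copy of the card; a minimal sketch of the same call (see the unprocessed card earlier in this file), with the import deferred so the block can be defined without `datasets` installed. The helper name is illustrative:

```python
def load_lyrics(repo_id: str = "huggingartists/joni-mitchell"):
    # Deferred import: load_dataset downloads the lyrics dataset from the Hub.
    from datasets import load_dataset
    return load_dataset(repo_id)
```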
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Joni Mitchell's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
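The pipeline snippet was stripped from this processed copy; a sketch of the same call from the unprocessed card above, wrapped in a helper (name illustrative) so nothing is downloaded at definition time:

```python
def generate(prompt: str = "I am", n: int = 5):
    # Deferred import: constructing the pipeline fetches the checkpoint.
    from transformers import pipeline
    generator = pipeline("text-generation", model="huggingartists/joni-mitchell")
    return generator(prompt, num_return_sequences=n)
```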
Or with the Transformers library:
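A sketch of the direct tokenizer/model loading from the unprocessed card above; the original card uses `AutoModelWithLMHead`, for which `AutoModelForCausalLM` is the current replacement. The helper name is illustrative:

```python
def load_model(repo_id: str = "huggingartists/joni-mitchell"):
    # Deferred import keeps the sketch definable offline.
    from transformers import AutoTokenizer, AutoModelForCausalLM
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id)
    return tokenizer, model
```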
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Kanye West</div>
<a href="https://genius.com/artists/kanye-west">
<div style="text-align: center; font-size: 14px;">@kanye-west</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Kanye West.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/kanye-west).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/kanye-west")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/hl7afoso/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Kanye West's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/28dw8m5v) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/28dw8m5v/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/kanye-west')
generator("I am", num_return_sequences=5)
```
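The pipeline call above relies on the model's default decoding settings; standard sampling arguments can also be passed through the pipeline (the values below are illustrative, not a recommendation from the card):

```python
from transformers import pipeline, set_seed

set_seed(42)  # make sampling reproducible across runs
generator = pipeline("text-generation", model="huggingartists/kanye-west")
results = generator(
    "I am",
    max_length=50,            # total length in tokens, prompt included
    do_sample=True,           # sample instead of greedy decoding
    temperature=0.9,          # soften the next-token distribution
    top_k=50,                 # restrict sampling to the 50 most likely tokens
    num_return_sequences=3,
)
for result in results:
    print(result["generated_text"])
```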
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated
tokenizer = AutoTokenizer.from_pretrained("huggingartists/kanye-west")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kanye-west")
```
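Loading alone does not produce lyrics. A minimal hand-rolled generation sketch, using `AutoModelForCausalLM` (the current replacement for the deprecated `AutoModelWithLMHead`) with illustrative decoding settings:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("huggingartists/kanye-west")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kanye-west")

inputs = tokenizer("I am", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=40,                    # length of the continuation
    do_sample=True,                       # sample instead of greedy decoding
    top_p=0.95,                           # nucleus sampling
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 defines no pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```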
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/kanye-west"], "widget": [{"text": "I am"}]}
|
huggingartists/kanye-west
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/kanye-west",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/kanye-west #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Kanye West</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@kanye-west</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Kanye West.
The dataset is available here and can be used with the datasets library.
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Kanye West's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation, or with the Transformers library.
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Каста (Kasta)</div>
<a href="https://genius.com/artists/kasta">
<div style="text-align: center; font-size: 14px;">@kasta</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Каста (Kasta).
The dataset is available [here](https://huggingface.co/datasets/huggingartists/kasta) and can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/kasta")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3k79xvbx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Каста (Kasta)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1rphmch0) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1rphmch0/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/kasta')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated
tokenizer = AutoTokenizer.from_pretrained("huggingartists/kasta")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kasta")
```
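Loading alone does not produce lyrics. A minimal hand-rolled generation sketch, using `AutoModelForCausalLM` (the current replacement for the deprecated `AutoModelWithLMHead`) with illustrative decoding settings:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("huggingartists/kasta")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kasta")

inputs = tokenizer("I am", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=40,                    # length of the continuation
    do_sample=True,                       # sample instead of greedy decoding
    top_p=0.95,                           # nucleus sampling
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 defines no pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```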
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/kasta"], "widget": [{"text": "I am"}]}
|
huggingartists/kasta
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/kasta",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/kasta #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Каста (Kasta)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@kasta</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Каста (Kasta).
The dataset is available here and can be used with the datasets library.
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Каста (Kasta)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation, or with the Transformers library.
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Kehlani</div>
<a href="https://genius.com/artists/kehlani">
<div style="text-align: center; font-size: 14px;">@kehlani</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Kehlani.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/kehlani) and can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/kehlani")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3t2b2m5y/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Kehlani's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/35pweb11) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/35pweb11/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/kehlani')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated
tokenizer = AutoTokenizer.from_pretrained("huggingartists/kehlani")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kehlani")
```
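Loading alone does not produce lyrics. A minimal hand-rolled generation sketch, using `AutoModelForCausalLM` (the current replacement for the deprecated `AutoModelWithLMHead`) with illustrative decoding settings:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("huggingartists/kehlani")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kehlani")

inputs = tokenizer("I am", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=40,                    # length of the continuation
    do_sample=True,                       # sample instead of greedy decoding
    top_p=0.95,                           # nucleus sampling
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 defines no pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```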
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/kehlani"], "widget": [{"text": "I am"}]}
|
huggingartists/kehlani
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/kehlani",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/kehlani #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Kehlani</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@kehlani</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Kehlani.
The dataset is available here and can be used with the datasets library.
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Kehlani's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation, or with the Transformers library.
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Кипелов (Kipelov)</div>
<a href="https://genius.com/artists/kipelov">
<div style="text-align: center; font-size: 14px;">@kipelov</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Кипелов (Kipelov).
The dataset is available [here](https://huggingface.co/datasets/huggingartists/kipelov) and can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/kipelov")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/225m5y65/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Кипелов (Kipelov)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/38es269x) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/38es269x/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/kipelov')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated
tokenizer = AutoTokenizer.from_pretrained("huggingartists/kipelov")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kipelov")
```
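Loading alone does not produce lyrics. A minimal hand-rolled generation sketch, using `AutoModelForCausalLM` (the current replacement for the deprecated `AutoModelWithLMHead`) with illustrative decoding settings:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("huggingartists/kipelov")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kipelov")

inputs = tokenizer("I am", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=40,                    # length of the continuation
    do_sample=True,                       # sample instead of greedy decoding
    top_p=0.95,                           # nucleus sampling
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 defines no pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```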
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/kipelov"], "widget": [{"text": "I am"}]}
|
huggingartists/kipelov
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/kipelov",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/kipelov #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Кипелов (Kipelov)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@kipelov</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Кипелов (Kipelov).
The dataset is available here and can be used with the datasets library.
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Кипелов (Kipelov)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation, or with the Transformers library.
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Кишлак (Kishlak)</div>
<a href="https://genius.com/artists/kishlak">
<div style="text-align: center; font-size: 14px;">@kishlak</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Кишлак (Kishlak).
The dataset is available [here](https://huggingface.co/datasets/huggingartists/kishlak) and can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/kishlak")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2654f8ic/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Кишлак (Kishlak)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/12gu37uv) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/12gu37uv/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/kishlak')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated
tokenizer = AutoTokenizer.from_pretrained("huggingartists/kishlak")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kishlak")
```
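Loading alone does not produce lyrics. A minimal hand-rolled generation sketch, using `AutoModelForCausalLM` (the current replacement for the deprecated `AutoModelWithLMHead`) with illustrative decoding settings:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("huggingartists/kishlak")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kishlak")

inputs = tokenizer("I am", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=40,                    # length of the continuation
    do_sample=True,                       # sample instead of greedy decoding
    top_p=0.95,                           # nucleus sampling
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 defines no pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```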
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/kishlak"], "widget": [{"text": "I am"}]}
|
huggingartists/kishlak
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/kishlak",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/kishlak #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Кишлак (Kishlak)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@kishlak</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Кишлак (Kishlak).
The dataset is available here and can be used with the datasets library.
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Кишлак (Kishlak)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation, or with the Transformers library.
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">kizaru</div>
<a href="https://genius.com/artists/kizaru">
<div style="text-align: center; font-size: 14px;">@kizaru</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from kizaru.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/kizaru) and can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/kizaru")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2goru0fu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on kizaru's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1zni18k7) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1zni18k7/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/kizaru')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated
tokenizer = AutoTokenizer.from_pretrained("huggingartists/kizaru")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kizaru")
```
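Loading alone does not produce lyrics. A minimal hand-rolled generation sketch, using `AutoModelForCausalLM` (the current replacement for the deprecated `AutoModelWithLMHead`) with illustrative decoding settings:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("huggingartists/kizaru")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kizaru")

inputs = tokenizer("I am", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=40,                    # length of the continuation
    do_sample=True,                       # sample instead of greedy decoding
    top_p=0.95,                           # nucleus sampling
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 defines no pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```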
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/kizaru"], "widget": [{"text": "I am"}]}
|
huggingartists/kizaru
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/kizaru",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/kizaru #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">kizaru</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@kizaru</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from kizaru.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on kizaru's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
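The two usage snippets referenced above, restored from the full card earlier in this file (with `AutoModelForCausalLM` in place of the deprecated `AutoModelWithLMHead`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# Text generation via the high-level pipeline API
generator = pipeline('text-generation', model='huggingartists/kizaru')
generator("I am", num_return_sequences=5)

# Or load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/kizaru")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kizaru")
```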
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Krechet</div>
<a href="https://genius.com/artists/krechet">
<div style="text-align: center; font-size: 14px;">@krechet</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Krechet.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/krechet).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/krechet")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1c2yk38s/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Krechet's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/39bxkroc) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/39bxkroc/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/krechet')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
# AutoModelForCausalLM supersedes the deprecated AutoModelWithLMHead
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("huggingartists/krechet")
model = AutoModelForCausalLM.from_pretrained("huggingartists/krechet")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/krechet"], "widget": [{"text": "I am"}]}
|
huggingartists/krechet
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/krechet",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/krechet #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Krechet</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@krechet</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Krechet.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Krechet's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
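The two usage snippets referenced above, restored from the full card earlier in this file (with `AutoModelForCausalLM` in place of the deprecated `AutoModelWithLMHead`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# Text generation via the high-level pipeline API
generator = pipeline('text-generation', model='huggingartists/krechet')
generator("I am", num_return_sequences=5)

# Or load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/krechet")
model = AutoModelForCausalLM.from_pretrained("huggingartists/krechet")
```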
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Kurt Cobain</div>
<a href="https://genius.com/artists/kurt-cobain">
<div style="text-align: center; font-size: 14px;">@kurt-cobain</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Kurt Cobain.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/kurt-cobain).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/kurt-cobain")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/tjfuj6tr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Kurt Cobain's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3enopofm) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3enopofm/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/kurt-cobain')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
# AutoModelForCausalLM supersedes the deprecated AutoModelWithLMHead
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("huggingartists/kurt-cobain")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kurt-cobain")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/kurt-cobain"], "widget": [{"text": "I am"}]}
|
huggingartists/kurt-cobain
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/kurt-cobain",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/kurt-cobain #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Kurt Cobain</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@kurt-cobain</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Kurt Cobain.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Kurt Cobain's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
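The two usage snippets referenced above, restored from the full card earlier in this file (with `AutoModelForCausalLM` in place of the deprecated `AutoModelWithLMHead`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# Text generation via the high-level pipeline API
generator = pipeline('text-generation', model='huggingartists/kurt-cobain')
generator("I am", num_return_sequences=5)

# Or load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/kurt-cobain")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kurt-cobain")
```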
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lady Gaga</div>
<a href="https://genius.com/artists/lady-gaga">
<div style="text-align: center; font-size: 14px;">@lady-gaga</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Lady Gaga.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/lady-gaga).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lady-gaga")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/17c0d4ej/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Lady Gaga's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2j7yp9qd) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2j7yp9qd/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/lady-gaga')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
# AutoModelForCausalLM supersedes the deprecated AutoModelWithLMHead
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lady-gaga")
model = AutoModelForCausalLM.from_pretrained("huggingartists/lady-gaga")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/lady-gaga"], "widget": [{"text": "I am"}]}
|
huggingartists/lady-gaga
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/lady-gaga",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/lady-gaga #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lady Gaga</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@lady-gaga</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Lady Gaga.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Lady Gaga's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
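The two usage snippets referenced above, restored from the full card earlier in this file (with `AutoModelForCausalLM` in place of the deprecated `AutoModelWithLMHead`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# Text generation via the high-level pipeline API
generator = pipeline('text-generation', model='huggingartists/lady-gaga')
generator("I am", num_return_sequences=5)

# Or load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lady-gaga")
model = AutoModelForCausalLM.from_pretrained("huggingartists/lady-gaga")
```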
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lazy Jay</div>
<a href="https://genius.com/artists/lazy-jay">
<div style="text-align: center; font-size: 14px;">@lazy-jay</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Lazy Jay.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/lazy-jay).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lazy-jay")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/tlb735a4/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Lazy Jay's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/36z52xfj) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/36z52xfj/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/lazy-jay')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
# AutoModelForCausalLM supersedes the deprecated AutoModelWithLMHead
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lazy-jay")
model = AutoModelForCausalLM.from_pretrained("huggingartists/lazy-jay")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/lazy-jay"], "widget": [{"text": "I am"}]}
|
huggingartists/lazy-jay
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/lazy-jay",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/lazy-jay #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lazy Jay</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@lazy-jay</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Lazy Jay.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Lazy Jay's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
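The two usage snippets referenced above, restored from the full card earlier in this file (with `AutoModelForCausalLM` in place of the deprecated `AutoModelWithLMHead`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# Text generation via the high-level pipeline API
generator = pipeline('text-generation', model='huggingartists/lazy-jay')
generator("I am", num_return_sequences=5)

# Or load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lazy-jay")
model = AutoModelForCausalLM.from_pretrained("huggingartists/lazy-jay")
```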
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Led Zeppelin</div>
<a href="https://genius.com/artists/led-zeppelin">
<div style="text-align: center; font-size: 14px;">@led-zeppelin</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Led Zeppelin.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/led-zeppelin).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/led-zeppelin")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/cpexpb1w/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Led Zeppelin's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/bna2epba) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/bna2epba/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/led-zeppelin')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
# AutoModelForCausalLM supersedes the deprecated AutoModelWithLMHead
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("huggingartists/led-zeppelin")
model = AutoModelForCausalLM.from_pretrained("huggingartists/led-zeppelin")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/led-zeppelin"], "widget": [{"text": "I am"}]}
|
huggingartists/led-zeppelin
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/led-zeppelin",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/led-zeppelin #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Led Zeppelin</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@led-zeppelin</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Led Zeppelin.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Led Zeppelin's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
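The two usage snippets referenced above, restored from the full card earlier in this file (with `AutoModelForCausalLM` in place of the deprecated `AutoModelWithLMHead`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# Text generation via the high-level pipeline API
generator = pipeline('text-generation', model='huggingartists/led-zeppelin')
generator("I am", num_return_sequences=5)

# Or load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/led-zeppelin")
model = AutoModelForCausalLM.from_pretrained("huggingartists/led-zeppelin")
```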
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lil Baby</div>
<a href="https://genius.com/artists/lil-baby">
<div style="text-align: center; font-size: 14px;">@lil-baby</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Lil Baby.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/lil-baby).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lil-baby")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/vueaothh/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Lil Baby's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/257bod1h) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/257bod1h/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/lil-baby')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
# AutoModelForCausalLM supersedes the deprecated AutoModelWithLMHead
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lil-baby")
model = AutoModelForCausalLM.from_pretrained("huggingartists/lil-baby")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/lil-baby"], "widget": [{"text": "I am"}]}
|
huggingartists/lil-baby
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/lil-baby",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/lil-baby #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lil Baby</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@lil-baby</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Lil Baby.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Lil Baby's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
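The two usage snippets referenced above, restored from the full card earlier in this file (with `AutoModelForCausalLM` in place of the deprecated `AutoModelWithLMHead`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# Text generation via the high-level pipeline API
generator = pipeline('text-generation', model='huggingartists/lil-baby')
generator("I am", num_return_sequences=5)

# Or load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lil-baby")
model = AutoModelForCausalLM.from_pretrained("huggingartists/lil-baby")
```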
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lil Nas X</div>
<a href="https://genius.com/artists/lil-nas-x">
<div style="text-align: center; font-size: 14px;">@lil-nas-x</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Lil Nas X.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/lil-nas-x).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lil-nas-x")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/n5s2tj7p/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Lil Nas X's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/334lnf7p) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/334lnf7p/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/lil-nas-x')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lil-nas-x")
model = AutoModelForCausalLM.from_pretrained("huggingartists/lil-nas-x")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/lil-nas-x"], "widget": [{"text": "I am"}]}
|
huggingartists/lil-nas-x
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/lil-nas-x",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/lil-nas-x #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lil Nas X</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@lil-nas-x</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Lil Nas X.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Lil Nas X's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
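The elided usage snippets can be sketched as follows, mirroring the full card above but using `AutoModelForCausalLM`, the current replacement for the deprecated `AutoModelWithLMHead`:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

# High-level pipeline API for text generation
generator = pipeline('text-generation', model='huggingartists/lil-nas-x')
outputs = generator("I am", num_return_sequences=5, do_sample=True)

# Or load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lil-nas-x")
model = AutoModelForCausalLM.from_pretrained("huggingartists/lil-nas-x")
```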
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lil Peep</div>
<a href="https://genius.com/artists/lil-peep">
<div style="text-align: center; font-size: 14px;">@lil-peep</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Lil Peep.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/lil-peep).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lil-peep")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/39q6kspr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Lil Peep's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/g0nxk974) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/g0nxk974/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/lil-peep')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lil-peep")
model = AutoModelForCausalLM.from_pretrained("huggingartists/lil-peep")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/lil-peep"], "widget": [{"text": "I am"}]}
|
huggingartists/lil-peep
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/lil-peep",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/lil-peep #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lil Peep</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@lil-peep</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Lil Peep.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Lil Peep's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
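The elided usage snippets can be sketched as follows, mirroring the full card above but using `AutoModelForCausalLM`, the current replacement for the deprecated `AutoModelWithLMHead`:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

# High-level pipeline API for text generation
generator = pipeline('text-generation', model='huggingartists/lil-peep')
outputs = generator("I am", num_return_sequences=5, do_sample=True)

# Or load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lil-peep")
model = AutoModelForCausalLM.from_pretrained("huggingartists/lil-peep")
```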
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lil Uzi Vert</div>
<a href="https://genius.com/artists/lil-uzi-vert">
<div style="text-align: center; font-size: 14px;">@lil-uzi-vert</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Lil Uzi Vert.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/lil-uzi-vert).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lil-uzi-vert")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/14mmkidw/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Lil Uzi Vert's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3s5iqd7v) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3s5iqd7v/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/lil-uzi-vert')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lil-uzi-vert")
model = AutoModelForCausalLM.from_pretrained("huggingartists/lil-uzi-vert")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/lil-uzi-vert"], "widget": [{"text": "I am"}]}
|
huggingartists/lil-uzi-vert
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/lil-uzi-vert",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/lil-uzi-vert #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lil Uzi Vert</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@lil-uzi-vert</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Lil Uzi Vert.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Lil Uzi Vert's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
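The elided usage snippets can be sketched as follows, mirroring the full card above but using `AutoModelForCausalLM`, the current replacement for the deprecated `AutoModelWithLMHead`:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

# High-level pipeline API for text generation
generator = pipeline('text-generation', model='huggingartists/lil-uzi-vert')
outputs = generator("I am", num_return_sequences=5, do_sample=True)

# Or load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lil-uzi-vert")
model = AutoModelForCausalLM.from_pretrained("huggingartists/lil-uzi-vert")
```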
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Linkin Park</div>
<a href="https://genius.com/artists/linkin-park">
<div style="text-align: center; font-size: 14px;">@linkin-park</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Linkin Park.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/linkin-park).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/linkin-park")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3mtr0u4z/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Linkin Park's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/fxn4brd6) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/fxn4brd6/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/linkin-park')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated
tokenizer = AutoTokenizer.from_pretrained("huggingartists/linkin-park")
model = AutoModelForCausalLM.from_pretrained("huggingartists/linkin-park")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/linkin-park"], "widget": [{"text": "I am"}]}
|
huggingartists/linkin-park
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/linkin-park",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/linkin-park #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Linkin Park</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@linkin-park</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Linkin Park.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Linkin Park's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
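The elided usage snippets can be sketched as follows, mirroring the full card above but using `AutoModelForCausalLM`, the current replacement for the deprecated `AutoModelWithLMHead`:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

# High-level pipeline API for text generation
generator = pipeline('text-generation', model='huggingartists/linkin-park')
outputs = generator("I am", num_return_sequences=5, do_sample=True)

# Or load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/linkin-park")
model = AutoModelForCausalLM.from_pretrained("huggingartists/linkin-park")
```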
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Little Big</div>
<a href="https://genius.com/artists/little-big">
<div style="text-align: center; font-size: 14px;">@little-big</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Little Big.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/little-big).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/little-big")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2rjstm9q/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Little Big's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/289c46fn) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/289c46fn/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/little-big')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated
tokenizer = AutoTokenizer.from_pretrained("huggingartists/little-big")
model = AutoModelForCausalLM.from_pretrained("huggingartists/little-big")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/little-big"], "widget": [{"text": "I am"}]}
|
huggingartists/little-big
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/little-big",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/little-big #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Little Big</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@little-big</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Little Big.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Little Big's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
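The elided usage snippets can be sketched as follows, mirroring the full card above but using `AutoModelForCausalLM`, the current replacement for the deprecated `AutoModelWithLMHead`:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

# High-level pipeline API for text generation
generator = pipeline('text-generation', model='huggingartists/little-big')
outputs = generator("I am", num_return_sequences=5, do_sample=True)

# Or load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/little-big")
model = AutoModelForCausalLM.from_pretrained("huggingartists/little-big")
```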
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Logic</div>
<a href="https://genius.com/artists/logic">
<div style="text-align: center; font-size: 14px;">@logic</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Logic.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/logic).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/logic")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2rp89nd3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Logic's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/25a9752b) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/25a9752b/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/logic')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated
tokenizer = AutoTokenizer.from_pretrained("huggingartists/logic")
model = AutoModelForCausalLM.from_pretrained("huggingartists/logic")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/logic"], "widget": [{"text": "I am"}]}
|
huggingartists/logic
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/logic",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/logic #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Logic</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@logic</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Logic.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Logic's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
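The elided usage snippets can be sketched as follows, mirroring the full card above but using `AutoModelForCausalLM`, the current replacement for the deprecated `AutoModelWithLMHead`:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

# High-level pipeline API for text generation
generator = pipeline('text-generation', model='huggingartists/logic')
outputs = generator("I am", num_return_sequences=5, do_sample=True)

# Or load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/logic")
model = AutoModelForCausalLM.from_pretrained("huggingartists/logic")
```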
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Loud Luxury</div>
<a href="https://genius.com/artists/loud-luxury">
<div style="text-align: center; font-size: 14px;">@loud-luxury</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Loud Luxury.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/loud-luxury).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/loud-luxury")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2a6kq74a/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Loud Luxury's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2l3op3mf) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2l3op3mf/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/loud-luxury')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead  # AutoModelWithLMHead is deprecated; AutoModelForCausalLM is the current equivalent
tokenizer = AutoTokenizer.from_pretrained("huggingartists/loud-luxury")
model = AutoModelWithLMHead.from_pretrained("huggingartists/loud-luxury")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/loud-luxury"], "widget": [{"text": "I am"}]}
|
huggingartists/loud-luxury
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/loud-luxury",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/loud-luxury #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Loud Luxury</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@loud-luxury</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Loud Luxury.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Loud Luxury's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
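The code examples were stripped from this plain-text copy; a sketch restoring them (model id `huggingartists/loud-luxury` per this card's metadata, with the modern `AutoModelForCausalLM` in place of the deprecated `AutoModelWithLMHead`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# Generate directly with a text-generation pipeline
generator = pipeline('text-generation', model='huggingartists/loud-luxury')
outputs = generator("I am", num_return_sequences=5)

# Or load the tokenizer and model explicitly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/loud-luxury")
model = AutoModelForCausalLM.from_pretrained("huggingartists/loud-luxury")
```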
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">LoveRance</div>
<a href="https://genius.com/artists/loverance">
<div style="text-align: center; font-size: 14px;">@loverance</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from LoveRance.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/loverance).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/loverance")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2cr3cjd1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on LoveRance's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/18xbgyqf) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/18xbgyqf/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/loverance')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead  # AutoModelWithLMHead is deprecated; AutoModelForCausalLM is the current equivalent
tokenizer = AutoTokenizer.from_pretrained("huggingartists/loverance")
model = AutoModelWithLMHead.from_pretrained("huggingartists/loverance")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/loverance"], "widget": [{"text": "I am"}]}
|
huggingartists/loverance
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/loverance",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/loverance #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">LoveRance</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@loverance</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from LoveRance.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on LoveRance's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
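The code examples were stripped from this plain-text copy; a sketch restoring them (model id `huggingartists/loverance` per this card's metadata, with the modern `AutoModelForCausalLM` in place of the deprecated `AutoModelWithLMHead`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# Generate directly with a text-generation pipeline
generator = pipeline('text-generation', model='huggingartists/loverance')
outputs = generator("I am", num_return_sequences=5)

# Or load the tokenizer and model explicitly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/loverance")
model = AutoModelForCausalLM.from_pretrained("huggingartists/loverance")
```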
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">LOVV66</div>
<a href="https://genius.com/artists/lovv66">
<div style="text-align: center; font-size: 14px;">@lovv66</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from LOVV66.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/lovv66).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lovv66")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1t6a2fxs/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on LOVV66's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1de08pf6) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1de08pf6/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/lovv66')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead  # AutoModelWithLMHead is deprecated; AutoModelForCausalLM is the current equivalent
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lovv66")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lovv66")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/lovv66"], "widget": [{"text": "I am"}]}
|
huggingartists/lovv66
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/lovv66",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/lovv66 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">LOVV66</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@lovv66</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from LOVV66.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on LOVV66's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
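The code examples were stripped from this plain-text copy; a sketch restoring them (model id `huggingartists/lovv66` per this card's metadata, with the modern `AutoModelForCausalLM` in place of the deprecated `AutoModelWithLMHead`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# Generate directly with a text-generation pipeline
generator = pipeline('text-generation', model='huggingartists/lovv66')
outputs = generator("I am", num_return_sequences=5)

# Or load the tokenizer and model explicitly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lovv66")
model = AutoModelForCausalLM.from_pretrained("huggingartists/lovv66")
```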
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lumen</div>
<a href="https://genius.com/artists/lumen">
<div style="text-align: center; font-size: 14px;">@lumen</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Lumen.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/lumen).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lumen")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2fkqbnvl/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Lumen's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1vhfm4ch) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1vhfm4ch/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/lumen')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead  # AutoModelWithLMHead is deprecated; AutoModelForCausalLM is the current equivalent
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lumen")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lumen")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/lumen"], "widget": [{"text": "I am"}]}
|
huggingartists/lumen
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/lumen",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/lumen #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lumen</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@lumen</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Lumen.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Lumen's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
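The code examples were stripped from this plain-text copy; a sketch restoring them (model id `huggingartists/lumen` per this card's metadata, with the modern `AutoModelForCausalLM` in place of the deprecated `AutoModelWithLMHead`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# Generate directly with a text-generation pipeline
generator = pipeline('text-generation', model='huggingartists/lumen')
outputs = generator("I am", num_return_sequences=5)

# Or load the tokenizer and model explicitly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lumen")
model = AutoModelForCausalLM.from_pretrained("huggingartists/lumen")
```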
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Ляпис Трубецкой (Lyapis Trubetskoy)</div>
<a href="https://genius.com/artists/lyapis-trubetskoy">
<div style="text-align: center; font-size: 14px;">@lyapis-trubetskoy</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Ляпис Трубецкой (Lyapis Trubetskoy).
Dataset is available [here](https://huggingface.co/datasets/huggingartists/lyapis-trubetskoy).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lyapis-trubetskoy")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1ycs0usm/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Ляпис Трубецкой (Lyapis Trubetskoy)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/uz1xtq0k) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/uz1xtq0k/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/lyapis-trubetskoy')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead  # AutoModelWithLMHead is deprecated; AutoModelForCausalLM is the current equivalent
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lyapis-trubetskoy")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lyapis-trubetskoy")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/lyapis-trubetskoy"], "widget": [{"text": "I am"}]}
|
huggingartists/lyapis-trubetskoy
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/lyapis-trubetskoy",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/lyapis-trubetskoy #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Ляпис Трубецкой (Lyapis Trubetskoy)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@lyapis-trubetskoy</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Ляпис Трубецкой (Lyapis Trubetskoy).
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Ляпис Трубецкой (Lyapis Trubetskoy)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
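The code examples were stripped from this plain-text copy; a sketch restoring them (model id `huggingartists/lyapis-trubetskoy` per this card's metadata, with the modern `AutoModelForCausalLM` in place of the deprecated `AutoModelWithLMHead`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# Generate directly with a text-generation pipeline
generator = pipeline('text-generation', model='huggingartists/lyapis-trubetskoy')
outputs = generator("I am", num_return_sequences=5)

# Or load the tokenizer and model explicitly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lyapis-trubetskoy")
model = AutoModelForCausalLM.from_pretrained("huggingartists/lyapis-trubetskoy")
```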
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">MACAN</div>
<a href="https://genius.com/artists/macan">
<div style="text-align: center; font-size: 14px;">@macan</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from MACAN.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/macan).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/macan")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3u3vx3xp/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on MACAN's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/23krf2tu) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/23krf2tu/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/macan')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead  # AutoModelWithLMHead is deprecated; AutoModelForCausalLM is the current equivalent
tokenizer = AutoTokenizer.from_pretrained("huggingartists/macan")
model = AutoModelWithLMHead.from_pretrained("huggingartists/macan")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/macan"], "widget": [{"text": "I am"}]}
|
huggingartists/macan
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/macan",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/macan #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">MACAN</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@macan</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from MACAN.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on MACAN's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
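The code examples were stripped from this plain-text copy; a sketch restoring them (model id `huggingartists/macan` per this card's metadata, with the modern `AutoModelForCausalLM` in place of the deprecated `AutoModelWithLMHead`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

# Generate directly with a text-generation pipeline
generator = pipeline('text-generation', model='huggingartists/macan')
outputs = generator("I am", num_return_sequences=5)

# Or load the tokenizer and model explicitly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/macan")
model = AutoModelForCausalLM.from_pretrained("huggingartists/macan")
```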
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Machine Gun Kelly</div>
<a href="https://genius.com/artists/machine-gun-kelly">
<div style="text-align: center; font-size: 14px;">@machine-gun-kelly</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Machine Gun Kelly.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/machine-gun-kelly).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/machine-gun-kelly")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/33f2ce6m/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Machine Gun Kelly's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2bbn6fvb) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2bbn6fvb/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/machine-gun-kelly')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated

tokenizer = AutoTokenizer.from_pretrained("huggingartists/machine-gun-kelly")
model = AutoModelForCausalLM.from_pretrained("huggingartists/machine-gun-kelly")
```
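Under the hood, the `text-generation` pipeline builds each continuation token by token, sampling from the model's temperature-scaled next-token distribution; `num_return_sequences=5` simply repeats that sampling. A minimal pure-Python sketch of temperature sampling (toy logits over a hypothetical 4-token vocabulary, not the real GPT-2 head):

```python
import math
import random

def sample_token(logits, temperature=1.0, rng=random):
    """Sample one token index from a softmax over temperature-scaled logits."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1

# Toy example: one clearly dominant logit (index 1).
logits = [0.1, 5.0, 0.2, 0.3]
random.seed(0)
samples = [sample_token(logits, temperature=0.7) for _ in range(100)]
# Lower temperature sharpens the distribution, so the dominant token wins most draws.
print(samples.count(1) > 80)
```

Raising `temperature` above 1.0 flattens the distribution and makes the generated lyrics more surprising; lowering it makes them more repetitive.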
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/machine-gun-kelly"], "widget": [{"text": "I am"}]}
|
huggingartists/machine-gun-kelly
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/machine-gun-kelly",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/machine-gun-kelly #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Machine Gun Kelly</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@machine-gun-kelly</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Machine Gun Kelly.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Machine Gun Kelly's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Madonna</div>
<a href="https://genius.com/artists/madonna">
<div style="text-align: center; font-size: 14px;">@madonna</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Madonna.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/madonna).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/madonna")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2abhif57/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Madonna's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2eok9fmu) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2eok9fmu/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/madonna')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated

tokenizer = AutoTokenizer.from_pretrained("huggingartists/madonna")
model = AutoModelForCausalLM.from_pretrained("huggingartists/madonna")
```
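With a tokenizer and model in hand, generation is a loop: encode the prompt, repeatedly pick a next token from the model's scores, and decode. A toy greedy-decoding sketch of that loop (a hypothetical bigram score table stands in for the fine-tuned GPT-2):

```python
# Toy "model": maps the previous token to scores over possible next tokens.
BIGRAM_SCORES = {
    "i":    {"am": 2.0, "was": 1.0},
    "am":   {"a": 1.5, "the": 0.5},
    "a":    {"star": 3.0, "song": 1.0},
    "star": {"<eos>": 1.0},
}

def greedy_generate(prompt_tokens, max_new_tokens=10):
    """Extend the prompt by repeatedly taking the highest-scoring next token."""
    tokens = list(prompt_tokens)
    for _ in range(max_new_tokens):
        scores = BIGRAM_SCORES.get(tokens[-1])
        if not scores:
            break
        nxt = max(scores, key=scores.get)  # greedy: argmax over next-token scores
        if nxt == "<eos>":
            break
        tokens.append(nxt)
    return " ".join(tokens)

print(greedy_generate(["i"]))  # → "i am a star"
```

The real model replaces the bigram table with a full transformer forward pass, and the pipeline usually samples rather than taking the argmax, but the encode–predict–decode loop is the same shape.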
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/madonna"], "widget": [{"text": "I am"}]}
|
huggingartists/madonna
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/madonna",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/madonna #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Madonna</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@madonna</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Madonna.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Madonna's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Marillion</div>
<a href="https://genius.com/artists/marillion">
<div style="text-align: center; font-size: 14px;">@marillion</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Marillion.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/marillion).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/marillion")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/bajnt52i/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Marillion's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/wi2lgudb) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/wi2lgudb/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/marillion')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated

tokenizer = AutoTokenizer.from_pretrained("huggingartists/marillion")
model = AutoModelForCausalLM.from_pretrained("huggingartists/marillion")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/marillion"], "widget": [{"text": "I am"}]}
|
huggingartists/marillion
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/marillion",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/marillion #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Marillion</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@marillion</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Marillion.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Marillion's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Maroon 5</div>
<a href="https://genius.com/artists/maroon-5">
<div style="text-align: center; font-size: 14px;">@maroon-5</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Maroon 5.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/maroon-5).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/maroon-5")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/38629b22/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Maroon 5's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2ylk8pym) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2ylk8pym/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/maroon-5')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM  # AutoModelWithLMHead is deprecated

tokenizer = AutoTokenizer.from_pretrained("huggingartists/maroon-5")
model = AutoModelForCausalLM.from_pretrained("huggingartists/maroon-5")
```
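The tokenizer's job is to map text to integer ids and back; the model only ever sees ids. A toy whitespace tokenizer (hypothetical, far simpler than GPT-2's byte-pair encoding) makes the round trip concrete:

```python
class ToyTokenizer:
    """Whitespace tokenizer with a vocabulary grown on first sight of each word."""

    def __init__(self):
        self.token_to_id = {}
        self.id_to_token = []

    def encode(self, text):
        ids = []
        for tok in text.split():
            if tok not in self.token_to_id:
                self.token_to_id[tok] = len(self.id_to_token)
                self.id_to_token.append(tok)
            ids.append(self.token_to_id[tok])
        return ids

    def decode(self, ids):
        return " ".join(self.id_to_token[i] for i in ids)

tok = ToyTokenizer()
ids = tok.encode("she will be loved")
print(ids)              # → [0, 1, 2, 3]
print(tok.decode(ids))  # → "she will be loved"
```

Unlike this sketch, GPT-2's tokenizer uses a fixed, pre-trained subword vocabulary, which is why the fine-tuned model and its tokenizer must be loaded as a matched pair.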
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/maroon-5"], "widget": [{"text": "I am"}]}
|
huggingartists/maroon-5
| null |
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/maroon-5",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/maroon-5 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Maroon 5</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@maroon-5</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Maroon 5.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Maroon 5's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
] |