Dataset schema (column name, dtype, observed value stats):

| Column | Dtype | Values / lengths |
| --- | --- | --- |
| pipeline_tag | stringclasses | 48 values |
| library_name | stringclasses | 198 values |
| text | stringlengths | 1 – 900k |
| metadata | stringlengths | 2 – 438k |
| id | stringlengths | 5 – 122 |
| last_modified | null | always null |
| tags | listlengths | 1 – 1.84k |
| sha | null | always null |
| created_at | stringlengths | 25 – 25 |
| arxiv | listlengths | 0 – 201 |
| languages | listlengths | 0 – 1.83k |
| tags_str | stringlengths | 17 – 9.34k |
| text_str | stringlengths | 0 – 389k |
| text_lists | listlengths | 0 – 722 |
| processed_texts | listlengths | 1 – 723 |
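The records below follow this schema, one model card per row. As a minimal sketch of how such a dump might be loaded with the `datasets` library (the repository id here is a placeholder, not the actual dataset name):

```python
from datasets import load_dataset  # pip install datasets

# Placeholder repository id: substitute the real Hub dataset path.
ds = load_dataset("username/model-cards-dump", split="train")

# Each row holds one model card plus its metadata columns.
row = ds[0]
print(row["id"])            # e.g. "huggingtweets/fardeg1-jaypomeister-shortdaggerdick"
print(row["pipeline_tag"])  # e.g. "text-generation"
print(row["text"][:200])    # first 200 characters of the raw model card
```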
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1324348950145544192/_NgUzqaJ_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1394641363740905478/eNKpHxUd_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1379154157098110977/lajO-om1_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">jaypo & Fardeg & Nial</div> <div style="text-align: center; font-size: 14px;">@fardeg1-jaypomeister-shortdaggerdick</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from jaypo & Fardeg & Nial. | Data | jaypo | Fardeg | Nial | | --- | --- | --- | --- | | Tweets downloaded | 399 | 3130 | 441 | | Retweets | 31 | 392 | 46 | | Short tweets | 168 | 785 | 202 | | Tweets kept | 200 | 1953 | 193 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/f0npandx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fardeg1-jaypomeister-shortdaggerdick's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/v3bv5lt7) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/v3bv5lt7/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fardeg1-jaypomeister-shortdaggerdick') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/fardeg1-jaypomeister-shortdaggerdick/1623707785167/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fardeg1-jaypomeister-shortdaggerdick
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG jaypo & Fardeg & Nial @fardeg1-jaypomeister-shortdaggerdick I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from jaypo & Fardeg & Nial. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fardeg1-jaypomeister-shortdaggerdick's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1400488156345126914/R1JrzEHO_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Farid</div> <div style="text-align: center; font-size: 14px;">@farid_0v</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Farid. | Data | Farid | | --- | --- | | Tweets downloaded | 3222 | | Retweets | 565 | | Short tweets | 338 | | Tweets kept | 2319 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3jw6z4gy/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @farid_0v's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/uplo21dc) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/uplo21dc/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/farid_0v') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/farid_0v/1627279407665/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/farid_0v
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Farid @farid\_0v I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Farid. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @farid\_0v's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1330826999548366848/LjVI40IO_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Cock & Ball Nurture 🤖 AI Bot </div> <div style="font-size: 15px">@fartydoodooman bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@fartydoodooman's tweets](https://twitter.com/fartydoodooman). | Data | Quantity | | --- | --- | | Tweets downloaded | 3237 | | Retweets | 41 | | Short tweets | 710 | | Tweets kept | 2486 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2qujd2zx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fartydoodooman's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/17h7xprc) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/17h7xprc/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fartydoodooman') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/fartydoodooman
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Cock & Ball Nurture AI Bot @fartydoodooman bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @fartydoodooman's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fartydoodooman's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/713653445262237696/mdyVSGoj_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">fastfwd</div> <div style="text-align: center; font-size: 14px;">@fastfwdco</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from fastfwd. | Data | fastfwd | | --- | --- | | Tweets downloaded | 947 | | Retweets | 60 | | Short tweets | 5 | | Tweets kept | 882 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/35uhk2zt/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fastfwdco's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/24nk44tw) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/24nk44tw/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fastfwdco') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/fastfwdco/1633019095463/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fastfwdco
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT fastfwd @fastfwdco I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from fastfwd. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fastfwdco's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1375565484142247936/O4bEMEUL_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Coriander 🤖 AI Bot </div> <div style="font-size: 15px">@fatuisv bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@fatuisv's tweets](https://twitter.com/fatuisv). | Data | Quantity | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 145 | | Short tweets | 1007 | | Tweets kept | 2094 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/gxoztns2/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fatuisv's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3imhaxow) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3imhaxow/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fatuisv') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/fatuisv/1617499521191/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fatuisv
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Coriander AI Bot @fatuisv bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @fatuisv's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fatuisv's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1234692331263016960/7uR-nYW0_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">François Chollet 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@fchollet bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@fchollet's tweets](https://twitter.com/fchollet). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3231</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>682</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>82</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2467</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/2rv5any2/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fchollet's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3ajhtw99) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3ajhtw99/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/fchollet'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/fchollet
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">François Chollet AI Bot </div> <div style="font-size: 15px; color: #657786">@fchollet bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @fchollet's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3231</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>682</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>82</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2467</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @fchollet's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/fchollet'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @fchollet's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3231</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>682</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>82</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2467</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @fchollet's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/fchollet'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @fchollet's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3231</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>682</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>82</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2467</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @fchollet's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/fchollet'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/3294236108/2af3b3e10bf3c1488d84e6c9190f5c05_400x400.jpeg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Fred 🤖 AI Bot </div> <div style="font-size: 15px">@fdgwhite bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@fdgwhite's tweets](https://twitter.com/fdgwhite). | Data | Quantity | | --- | --- | | Tweets downloaded | 241 | | Retweets | 40 | | Short tweets | 21 | | Tweets kept | 180 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2d5jxswv/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fdgwhite's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/zmh9ui27) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/zmh9ui27/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fdgwhite') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/fdgwhite/1613440735468/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fdgwhite
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Fred AI Bot @fdgwhite bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @fdgwhite's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fdgwhite's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1347701526286848000/suIjtTqI_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">FebreezyXD 🤖 AI Bot </div> <div style="font-size: 15px">@febreezyxd bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@febreezyxd's tweets](https://twitter.com/febreezyxd). | Data | Quantity | | --- | --- | | Tweets downloaded | 2579 | | Retweets | 281 | | Short tweets | 633 | | Tweets kept | 1665 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1uhj4h75/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @febreezyxd's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/37c0iqc2) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/37c0iqc2/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/febreezyxd') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/febreezyxd/1614137805621/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/febreezyxd
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
FebreezyXD AI Bot @febreezyxd bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @febreezyxd's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @febreezyxd's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1375086596374941701/W31MndHq_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">fe 🤖 AI Bot </div> <div style="font-size: 15px">@felipe3867 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@felipe3867's tweets](https://twitter.com/felipe3867). | Data | Quantity | | --- | --- | | Tweets downloaded | 3158 | | Retweets | 537 | | Short tweets | 512 | | Tweets kept | 2109 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/31fmna12/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @felipe3867's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1nhj5ov2) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1nhj5ov2/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/felipe3867') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/felipe3867/1616687750762/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/felipe3867
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
fe AI Bot @felipe3867 bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @felipe3867's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @felipe3867's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374932590105395204/VnIg8IKQ_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Felipe Pereira 🤖 AI Bot </div> <div style="font-size: 15px">@felipenpereira bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@felipenpereira's tweets](https://twitter.com/felipenpereira). | Data | Quantity | | --- | --- | | Tweets downloaded | 1725 | | Retweets | 782 | | Short tweets | 90 | | Tweets kept | 853 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/w73n9a8d/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @felipenpereira's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1krn3d14) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1krn3d14/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/felipenpereira') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/felipenpereira/1616698040097/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/felipenpereira
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Felipe Pereira AI Bot @felipenpereira bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @felipenpereira's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @felipenpereira's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1301465003308982273/R8kAG77__400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">liam 🎅🗡🔜 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@femawalmart bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@femawalmart's tweets](https://twitter.com/femawalmart). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3077</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>497</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>620</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1960</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2q9q9o6r/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @femawalmart's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3ogpvjlp) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3ogpvjlp/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/femawalmart'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
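The card above drives the model through the high-level `pipeline` API. As a minimal sketch of the lower-level route (assuming only the standard transformers causal-LM classes; the sampling settings are illustrative, not part of the card), the same checkpoint can be loaded and sampled directly:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned GPT-2 checkpoint and its tokenizer from the Hub.
tokenizer = AutoTokenizer.from_pretrained("huggingtweets/femawalmart")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/femawalmart")

# Encode a prompt and sample one continuation.
inputs = tokenizer("My dream is", return_tensors="pt")
output = model.generate(
    **inputs,
    do_sample=True,                       # sample instead of greedy decoding
    max_new_tokens=40,                    # cap the continuation length
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 defines no pad token
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```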
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/femawalmart/1609143235673/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/femawalmart
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">liam AI Bot </div> <div style="font-size: 15px; color: #657786">@femawalmart bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @femawalmart's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3077</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>497</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>620</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1960</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @femawalmart's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/femawalmart'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @femawalmart's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3077</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>497</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>620</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1960</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @femawalmart's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/femawalmart'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @femawalmart's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3077</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>497</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>620</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1960</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @femawalmart's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/femawalmart'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1355330349623111680/KUgdYM0o_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">femboj zizek 🤖 AI Bot </div> <div style="font-size: 15px">@fembojj bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@fembojj's tweets](https://twitter.com/fembojj). | Data | Quantity | | --- | --- | | Tweets downloaded | 3241 | | Retweets | 86 | | Short tweets | 1064 | | Tweets kept | 2091 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/hzix93pw/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fembojj's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2xgawags) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2xgawags/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fembojj') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/fembojj/1614095493647/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fembojj
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
femboj zizek AI Bot @fembojj bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @fembojj's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fembojj's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1370404573374976005/WyjvD-FA_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Storm | 嵐 🤖 AI Bot </div> <div style="font-size: 15px">@femboympreg bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@femboympreg's tweets](https://twitter.com/femboympreg). | Data | Quantity | | --- | --- | | Tweets downloaded | 3212 | | Retweets | 594 | | Short tweets | 969 | | Tweets kept | 1649 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/30bwh0wo/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @femboympreg's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/8vc73356) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/8vc73356/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/femboympreg') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/femboympreg/1617809081812/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/femboympreg
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Storm | 嵐 AI Bot @femboympreg bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @femboympreg's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @femboympreg's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1569453578493763590/MerXNdrF_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">shitbrain dyke upside down era</div> <div style="text-align: center; font-size: 14px;">@femoidfurry</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from shitbrain dyke upside down era. | Data | shitbrain dyke upside down era | | --- | --- | | Tweets downloaded | 3211 | | Retweets | 1977 | | Short tweets | 106 | | Tweets kept | 1128 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/34ui7fp9/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @femoidfurry's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/177yzikv) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/177yzikv/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/femoidfurry') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/femoidfurry/1666785376927/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/femoidfurry
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT shitbrain dyke upside down era @femoidfurry I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from shitbrain dyke upside down era. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @femoidfurry's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1299864951063019521/bjlvTUMN_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Fernando A. Iglesias 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@feriglesias bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@feriglesias's tweets](https://twitter.com/feriglesias). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3203</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>380</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>465</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2358</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/355taxah/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @feriglesias's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/1unu5cwm) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/1unu5cwm/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/feriglesias'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
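For offline use the checkpoint can be downloaded once and reloaded from a local path. A sketch assuming only the standard `from_pretrained`/`save_pretrained` methods; the directory name is illustrative:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Download once, then write the weights and tokenizer files locally.
model = AutoModelForCausalLM.from_pretrained("huggingtweets/feriglesias")
tokenizer = AutoTokenizer.from_pretrained("huggingtweets/feriglesias")
model.save_pretrained("./feriglesias-local")      # illustrative path
tokenizer.save_pretrained("./feriglesias-local")

# Later, load without touching the network.
model = AutoModelForCausalLM.from_pretrained("./feriglesias-local")
```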
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://res.cloudinary.com/huggingtweets/image/upload/v1600051758/feriglesias.jpg", "widget": [{"text": "My dream is"}]}
huggingtweets/feriglesias
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Fernando A. Iglesias AI Bot </div> <div style="font-size: 15px; color: #657786">@feriglesias bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @feriglesias's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3203</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>380</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>465</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2358</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @feriglesias's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/feriglesias'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @feriglesias's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3203</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>380</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>465</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2358</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @feriglesias's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/feriglesias'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @feriglesias's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3203</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>380</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>465</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2358</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @feriglesias's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/feriglesias'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1172580448662372353/SwJNqDQl_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Fesshole 🧻</div> <div style="text-align: center; font-size: 14px;">@fesshole</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Fesshole 🧻. | Data | Fesshole 🧻 | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 14 | | Short tweets | 1 | | Tweets kept | 3235 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3473th10/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fesshole's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1wz2ncbz) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1wz2ncbz/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fesshole') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/fesshole
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Fesshole @fesshole I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Fesshole . Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fesshole's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1370161206158360579/_G9rCdzT_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Rory Dean ☭ 🤖 AI Bot </div> <div style="font-size: 15px">@feyerabender bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@feyerabender's tweets](https://twitter.com/feyerabender). | Data | Quantity | | --- | --- | | Tweets downloaded | 3195 | | Retweets | 722 | | Short tweets | 363 | | Tweets kept | 2110 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1cjspfal/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @feyerabender's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/17iujs5g) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/17iujs5g/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/feyerabender') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/feyerabender/1616669524008/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/feyerabender
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Rory Dean AI Bot @feyerabender bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @feyerabender's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @feyerabender's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1278360830367674368/SfqcgSVD_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Fidelity Investments 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@fidelity bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@fidelity's tweets](https://twitter.com/fidelity). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3241</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>103</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>1</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>3137</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1ow5lds5/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fidelity's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/30ibmpq1) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/30ibmpq1/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/fidelity'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets) <!--- random size file -->
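Each pipeline call returns a list of dicts whose `generated_text` field holds the raw string, so individual candidates are easy to pull out. A minimal sketch:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/fidelity')
for candidate in generator("My dream is", num_return_sequences=5):
    print(candidate["generated_text"])  # each dict holds one sampled tweet
```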
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/fidelity/1607118440881/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fidelity
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Fidelity Investments AI Bot </div> <div style="font-size: 15px; color: #657786">@fidelity bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @fidelity's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3241</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>103</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>1</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>3137</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @fidelity's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/fidelity'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @fidelity's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3241</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>103</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>1</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>3137</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @fidelity's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/fidelity'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @fidelity's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3241</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>103</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>1</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>3137</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @fidelity's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/fidelity'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1300740232485068800/KpNhyts7_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Fiersa Besari 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@fiersabesari bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@fiersabesari's tweets](https://twitter.com/fiersabesari). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3238</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>32</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>636</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2570</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/11ffqe7z/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fiersabesari's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3q5publ5) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3q5publ5/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/fiersabesari'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
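For reference, the HTML-styled snippet in the card above corresponds to this plain, runnable Python; the model id comes straight from the card, and everything else is the standard `transformers` pipeline API:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint named in the card above.
generator = pipeline('text-generation', model='huggingtweets/fiersabesari')

# Generate five continuations of the card's example prompt.
generator("My dream is", num_return_sequences=5)
```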
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/fiersabesari
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Fiersa Besari AI Bot </div> <div style="font-size: 15px; color: #657786">@fiersabesari bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @fiersabesari's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3238</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>32</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>636</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2570</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @fiersabesari's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/fiersabesari'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @fiersabesari's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3238</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>32</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>636</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2570</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @fiersabesari's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/fiersabesari'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @fiersabesari's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3238</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>32</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>636</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2570</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @fiersabesari's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/fiersabesari'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1270466520859267076/CwFFAx0q_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">FIFER Mods 🤖 AI Bot </div> <div style="font-size: 15px">@fifer_mods bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@fifer_mods's tweets](https://twitter.com/fifer_mods). | Data | Quantity | | --- | --- | | Tweets downloaded | 3249 | | Retweets | 471 | | Short tweets | 660 | | Tweets kept | 2118 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3p1w2iyo/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fifer_mods's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/d0niqoiy) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/d0niqoiy/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fifer_mods') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
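The data table above implies a simple filter: kept = downloaded − retweets − short tweets (3249 − 471 − 660 = 2118). A minimal sketch of that rule follows; the 20-character cutoff for "short tweets" is an assumption, since the card does not state the exact threshold huggingtweets uses:

```python
# Sketch of the filtering the table implies; the min_chars cutoff is an
# assumption, not a documented huggingtweets value.
def keep_tweet(text: str, is_retweet: bool, min_chars: int = 20) -> bool:
    return not is_retweet and len(text) >= min_chars

downloaded, retweets, short_tweets = 3249, 471, 660
kept = downloaded - retweets - short_tweets
assert kept == 2118  # matches the "Tweets kept" row above
```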
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/fifer_mods/1617766950611/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fifer_mods
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
FIFER Mods AI Bot @fifer\_mods bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @fifer\_mods's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fifer\_mods's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1513191641921765388/rToX3RpX_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">15</div> <div style="text-align: center; font-size: 14px;">@fifteenai</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from 15. | Data | 15 | | --- | --- | | Tweets downloaded | 111 | | Retweets | 9 | | Short tweets | 10 | | Tweets kept | 92 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/169wgrhk/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fifteenai's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/390dyi5s) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/390dyi5s/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fifteenai') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/fifteenai/1658549683215/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fifteenai
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT 15 @fifteenai I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from 15. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fifteenai's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/771731027882422272/ysb3KvNr_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Filip Podstavec ⛏ 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@filippodstavec bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@filippodstavec's tweets](https://twitter.com/filippodstavec). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3076</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1232</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>84</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1760</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/jyrecnux/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @filippodstavec's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/14l6h1ca) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/14l6h1ca/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/filippodstavec'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). 
In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
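Sampled generations differ from run to run; for reproducible samples from the snippet above, `transformers` provides `set_seed` (a usage note, not part of the original card):

```python
from transformers import pipeline, set_seed

set_seed(42)  # fix RNG state so the sampled continuations are repeatable
generator = pipeline('text-generation', model='huggingtweets/filippodstavec')
generator("My dream is", num_return_sequences=5)
```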
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/filippodstavec
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Filip Podstavec AI Bot </div> <div style="font-size: 15px; color: #657786">@filippodstavec bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @filippodstavec's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3076</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1232</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>84</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1760</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @filippodstavec's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/filippodstavec'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @filippodstavec's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3076</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1232</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>84</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1760</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @filippodstavec's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/filippodstavec'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @filippodstavec's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3076</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1232</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>84</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1760</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @filippodstavec's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/filippodstavec'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1356115738046717953/9nN4Gj3R_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Filler Username 🤖 AI Bot </div> <div style="font-size: 15px">@filler_username bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@filler_username's tweets](https://twitter.com/filler_username). | Data | Quantity | | --- | --- | | Tweets downloaded | 3187 | | Retweets | 123 | | Short tweets | 827 | | Tweets kept | 2237 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3n0vde62/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @filler_username's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2vmqixu2) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2vmqixu2/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/filler_username') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/filler_username/1617904327234/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/filler_username
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Filler Username AI Bot @filler\_username bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @filler\_username's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @filler\_username's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1222122622307241984/4rIV3vU6_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Alex Riviere 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@fimion bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@fimion's tweets](https://twitter.com/fimion). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3240</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>585</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>459</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2196</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/2dtfbkrf/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fimion's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/31skg71x) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/31skg71x/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/fimion'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/fimion/1602258159865/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fimion
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Alex Riviere AI Bot </div> <div style="font-size: 15px; color: #657786">@fimion bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @fimion's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3240</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>585</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>459</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2196</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @fimion's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/fimion'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @fimion's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3240</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>585</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>459</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2196</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @fimion's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/fimion'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @fimion's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3240</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>585</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>459</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2196</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @fimion's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/fimion'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1397676905403457536/TUd6TAFf_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">☀️Fiona☀️</div> <div style="text-align: center; font-size: 14px;">@fiodeer</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from ☀️Fiona☀️. | Data | ☀️Fiona☀️ | | --- | --- | | Tweets downloaded | 3242 | | Retweets | 462 | | Short tweets | 565 | | Tweets kept | 2215 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3cgmdugf/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fiodeer's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1z9bw9h6) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1z9bw9h6/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fiodeer') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/fiodeer/1624477503382/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fiodeer
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Fiona @fiodeer I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Fiona. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fiodeer's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1371972159418068993/YOAhNp9n_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">oskar is having a moment 🤖 AI Bot </div> <div style="font-size: 15px">@fishbeelamp bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@fishbeelamp's tweets](https://twitter.com/fishbeelamp). | Data | Quantity | | --- | --- | | Tweets downloaded | 1384 | | Retweets | 198 | | Short tweets | 333 | | Tweets kept | 853 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2sbub9s2/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fishbeelamp's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1v7uxmqu) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1v7uxmqu/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fishbeelamp') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/fishbeelamp/1616689100015/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fishbeelamp
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
oskar is having a moment AI Bot @fishbeelamp bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @fishbeelamp's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fishbeelamp's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
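A minimal sketch of the pipeline call referenced in the "How to use" section above, using the model id huggingtweets/fishbeelamp given in the full card:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/fishbeelamp')

# Sample five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```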
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1302368056266625024/DjCeJU-T_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Florian Onur 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@fkuhlmeier bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@fkuhlmeier's tweets](https://twitter.com/fkuhlmeier). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>168</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>96</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>12</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>60</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/yb58yalp/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fkuhlmeier's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/9cqebx43) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/9cqebx43/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/fkuhlmeier'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets) <!--- random size file -->
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/fkuhlmeier/1603890209601/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fkuhlmeier
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Florian Onur AI Bot </div> <div style="font-size: 15px; color: #657786">@fkuhlmeier bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @fkuhlmeier's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>168</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>96</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>12</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>60</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @fkuhlmeier's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/fkuhlmeier'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @fkuhlmeier's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>168</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>96</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>12</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>60</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @fkuhlmeier's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/fkuhlmeier'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @fkuhlmeier's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>168</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>96</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>12</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>60</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @fkuhlmeier's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/fkuhlmeier'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1149796520402784256/VIu-RJTA_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">FlairMax 🤖 AI Bot </div> <div style="font-size: 15px">@flairmaxuwp bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@flairmaxuwp's tweets](https://twitter.com/flairmaxuwp). | Data | Quantity | | --- | --- | | Tweets downloaded | 230 | | Retweets | 33 | | Short tweets | 25 | | Tweets kept | 172 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/v2sbjd88/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @flairmaxuwp's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2tprbf8h) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2tprbf8h/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/flairmaxuwp') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/flairmaxuwp/1617311982893/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/flairmaxuwp
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
FlairMax AI Bot @flairmaxuwp bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @flairmaxuwp's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @flairmaxuwp's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
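The "How to use" section above refers to a pipeline call; per the full card, it is the standard transformers text-generation pipeline with model id huggingtweets/flairmaxuwp:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/flairmaxuwp')

# Generate five candidate completions of the prompt
generator("My dream is", num_return_sequences=5)
```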
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1278450406843125762/f5u_F2ng_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Flatiron School (at 🏡) 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@flatironschool bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@flatironschool's tweets](https://twitter.com/flatironschool). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3202</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1068</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>582</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1552</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/179qzrny/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @flatironschool's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/174rjbb8) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/174rjbb8/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/flatironschool'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). 
In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets) <!--- random size file -->
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/flatironschool/1603341000640/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/flatironschool
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Flatiron School (at ) AI Bot </div> <div style="font-size: 15px; color: #657786">@flatironschool bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @flatironschool's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3202</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1068</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>582</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1552</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @flatironschool's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/flatironschool'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @flatironschool's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3202</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1068</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>582</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1552</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @flatironschool's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/flatironschool'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @flatironschool's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3202</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1068</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>582</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1552</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @flatironschool's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/flatironschool'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1324181168615432193/TW4ddzsh_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">⚾️Method Ma'am⚾️ 🤖 AI Bot </div> <div style="font-size: 15px">@fletcherfidelis bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@fletcherfidelis's tweets](https://twitter.com/fletcherfidelis). | Data | Quantity | | --- | --- | | Tweets downloaded | 2075 | | Retweets | 426 | | Short tweets | 306 | | Tweets kept | 1343 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1dst3vk7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fletcherfidelis's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/eigb7j9r) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/eigb7j9r/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fletcherfidelis') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/fletcherfidelis/1617901836091/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fletcherfidelis
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Method Ma'am AI Bot @fletcherfidelis bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @fletcherfidelis's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fletcherfidelis's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
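A minimal sketch of the generation call referenced above, using the model id huggingtweets/fletcherfidelis given in the full card:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/fletcherfidelis')

# Sample five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```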
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1471692544157405184/P3FUX4w9_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Ali ψ</div> <div style="text-align: center; font-size: 14px;">@flightlessmilfs</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Ali ψ. | Data | Ali ψ | | --- | --- | | Tweets downloaded | 1815 | | Retweets | 642 | | Short tweets | 181 | | Tweets kept | 992 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1yuw97j7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @flightlessmilfs's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/31esgsfh) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/31esgsfh/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/flightlessmilfs') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/flightlessmilfs/1643422380331/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/flightlessmilfs
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Ali ψ @flightlessmilfs I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Ali ψ. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @flightlessmilfs's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
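The pipeline call named in the "How to use" section above, per the full card (model id huggingtweets/flightlessmilfs), can be sketched as:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/flightlessmilfs')

# Return five sampled sequences for the prompt
generator("My dream is", num_return_sequences=5)
```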
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374903267277340679/T2ztG3zQ_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">lucas 🤖 AI Bot </div> <div style="font-size: 15px">@florestantan bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@florestantan's tweets](https://twitter.com/florestantan). | Data | Quantity | | --- | --- | | Tweets downloaded | 3235 | | Retweets | 137 | | Short tweets | 509 | | Tweets kept | 2589 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/35n5ntba/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @florestantan's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1uo0luuy) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1uo0luuy/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/florestantan') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/florestantan/1617209255342/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/florestantan
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
lucas AI Bot @florestantan bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @florestantan's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @florestantan's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
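A minimal sketch of the text-generation pipeline referenced above, with the model id huggingtweets/florestantan taken from the full card:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/florestantan')

# Sample five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```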
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1286445868166582273/lsl6r9tw_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Greg Florez 🤖 AI Bot </div> <div style="font-size: 15px">@florezgregory bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@florezgregory's tweets](https://twitter.com/florezgregory). | Data | Quantity | | --- | --- | | Tweets downloaded | 3136 | | Retweets | 1644 | | Short tweets | 247 | | Tweets kept | 1245 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/h16lorzp/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @florezgregory's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3asfrvve) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3asfrvve/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/florezgregory') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/florezgregory/1616684382614/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/florezgregory
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Greg Florez AI Bot @florezgregory bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @florezgregory's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @florezgregory's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
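The "How to use" section above points at a pipeline call; the full card gives it as the standard transformers text-generation pipeline with model id huggingtweets/florezgregory:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/florezgregory')

# Generate five candidate completions of the prompt
generator("My dream is", num_return_sequences=5)
```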
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1427988024646717441/3WW-7dhn_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">tea and oranges</div> <div style="text-align: center; font-size: 14px;">@floristree92</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from tea and oranges. | Data | tea and oranges | | --- | --- | | Tweets downloaded | 2510 | | Retweets | 1363 | | Short tweets | 109 | | Tweets kept | 1038 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2fuokdip/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @floristree92's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1e0xd79p) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1e0xd79p/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/floristree92') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/floristree92/1639415459410/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/floristree92
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT tea and oranges @floristree92 I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from tea and oranges. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @floristree92's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
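A minimal sketch of the generation call referenced above, using the model id huggingtweets/floristree92 given in the full card:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/floristree92')

# Sample five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```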
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1414421050415329283/SnA_5soV_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">stable lacker</div> <div style="text-align: center; font-size: 14px;">@flower_dommy</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from stable lacker. | Data | stable lacker | | --- | --- | | Tweets downloaded | 1549 | | Retweets | 270 | | Short tweets | 210 | | Tweets kept | 1069 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/301dw1ni/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @flower_dommy's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/kf0leede) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/kf0leede/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/flower_dommy') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/flower_dommy/1632937534684/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/flower_dommy
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT stable lacker @flower\_dommy I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from stable lacker. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @flower\_dommy's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
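The code block referenced above was dropped in this plain-text rendering; the full card for this model gives it as:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/flower_dommy')

# Generate five completions from the widget prompt
generator("My dream is", num_return_sequences=5)
```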
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1414421050415329283/SnA_5soV_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">label stacker</div> <div style="text-align: center; font-size: 14px;">@flower_zaddy</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from label stacker. | Data | label stacker | | --- | --- | | Tweets downloaded | 992 | | Retweets | 209 | | Short tweets | 119 | | Tweets kept | 664 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/qsem7akp/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @flower_zaddy's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/c2jwdb2x) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/c2jwdb2x/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/flower_zaddy') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/flower_zaddy/1627601426529/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/flower_zaddy
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT label stacker @flower\_zaddy I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from label stacker. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @flower\_zaddy's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
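Restored from the full card for this model, since the snippet was stripped from the flattened text:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 for @flower_zaddy from the Hub
generator = pipeline('text-generation', model='huggingtweets/flower_zaddy')

# Sample five continuations of the card's example prompt
generator("My dream is", num_return_sequences=5)
```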
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1346711262869086210/KPshm_gK_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">G a b r i e l - I g l e s i a s</div> <div style="text-align: center; font-size: 14px;">@fluffyguy</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from G a b r i e l - I g l e s i a s. | Data | G a b r i e l - I g l e s i a s | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 264 | | Short tweets | 132 | | Tweets kept | 2850 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/24pz59rj/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fluffyguy's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/36h0hs6l) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/36h0hs6l/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fluffyguy') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/fluffyguy/1631662825404/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fluffyguy
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT G a b r i e l - I g l e s i a s @fluffyguy I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from G a b r i e l - I g l e s i a s. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fluffyguy's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
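The pipeline example this paragraph points to was lost in flattening; the full card shows it as:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 for @fluffyguy from the Hub
generator = pipeline('text-generation', model='huggingtweets/fluffyguy')

# Return five generated sequences for the widget prompt
generator("My dream is", num_return_sequences=5)
```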
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1175884636624510976/KtBI_1GE_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1245550936807874560/j_zCtKSJ_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1175469370975367169/tn1O7RHW_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">nomes foda de dj & nomes de gato & Foda-se Tudo</div> <div style="text-align: center; font-size: 14px;">@fodase_bot-nomesdegato-nomesdj</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from nomes foda de dj & nomes de gato & Foda-se Tudo. | Data | nomes foda de dj | nomes de gato | Foda-se Tudo | | --- | --- | --- | --- | | Tweets downloaded | 3250 | 3209 | 3250 | | Retweets | 7 | 69 | 0 | | Short tweets | 731 | 1710 | 3118 | | Tweets kept | 2512 | 1430 | 132 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2z3mswab/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fodase_bot-nomesdegato-nomesdj's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/25vut5iu) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/25vut5iu/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fodase_bot-nomesdegato-nomesdj') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. 
[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/fodase_bot-nomesdegato-nomesdj/1639503647273/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fodase_bot-nomesdegato-nomesdj
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG nomes foda de dj & nomes de gato & Foda-se Tudo @fodase\_bot-nomesdegato-nomesdj I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from nomes foda de dj & nomes de gato & Foda-se Tudo. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fodase\_bot-nomesdegato-nomesdj's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
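For reference, the generation snippet from the full card for this three-account cyborg model:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 trained on all three accounts
generator = pipeline('text-generation', model='huggingtweets/fodase_bot-nomesdegato-nomesdj')

# Generate five candidate tweets from the example prompt
generator("My dream is", num_return_sequences=5)
```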
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1395089186538115072/oehHqb54_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Food Network</div> <div style="text-align: center; font-size: 14px;">@foodnetwork</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Food Network. | Data | Food Network | | --- | --- | | Tweets downloaded | 3237 | | Retweets | 938 | | Short tweets | 49 | | Tweets kept | 2250 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2x1lok4q/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @foodnetwork's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2yjxdjcm) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2yjxdjcm/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/foodnetwork') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/foodnetwork/1631662887881/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/foodnetwork
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Food Network @foodnetwork I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Food Network. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @foodnetwork's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
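The how-to snippet omitted from this rendering, as given in the full card:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 for @foodnetwork from the Hub
generator = pipeline('text-generation', model='huggingtweets/foodnetwork')

# Produce five completions of the widget prompt
generator("My dream is", num_return_sequences=5)
```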
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/913057066243231744/3pa5pBzl_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Footy Headlines 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@footy_headlines bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@footy_headlines's tweets](https://twitter.com/footy_headlines). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3215</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>20</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>505</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2690</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/35awxvyw/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @footy_headlines's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1tc1ld77) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1tc1ld77/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/footy_headlines'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/footy_headlines/1606774412916/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/footy_headlines
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Footy Headlines AI Bot </div> <div style="font-size: 15px; color: #657786">@footy_headlines bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @footy_headlines's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3215</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>20</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>505</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2690</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @footy_headlines's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/footy_headlines'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @footy_headlines's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3215</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>20</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>505</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2690</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @footy_headlines's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/footy_headlines'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @footy_headlines's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3215</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>20</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>505</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2690</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @footy_headlines's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/footy_headlines'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1340949437397385217/g_G-ZToS_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Andrew Burton 🤖 AI Bot </div> <div style="font-size: 15px">@foraburton bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@foraburton's tweets](https://twitter.com/foraburton). | Data | Quantity | | --- | --- | | Tweets downloaded | 1512 | | Retweets | 276 | | Short tweets | 72 | | Tweets kept | 1164 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/35eak77p/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @foraburton's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/zrto18xz) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/zrto18xz/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/foraburton') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/foraburton/1617034949571/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/foraburton
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Andrew Burton AI Bot @foraburton bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @foraburton's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @foraburton's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
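The text-generation snippet referenced above, restored from the full card for this model:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 for @foraburton from the Hub
generator = pipeline('text-generation', model='huggingtweets/foraburton')

# Generate five sequences seeded with the card's example prompt
generator("My dream is", num_return_sequences=5)
```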
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1426270160311099396/RCvfusRc_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1426556736610226179/6XDFWyJh_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1416488512300503052/FgE6teHE_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">NaN & MaX 🎤 & ivy 🥩🎙️</div> <div style="text-align: center; font-size: 14px;">@formernumber-wmason_iv-wyattmaxon</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from NaN & MaX 🎤 & ivy 🥩🎙️. | Data | NaN | MaX 🎤 | ivy 🥩🎙️ | | --- | --- | --- | --- | | Tweets downloaded | 3250 | 3250 | 3249 | | Retweets | 148 | 420 | 266 | | Short tweets | 507 | 232 | 372 | | Tweets kept | 2595 | 2598 | 2611 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1s1v908g/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @formernumber-wmason_iv-wyattmaxon's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3j3kexu1) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3j3kexu1/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/formernumber-wmason_iv-wyattmaxon') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/formernumber-wmason_iv-wyattmaxon/1629747957743/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/formernumber-wmason_iv-wyattmaxon
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG NaN & MaX & ivy @formernumber-wmason\_iv-wyattmaxon I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from NaN & MaX & ivy. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @formernumber-wmason\_iv-wyattmaxon's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
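The usage example dropped from this flattened card, as it appears in the full card for this cyborg model:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 trained on all three accounts
generator = pipeline('text-generation', model='huggingtweets/formernumber-wmason_iv-wyattmaxon')

# Return five generated tweets for the example prompt
generator("My dream is", num_return_sequences=5)
```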
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1430593525108903940/vrSks7ph_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">NaN</div> <div style="text-align: center; font-size: 14px;">@formernumber</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from NaN. | Data | NaN | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 146 | | Short tweets | 554 | | Tweets kept | 2550 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1cmch3y4/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @formernumber's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2iurxhit) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2iurxhit/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/formernumber') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/formernumber/1630962355855/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/formernumber
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT NaN @formernumber I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from NaN. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @formernumber's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
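The snippet this paragraph refers to, reproduced from the full card:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 for @formernumber from the Hub
generator = pipeline('text-generation', model='huggingtweets/formernumber')

# Generate five completions of the widget prompt
generator("My dream is", num_return_sequences=5)
```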
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1323383004350218240/RGFOPBNJ_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Ray Doraisamy 🤖 AI Bot </div> <div style="font-size: 15px">@forshaper bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@forshaper's tweets](https://twitter.com/forshaper). | Data | Quantity | | --- | --- | | Tweets downloaded | 3241 | | Retweets | 181 | | Short tweets | 413 | | Tweets kept | 2647 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/392kuq3o/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @forshaper's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3askelvq) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3askelvq/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/forshaper') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/forshaper/1616646541286/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/forshaper
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Ray Doraisamy AI Bot @forshaper bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @forshaper's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @forshaper's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
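The code example stripped from this rendering, restored from the full card for this model:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 for @forshaper from the Hub
generator = pipeline('text-generation', model='huggingtweets/forshaper')

# Sample five continuations of the card's example prompt
generator("My dream is", num_return_sequences=5)
```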
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1445910806420344839/Rm_oWBH0_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Pakun/Foxe</div> <div style="text-align: center; font-size: 14px;">@foxehhyz</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Pakun/Foxe. | Data | Pakun/Foxe | | --- | --- | | Tweets downloaded | 3243 | | Retweets | 413 | | Short tweets | 192 | | Tweets kept | 2638 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/urqo8vqu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @foxehhyz's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/38j8w9y5) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/38j8w9y5/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/foxehhyz') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/foxehhyz/1638928181616/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/foxehhyz
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Pakun/Foxe @foxehhyz I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Pakun/Foxe. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @foxehhyz's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
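The code block was stripped when this card was flattened; the runnable snippet from the full card in this record is:

```python
from transformers import pipeline

# Load the GPT-2 checkpoint fine-tuned on @foxehhyz's tweets
generator = pipeline('text-generation', model='huggingtweets/foxehhyz')

# Sample five continuations of the same prompt
generator("My dream is", num_return_sequences=5)
```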
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1397375635845222400/-N68I_0K_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">legally required to</div> <div style="text-align: center; font-size: 14px;">@foxlius</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from legally required to. | Data | legally required to | | --- | --- | | Tweets downloaded | 3224 | | Retweets | 1459 | | Short tweets | 631 | | Tweets kept | 1134 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/h54z72kn/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @foxlius's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/6fffkgwp) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/6fffkgwp/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/foxlius') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/foxlius/1623071923782/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/foxlius
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT legally required to @foxlius I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from legally required to. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @foxlius's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
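For reference, the generation snippet elided from this flattened text (present in the full card above) is:

```python
from transformers import pipeline

# Load the GPT-2 checkpoint fine-tuned on @foxlius's tweets
generator = pipeline('text-generation', model='huggingtweets/foxlius')

# Sample five continuations of the same prompt
generator("My dream is", num_return_sequences=5)
```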
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1459143267673677853/xtIvtfZp_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Fox News</div> <div style="text-align: center; font-size: 14px;">@foxnews</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Fox News. | Data | Fox News | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 84 | | Short tweets | 0 | | Tweets kept | 3166 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3gz4o7tf/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @foxnews's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/10czim3i) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/10czim3i/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/foxnews') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/foxnews/1649192783021/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/foxnews
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Fox News @foxnews I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Fox News. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @foxnews's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
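The flattened text drops the code; the snippet from the full card in this record is:

```python
from transformers import pipeline

# Load the GPT-2 checkpoint fine-tuned on @foxnews's tweets
generator = pipeline('text-generation', model='huggingtweets/foxnews')

# Sample five continuations of the same prompt
generator("My dream is", num_return_sequences=5)
```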
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1402947843351056396/TICIsTPK_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Francisco Foz</div> <div style="text-align: center; font-size: 14px;">@fozfrancisco</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Francisco Foz. | Data | Francisco Foz | | --- | --- | | Tweets downloaded | 118 | | Retweets | 17 | | Short tweets | 25 | | Tweets kept | 76 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1htqvjv1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fozfrancisco's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3283z3u2) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3283z3u2/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fozfrancisco') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/fozfrancisco/1638989498165/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fozfrancisco
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Francisco Foz @fozfrancisco I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Francisco Foz. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fozfrancisco's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
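The code block was stripped when this card was flattened; the runnable snippet from the full card in this record is:

```python
from transformers import pipeline

# Load the GPT-2 checkpoint fine-tuned on @fozfrancisco's tweets
generator = pipeline('text-generation', model='huggingtweets/fozfrancisco')

# Sample five continuations of the same prompt
generator("My dream is", num_return_sequences=5)
```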
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1379071861401780231/cG8XDfAy_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">fr3fou!! 🤖 AI Bot </div> <div style="font-size: 15px">@fr3fou bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@fr3fou's tweets](https://twitter.com/fr3fou). | Data | Quantity | | --- | --- | | Tweets downloaded | 2760 | | Retweets | 1652 | | Short tweets | 377 | | Tweets kept | 731 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1w16ltwm/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fr3fou's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/34cp6cbj) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/34cp6cbj/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fr3fou') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/fr3fou/1617962537530/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fr3fou
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
fr3fou!! AI Bot @fr3fou bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @fr3fou's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fr3fou's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
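For reference, the generation snippet elided from this flattened text (present in the full card above) is:

```python
from transformers import pipeline

# Load the GPT-2 checkpoint fine-tuned on @fr3fou's tweets
generator = pipeline('text-generation', model='huggingtweets/fr3fou')

# Sample five continuations of the same prompt
generator("My dream is", num_return_sequences=5)
```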
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1358283159759167489/6h6CFiXX_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Frankie i'm going to die 🤖 AI Bot </div> <div style="font-size: 15px">@frankietime bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@frankietime's tweets](https://twitter.com/frankietime). | Data | Quantity | | --- | --- | | Tweets downloaded | 1336 | | Retweets | 511 | | Short tweets | 167 | | Tweets kept | 658 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2gnhwo1u/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @frankietime's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2mifzn5p) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2mifzn5p/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/frankietime') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/frankietime/1614097133105/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/frankietime
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Frankie i'm going to die AI Bot @frankietime bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @frankietime's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @frankietime's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
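The flattened text drops the code; the snippet from the full card in this record is:

```python
from transformers import pipeline

# Load the GPT-2 checkpoint fine-tuned on @frankietime's tweets
generator = pipeline('text-generation', model='huggingtweets/frankietime')

# Sample five continuations of the same prompt
generator("My dream is", num_return_sequences=5)
```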
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1326746410243428353/09C_PBPD_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Frank Cabrera 🤖 AI Bot </div> <div style="font-size: 15px">@frankviii bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@frankviii's tweets](https://twitter.com/frankviii). | Data | Quantity | | --- | --- | | Tweets downloaded | 97 | | Retweets | 17 | | Short tweets | 9 | | Tweets kept | 71 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/zidaqanj/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @frankviii's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/z4esfnfx) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/z4esfnfx/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/frankviii') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/frankviii/1616724264151/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/frankviii
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Frank Cabrera AI Bot @frankviii bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @frankviii's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @frankviii's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
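The code block was stripped when this card was flattened; the runnable snippet from the full card in this record is:

```python
from transformers import pipeline

# Load the GPT-2 checkpoint fine-tuned on @frankviii's tweets
generator = pipeline('text-generation', model='huggingtweets/frankviii')

# Sample five continuations of the same prompt
generator("My dream is", num_return_sequences=5)
```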
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1329501028593627140/StRKBYOo_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Chris Frantz 🤖 AI Bot </div> <div style="font-size: 15px">@frantzfries bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@frantzfries's tweets](https://twitter.com/frantzfries). | Data | Quantity | | --- | --- | | Tweets downloaded | 2423 | | Retweets | 260 | | Short tweets | 150 | | Tweets kept | 2013 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1n1iicrq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @frantzfries's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/16qkb131) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/16qkb131/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/frantzfries') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/frantzfries/1617932907170/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/frantzfries
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Chris Frantz AI Bot @frantzfries bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @frantzfries's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @frantzfries's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
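For reference, the generation snippet elided from this flattened text (present in the full card above) is:

```python
from transformers import pipeline

# Load the GPT-2 checkpoint fine-tuned on @frantzfries's tweets
generator = pipeline('text-generation', model='huggingtweets/frantzfries')

# Sample five continuations of the same prompt
generator("My dream is", num_return_sequences=5)
```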
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1340735982727880704/rm7b1jWn_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">OnlyFranxx 🤖 AI Bot </div> <div style="font-size: 15px">@franxxfurt bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@franxxfurt's tweets](https://twitter.com/franxxfurt). | Data | Quantity | | --- | --- | | Tweets downloaded | 3118 | | Retweets | 1317 | | Short tweets | 267 | | Tweets kept | 1534 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1n7v3881/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @franxxfurt's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3ax82159) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3ax82159/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/franxxfurt') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/franxxfurt/1617765541385/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/franxxfurt
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
OnlyFranxx AI Bot @franxxfurt bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @franxxfurt's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @franxxfurt's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
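The flattened text drops the code; the snippet from the full card in this record is:

```python
from transformers import pipeline

# Load the GPT-2 checkpoint fine-tuned on @franxxfurt's tweets
generator = pipeline('text-generation', model='huggingtweets/franxxfurt')

# Sample five continuations of the same prompt
generator("My dream is", num_return_sequences=5)
```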
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1361002916396564485/7QCaJO1o_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Fusti 🤖 AI Bot </div> <div style="font-size: 15px">@fraskungfu bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@fraskungfu's tweets](https://twitter.com/fraskungfu). | Data | Quantity | | --- | --- | | Tweets downloaded | 3197 | | Retweets | 975 | | Short tweets | 736 | | Tweets kept | 1486 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1yg8xrqo/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fraskungfu's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/252l408y) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/252l408y/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fraskungfu') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/fraskungfu/1617920632144/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fraskungfu
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Fusti AI Bot @fraskungfu bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @fraskungfu's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fraskungfu's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
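The code block was stripped when this card was flattened; the runnable snippet from the full card in this record is:

```python
from transformers import pipeline

# Load the GPT-2 checkpoint fine-tuned on @fraskungfu's tweets
generator = pipeline('text-generation', model='huggingtweets/fraskungfu')

# Sample five continuations of the same prompt
generator("My dream is", num_return_sequences=5)
```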
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1155938695662505984/H3RmD4Fq_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/861903051669610496/dvuuio0A_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1362638938549018626/O2jBlckS_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Inspiring Quotes - Be Positive & Motivation & Motivation & Success</div> <div style="text-align: center; font-size: 14px;">@freakytheory-insprepositive-masterythink</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Inspiring Quotes - Be Positive & Motivation & Motivation & Success. | Data | Inspiring Quotes - Be Positive | Motivation | Motivation & Success | | --- | --- | --- | --- | | Tweets downloaded | 3250 | 3233 | 706 | | Retweets | 789 | 13 | 4 | | Short tweets | 2 | 10 | 14 | | Tweets kept | 2459 | 3210 | 688 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3aupxbxm/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @freakytheory-insprepositive-masterythink's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/p03go3pp) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/p03go3pp/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/freakytheory-insprepositive-masterythink') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. 
[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/freakytheory-insprepositive-masterythink/1631276702724/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/freakytheory-insprepositive-masterythink
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG Inspiring Quotes - Be Positive & Motivation & Motivation & Success @freakytheory-insprepositive-masterythink I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Inspiring Quotes - Be Positive & Motivation & Motivation & Success. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @freakytheory-insprepositive-masterythink's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
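For reference, the generation snippet elided from this flattened text (present in the full card above) is:

```python
from transformers import pipeline

# Load the GPT-2 checkpoint fine-tuned on the three combined accounts
generator = pipeline('text-generation', model='huggingtweets/freakytheory-insprepositive-masterythink')

# Sample five continuations of the same prompt
generator("My dream is", num_return_sequences=5)
```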
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1421105879408066565/hBHx-Rvl_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Rica af, she/her 🗽🏳️‍🌈</div> <div style="text-align: center; font-size: 14px;">@fredricksonra</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Rica af, she/her 🗽🏳️‍🌈. | Data | Rica af, she/her 🗽🏳️‍🌈 | | --- | --- | | Tweets downloaded | 3208 | | Retweets | 2893 | | Short tweets | 47 | | Tweets kept | 268 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3k0pcnmp/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fredricksonra's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/123sil9f) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/123sil9f/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fredricksonra') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/fredricksonra/1632796041349/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fredricksonra
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Rica af, she/her @fredricksonra I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Rica af, she/her. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fredricksonra's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
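The flattened text drops the code; the snippet from the full card in this record is:

```python
from transformers import pipeline

# Load the GPT-2 checkpoint fine-tuned on @fredricksonra's tweets
generator = pipeline('text-generation', model='huggingtweets/fredricksonra')

# Sample five continuations of the same prompt
generator("My dream is", num_return_sequences=5)
```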
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1324181613547118592/3Hz_hHDx_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Megan Fritts 🤖 AI Bot </div> <div style="font-size: 15px">@freganmitts bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@freganmitts's tweets](https://twitter.com/freganmitts). | Data | Quantity | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 85 | | Short tweets | 374 | | Tweets kept | 2787 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1ijtbgod/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @freganmitts's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/28vco2qy) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/28vco2qy/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/freganmitts') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/freganmitts/1616724707442/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/freganmitts
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Megan Fritts AI Bot @freganmitts bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @freganmitts's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @freganmitts's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
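A minimal sketch of the text-generation call for this model, matching the snippet in the full card above:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/freganmitts')

# Generate five continuations of the prompt; raise num_return_sequences for more samples
generator("My dream is", num_return_sequences=5)
```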
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1107690887/2010-08-budapest_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Frans 🤖 AI Bot </div> <div style="font-size: 15px">@frenzie bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@frenzie's tweets](https://twitter.com/frenzie). | Data | Quantity | | --- | --- | | Tweets downloaded | 1949 | | Retweets | 187 | | Short tweets | 167 | | Tweets kept | 1595 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2pst9rn9/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @frenzie's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1klwq88y) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1klwq88y/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/frenzie') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/frenzie/1617876740719/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/frenzie
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Frans AI Bot @frenzie bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @frenzie's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @frenzie's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
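A short usage sketch for @frenzie's model, mirroring the pipeline call in the card above:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/frenzie')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```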
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1410804877538869249/sFFdL9zJ_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">acousticConductor (quadrants filled edition!! ♥♦♠)</div> <div style="text-align: center; font-size: 14px;">@frepno_mytoff</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from acousticConductor (quadrants filled edition!! ♥♦♠). | Data | acousticConductor (quadrants filled edition!! ♥♦♠) | | --- | --- | | Tweets downloaded | 3218 | | Retweets | 1944 | | Short tweets | 487 | | Tweets kept | 787 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/aujqwhay/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @frepno_mytoff's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2i5d4dgv) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2i5d4dgv/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/frepno_mytoff') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/frepno_mytoff/1628013500631/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/frepno_mytoff
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT acousticConductor (quadrants filled edition!!) @frepno_mytoff I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from acousticConductor (quadrants filled edition!!). Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @frepno_mytoff's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
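A minimal sketch of the pipeline call for @frepno_mytoff's model, matching the snippet in the full card above:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/frepno_mytoff')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```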
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1412918415703019521/J2TQHTDo_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Evelyn🪶🇰🇵</div> <div style="text-align: center; font-size: 14px;">@freudotheism</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Evelyn🪶🇰🇵. | Data | Evelyn🪶🇰🇵 | | --- | --- | | Tweets downloaded | 3231 | | Retweets | 333 | | Short tweets | 968 | | Tweets kept | 1930 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3rbzyyts/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @freudotheism's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/elt06ed5) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/elt06ed5/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/freudotheism') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/freudotheism/1625867628365/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/freudotheism
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Evelyn🇰🇵 @freudotheism I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Evelyn🇰🇵. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @freudotheism's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
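A short usage sketch for @freudotheism's model, mirroring the pipeline call in the card above:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/freudotheism')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```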
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1366527087255949315/tKFBJBSW_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">𝕱𝕽𝕰𝖄𝕵𝕬 ディア 🤖 AI Bot </div> <div style="font-size: 15px">@freyjihad bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@freyjihad's tweets](https://twitter.com/freyjihad). | Data | Quantity | | --- | --- | | Tweets downloaded | 3235 | | Retweets | 670 | | Short tweets | 534 | | Tweets kept | 2031 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/373eguz3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @freyjihad's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3lo1vdk7) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3lo1vdk7/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/freyjihad') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/freyjihad/1617789162482/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/freyjihad
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
𝕱𝕽𝕰𝖄𝕵𝕬 ディア AI Bot @freyjihad bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @freyjihad's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @freyjihad's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
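A minimal sketch of the text-generation call for @freyjihad's model, matching the snippet in the full card above:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/freyjihad')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```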
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1336810992857210880/3msMJdlg_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/483133814596595713/KOvTKS5s_400x400.jpeg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1389233037393727491/gIo9q6nS_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Karol Wiśniewski & SA Wardega & Sergiusz G.</div> <div style="text-align: center; font-size: 14px;">@friztoja-sawardega-thenitrozyniak</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Karol Wiśniewski & SA Wardega & Sergiusz G.. | Data | Karol Wiśniewski | SA Wardega | Sergiusz G. | | --- | --- | --- | --- | | Tweets downloaded | 271 | 141 | 3249 | | Retweets | 3 | 1 | 23 | | Short tweets | 33 | 32 | 671 | | Tweets kept | 235 | 108 | 2555 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1zlovf5t/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @friztoja-sawardega-thenitrozyniak's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3sy723ri) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3sy723ri/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/friztoja-sawardega-thenitrozyniak') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. 
[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/friztoja-sawardega-thenitrozyniak/1630099755324/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/friztoja-sawardega-thenitrozyniak
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG Karol Wiśniewski & SA Wardega & Sergiusz G. @friztoja-sawardega-thenitrozyniak I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Karol Wiśniewski & SA Wardega & Sergiusz G.. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @friztoja-sawardega-thenitrozyniak's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
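The combined three-account cyborg model is loaded the same way as the single-user models; a minimal sketch matching the card above:

```python
from transformers import pipeline

# One model fine-tuned on the merged tweets of all three accounts
generator = pipeline('text-generation', model='huggingtweets/friztoja-sawardega-thenitrozyniak')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```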
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1424095619061141504/0FhWxHzI_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">frobenis</div> <div style="text-align: center; font-size: 14px;">@frobenis</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from frobenis. | Data | frobenis | | --- | --- | | Tweets downloaded | 245 | | Retweets | 1 | | Short tweets | 62 | | Tweets kept | 182 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1c5hws47/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @frobenis's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3ee5bpsa) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3ee5bpsa/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/frobenis') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/frobenis/1628616938616/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/frobenis
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT frobenis @frobenis I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from frobenis. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @frobenis's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
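A short usage sketch for @frobenis's model, mirroring the pipeline call in the card above:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/frobenis')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```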
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1355266232841351170/8qLOMOZv_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Floppa ethan 🤖 AI Bot </div> <div style="font-size: 15px">@frogethan bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@frogethan's tweets](https://twitter.com/frogethan). | Data | Quantity | | --- | --- | | Tweets downloaded | 3207 | | Retweets | 203 | | Short tweets | 677 | | Tweets kept | 2327 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3u0b7jjl/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @frogethan's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1jqete5m) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1jqete5m/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/frogethan') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/frogethan/1614101371132/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/frogethan
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Floppa ethan AI Bot @frogethan bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @frogethan's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @frogethan's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
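A minimal sketch of the pipeline call for @frogethan's model, matching the snippet in the full card above:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/frogethan')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```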
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1345057374625738753/UORuzXiL_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Froo 🤖 AI Bot </div> <div style="font-size: 15px">@frootcakee bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@frootcakee's tweets](https://twitter.com/frootcakee). | Data | Quantity | | --- | --- | | Tweets downloaded | 3189 | | Retweets | 993 | | Short tweets | 723 | | Tweets kept | 1473 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3l5bwy96/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @frootcakee's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2fx0tm4j) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2fx0tm4j/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/frootcakee') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/frootcakee/1617907737980/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/frootcakee
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Froo AI Bot @frootcakee bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @frootcakee's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @frootcakee's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
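A short usage sketch for @frootcakee's model, mirroring the pipeline call in the card above:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/frootcakee')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```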
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1200867464466317313/_Q24D6X9_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">FM 🤖 AI Bot </div> <div style="font-size: 15px">@ftuuky bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@ftuuky's tweets](https://twitter.com/ftuuky). | Data | Quantity | | --- | --- | | Tweets downloaded | 966 | | Retweets | 342 | | Short tweets | 74 | | Tweets kept | 550 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3ogn4aj0/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ftuuky's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/11u77t7x) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/11u77t7x/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ftuuky') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/ftuuky/1616618051190/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/ftuuky
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
FM AI Bot @ftuuky bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @ftuuky's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @ftuuky's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
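A minimal sketch of the text-generation call for @ftuuky's model, matching the snippet in the full card above:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/ftuuky')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```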
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://abs.twimg.com/sticky/default_profile_images/default_profile_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Bivek 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@fucko_el bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@fucko_el's tweets](https://twitter.com/fucko_el). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>2841</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>761</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>104</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1976</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/249ga3z7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fucko_el's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/ut8q3ybx) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/ut8q3ybx/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/fucko_el'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets) <!--- random size file -->
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/fucko_el/1600841976446/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fucko_el
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Bivek AI Bot </div> <div style="font-size: 15px; color: #657786">@fucko_el bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @fucko_el's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>2841</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>761</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>104</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1976</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @fucko_el's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/fucko_el'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @fucko_el's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>2841</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>761</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>104</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1976</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @fucko_el's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/fucko_el'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @fucko_el's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>2841</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>761</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>104</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1976</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @fucko_el's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/fucko_el'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/909490250149281792/loptFKY0_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Heretic</div> <div style="text-align: center; font-size: 14px;">@fuckthefocus</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Heretic. | Data | Heretic | | --- | --- | | Tweets downloaded | 3113 | | Retweets | 475 | | Short tweets | 396 | | Tweets kept | 2242 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/274nvr6f/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fuckthefocus's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/oevst9bx) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/oevst9bx/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fuckthefocus') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/fuckthefocus/1621363208946/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fuckthefocus
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Heretic @fuckthefocus I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Heretic. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fuckthefocus's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
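The flattened text above omits the code block that appears in the full card for this record; restored here verbatim from that card as a minimal, runnable sketch:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint named in this record
generator = pipeline('text-generation', model='huggingtweets/fuckthefocus')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```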
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1272946288389050368/OtPFPpC7_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Fullbitchscholar 🤖 AI Bot </div> <div style="font-size: 15px">@fullbitchschol1 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@fullbitchschol1's tweets](https://twitter.com/fullbitchschol1). | Data | Quantity | | --- | --- | | Tweets downloaded | 3248 | | Retweets | 20 | | Short tweets | 224 | | Tweets kept | 3004 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1em7u8my/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fullbitchschol1's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2u9ua2kl) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2u9ua2kl/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fullbitchschol1') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/fullbitchschol1/1616889911749/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fullbitchschol1
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Fullbitchscholar AI Bot @fullbitchschol1 bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @fullbitchschol1's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fullbitchschol1's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
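The flattened card above drops its code block; the sketch below restores the pipeline call from the full card earlier in this record. The loop over outputs is an added illustration (the text-generation pipeline returns a list of dicts), not part of the original card:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/fullbitchschol1')

# Each result is a dict with a 'generated_text' key
for output in generator("My dream is", num_return_sequences=5):
    print(output['generated_text'])
```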
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/894956741573525504/YFg6jiNP_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Funny Or Die</div> <div style="text-align: center; font-size: 14px;">@funnyordie</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Funny Or Die. | Data | Funny Or Die | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 237 | | Short tweets | 190 | | Tweets kept | 2823 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/zjkuy05u/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @funnyordie's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2jaeb619) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2jaeb619/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/funnyordie') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/funnyordie
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Funny Or Die @funnyordie I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Funny Or Die. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @funnyordie's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
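The sentence "You can use this model directly with a pipeline for text generation:" above lost its code block in flattening; copied back from the full card for this record:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint for this record's model id
generator = pipeline('text-generation', model='huggingtweets/funnyordie')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```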
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1880783603/avatar_mari-glasses_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Furinkan 🤖 AI Bot </div> <div style="font-size: 15px">@furinkan bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@furinkan's tweets](https://twitter.com/furinkan). | Data | Quantity | | --- | --- | | Tweets downloaded | 3212 | | Retweets | 1642 | | Short tweets | 114 | | Tweets kept | 1456 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/y9ze4kqs/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @furinkan's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2rdo1j34) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2rdo1j34/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/furinkan') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/furinkan/1618066660498/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/furinkan
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Furinkan AI Bot @furinkan bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @furinkan's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @furinkan's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
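Restoring the usage example stripped from the flattened card above; the `set_seed` call is an added illustration for reproducible sampling, not part of the original card:

```python
from transformers import pipeline, set_seed

# Fix the RNG so repeated runs produce the same samples (illustrative addition)
set_seed(42)

generator = pipeline('text-generation', model='huggingtweets/furinkan')
generator("My dream is", num_return_sequences=5)
```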
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1339300525007835137/YpAMPovA_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Micky The Weirdo from Taranto 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@furrymicky bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@furrymicky's tweets](https://twitter.com/furrymicky). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>459</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>14</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>91</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>354</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/109l35nl/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @furrymicky's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/q3uw2fui) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/q3uw2fui/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/furrymicky'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/furrymicky
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Micky The Weirdo from Taranto AI Bot </div> <div style="font-size: 15px; color: #657786">@furrymicky bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @furrymicky's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>459</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>14</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>91</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>354</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @furrymicky's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/furrymicky'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @furrymicky's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>459</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>14</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>91</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>354</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @furrymicky's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/furrymicky'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @furrymicky's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>459</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>14</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>91</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>354</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @furrymicky's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/furrymicky'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1256573356033273856/4iRYlwTb_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">ansq@漫画読みすぎ 🤖 AI Bot </div> <div style="font-size: 15px">@fuurawa bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@fuurawa's tweets](https://twitter.com/fuurawa). | Data | Quantity | | --- | --- | | Tweets downloaded | 1867 | | Retweets | 1276 | | Short tweets | 102 | | Tweets kept | 489 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2q0sdp5o/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fuurawa's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/24t10y8h) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/24t10y8h/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/fuurawa') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/fuurawa/1616936220610/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/fuurawa
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
ansq@漫画読みすぎ AI Bot @fuurawa bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @fuurawa's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @fuurawa's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
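The flattened text above omits the card's code block; restored verbatim from the full card in this record:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint named in this record
generator = pipeline('text-generation', model='huggingtweets/fuurawa')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```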
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1387157870391832578/xWRJkuq__400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Gabriel Boric Font</div> <div style="text-align: center; font-size: 14px;">@gabrielboric</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Gabriel Boric Font. | Data | Gabriel Boric Font | | --- | --- | | Tweets downloaded | 3166 | | Retweets | 1575 | | Short tweets | 261 | | Tweets kept | 1330 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/sgtq44wg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gabrielboric's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/wl4b6qky) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/wl4b6qky/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/gabrielboric') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/gabrielboric/1628117067958/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/gabrielboric
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Gabriel Boric Font @gabrielboric I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Gabriel Boric Font. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @gabrielboric's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
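The flattened card above drops its code block. Besides the card's one-line pipeline call, a hedged lower-level equivalent is sketched below; only the model id 'huggingtweets/gabrielboric' comes from the record, and the generation parameters are illustrative assumptions:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Explicit loading, equivalent to the card's pipeline('text-generation', ...) call
tokenizer = AutoTokenizer.from_pretrained('huggingtweets/gabrielboric')
model = AutoModelForCausalLM.from_pretrained('huggingtweets/gabrielboric')

inputs = tokenizer("My dream is", return_tensors="pt")
# GPT-2 has no pad token; reuse EOS to avoid a warning during generation
outputs = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    num_return_sequences=5,
    pad_token_id=tokenizer.eos_token_id,
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```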
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1312899140615979008/ulnJKPCT_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">ZOZANZI ♤☆♤ VIRAGO 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@gadgetgreen bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@gadgetgreen's tweets](https://twitter.com/gadgetgreen). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3189</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1537</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>215</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1437</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/1f29q7ag/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gadgetgreen's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/1df6ql9u) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/1df6ql9u/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/gadgetgreen'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/gadgetgreen/1602201219260/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/gadgetgreen
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">ZOZANZI VIRAGO AI Bot </div> <div style="font-size: 15px; color: #657786">@gadgetgreen bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @gadgetgreen's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3189</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1537</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>215</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1437</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @gadgetgreen's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/gadgetgreen'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @gadgetgreen's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3189</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1537</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>215</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1437</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @gadgetgreen's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/gadgetgreen'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @gadgetgreen's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3189</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1537</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>215</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1437</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @gadgetgreen's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/gadgetgreen'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1372021004969472002/J07dtn_B_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">GⒶge H Leibman 🤖 AI Bot </div> <div style="font-size: 15px">@gagehleibman bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@gagehleibman's tweets](https://twitter.com/gagehleibman). | Data | Quantity | | --- | --- | | Tweets downloaded | 3117 | | Retweets | 600 | | Short tweets | 486 | | Tweets kept | 2031 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3vjxnqnf/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gagehleibman's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/67jfcjhk) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/67jfcjhk/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/gagehleibman') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/gagehleibman/1616696622775/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/gagehleibman
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
GⒶge H Leibman AI Bot @gagehleibman bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @gagehleibman's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @gagehleibman's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1306714515094921217/cH_rXwuk_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Gail Simone RED HEADED WOMAN NOT BEAR 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@gailsimone bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@gailsimone's tweets](https://twitter.com/gailsimone). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3205</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1400</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>322</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1483</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/1u34kgh5/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gailsimone's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3krfygi5) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3krfygi5/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/gailsimone'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). 
In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

</section>

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

<section class='prose'>
For more details, visit the project repository.
</section>

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
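For convenience, the HTML-styled snippet in the card above corresponds to the following plain Python (identical to the code shown, just without the markup):

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/gailsimone')
generator("My dream is", num_return_sequences=5)
```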
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/gailsimone/1601276450894/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/gailsimone
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Gail Simone RED HEADED WOMAN NOT BEAR AI Bot </div> <div style="font-size: 15px; color: #657786">@gailsimone bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @gailsimone's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3205</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1400</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>322</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1483</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @gailsimone's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/gailsimone'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @gailsimone's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3205</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1400</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>322</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1483</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @gailsimone's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/gailsimone'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @gailsimone's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3205</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1400</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>322</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1483</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @gailsimone's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/gailsimone'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1276957687507496962/zy4w13io_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Gal Shapira 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@galjudo bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@galjudo's tweets](https://twitter.com/galjudo). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3211</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>420</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>653</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2138</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/1iczn33x/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @galjudo's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/14zzhtt9) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/14zzhtt9/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/galjudo'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About

*Built by Boris Dayma*

</section>

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

<section class='prose'>
For more details, visit the project repository.
</section>

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/galjudo/1602233220657/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/galjudo
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Gal Shapira AI Bot </div> <div style="font-size: 15px; color: #657786">@galjudo bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @galjudo's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3211</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>420</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>653</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2138</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @galjudo's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/galjudo'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @galjudo's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3211</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>420</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>653</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2138</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @galjudo's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/galjudo'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @galjudo's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3211</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>420</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>653</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2138</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @galjudo's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/galjudo'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1415310065960198148/w9Yr9mLK_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">gãmbs</div> <div style="text-align: center; font-size: 14px;">@gambsvns</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from gãmbs. | Data | gãmbs | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 86 | | Short tweets | 308 | | Tweets kept | 2852 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2wahjzcj/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gambsvns's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1td3tcaf) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1td3tcaf/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/gambsvns') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/gambsvns/1626385842515/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/gambsvns
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT gãmbs @gambsvns I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from gãmbs. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @gambsvns's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/999953713958739968/NQspJe-0_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Repulse | Iragon is on Kickstarter!</div> <div style="text-align: center; font-size: 14px;">@gamerepulse</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Repulse | Iragon is on Kickstarter!. | Data | Repulse | Iragon is on Kickstarter! | | --- | --- | | Tweets downloaded | 510 | | Retweets | 166 | | Short tweets | 23 | | Tweets kept | 321 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3dqejmdb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gamerepulse's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/czq1aton) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/czq1aton/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/gamerepulse') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/gamerepulse/1637857655050/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/gamerepulse
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;URL </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> AI BOT </div> <div style="text-align: center; font-size: 16px; font-weight: 800">Repulse | Iragon is on Kickstarter!</div> <div style="text-align: center; font-size: 14px;">@gamerepulse</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on tweets from Repulse | Iragon is on Kickstarter!. | Data | Repulse | Iragon is on Kickstarter! | | --- | --- | | Tweets downloaded | 510 | | Retweets | 166 | | Short tweets | 23 | | Tweets kept | 321 | Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @gamerepulse's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ## Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on tweets from Repulse | Iragon is on Kickstarter!.\n\n| Data | Repulse | Iragon is on Kickstarter! |\n| --- | --- |\n| Tweets downloaded | 510 |\n| Retweets | 166 |\n| Short tweets | 23 |\n| Tweets kept | 321 |\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @gamerepulse's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## How to use\n\nYou can use this model directly with a pipeline for text generation:", "## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n![Follow](URL\n\nFor more details, visit the project repository.\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on tweets from Repulse | Iragon is on Kickstarter!.\n\n| Data | Repulse | Iragon is on Kickstarter! |\n| --- | --- |\n| Tweets downloaded | 510 |\n| Retweets | 166 |\n| Short tweets | 23 |\n| Tweets kept | 321 |\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @gamerepulse's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## How to use\n\nYou can use this model directly with a pipeline for text generation:", "## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n![Follow](URL\n\nFor more details, visit the project repository.\n\n![GitHub stars](URL" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1410165726506274819/4HVcR7Es_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Gandalf the White (Thulêan Perspective)</div> <div style="text-align: center; font-size: 14px;">@gandalfthewhi19</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Gandalf the White (Thulêan Perspective). | Data | Gandalf the White (Thulêan Perspective) | | --- | --- | | Tweets downloaded | 3244 | | Retweets | 431 | | Short tweets | 225 | | Tweets kept | 2588 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1r47j719/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gandalfthewhi19's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/u6nhe6ef) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/u6nhe6ef/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/gandalfthewhi19') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/gandalfthewhi19/1645099160912/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/gandalfthewhi19
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Gandalf the White (Thulêan Perspective) @gandalfthewhi19 I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Gandalf the White (Thulêan Perspective). Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @gandalfthewhi19's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1326680694370734082/wjLz-oO4_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Gary Short</div> <div style="text-align: center; font-size: 14px;">@garyshort</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Gary Short. | Data | Gary Short | | --- | --- | | Tweets downloaded | 3248 | | Retweets | 94 | | Short tweets | 321 | | Tweets kept | 2833 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2vtmlhlj/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @garyshort's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2pfbf1ys) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2pfbf1ys/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/garyshort') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/garyshort/1647971079915/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/garyshort
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Gary Short @garyshort I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Gary Short. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @garyshort's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/378800000365450982/53f25bdef0dd40bf20b58df314a94770_400x400.jpeg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Gaston Gordillo 🤖 AI Bot </div> <div style="font-size: 15px">@gaston_gordillo bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@gaston_gordillo's tweets](https://twitter.com/gaston_gordillo). | Data | Quantity | | --- | --- | | Tweets downloaded | 705 | | Retweets | 524 | | Short tweets | 5 | | Tweets kept | 176 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3kme4rls/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gaston_gordillo's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/6zu3yfw0) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/6zu3yfw0/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/gaston_gordillo') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/gaston_gordillo/1617249460228/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/gaston_gordillo
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Gaston Gordillo AI Bot @gaston\_gordillo bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @gaston\_gordillo's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @gaston\_gordillo's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1234322984183226369/3KzZ3P1J_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">gatcha 🤖 AI Bot </div> <div style="font-size: 15px">@gatchabot bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@gatchabot's tweets](https://twitter.com/gatchabot). | Data | Quantity | | --- | --- | | Tweets downloaded | 2200 | | Retweets | 1728 | | Short tweets | 121 | | Tweets kept | 351 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3qhi9616/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gatchabot's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1o3eonr9) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1o3eonr9/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/gatchabot') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/gatchabot
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
gatcha AI Bot @gatchabot bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @gatchabot's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @gatchabot's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
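For reference, the generation snippet elided from this flattened field appears verbatim in the full card above:

```python
from transformers import pipeline

# load the fine-tuned GPT-2 through the text-generation pipeline
generator = pipeline('text-generation', model='huggingtweets/gatchabot')
generator("My dream is", num_return_sequences=5)
```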
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1610957642616406016/YPHe6yn-_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">ᕙ(‾̀◡‾́)ᕗ g</div> <div style="text-align: center; font-size: 14px;">@gaucheian</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from ᕙ(‾̀◡‾́)ᕗ g. | Data | ᕙ(‾̀◡‾́)ᕗ g | | --- | --- | | Tweets downloaded | 2213 | | Retweets | 92 | | Short tweets | 279 | | Tweets kept | 1842 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/x1bx2fez/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gaucheian's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/0i3i22al) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/0i3i22al/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/gaucheian') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/gaucheian
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT ᕙ(‾̀◡‾́)ᕗ g @gaucheian I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from ᕙ(‾̀◡‾́)ᕗ g. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @gaucheian's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
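The snippet referenced by this flattened field, as given in the full card for this record:

```python
from transformers import pipeline

# load the fine-tuned GPT-2 through the text-generation pipeline
generator = pipeline('text-generation', model='huggingtweets/gaucheian')
generator("My dream is", num_return_sequences=5)
```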
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1362440200304041986/nLi9iMVI_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Gavi Begtrup</div> <div style="text-align: center; font-size: 14px;">@gavibegtrup</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Gavi Begtrup. | Data | Gavi Begtrup | | --- | --- | | Tweets downloaded | 990 | | Retweets | 67 | | Short tweets | 49 | | Tweets kept | 874 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1kx48u2r/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gavibegtrup's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1n9nuiku) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1n9nuiku/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/gavibegtrup') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/gavibegtrup/1622127344791/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/gavibegtrup
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Gavi Begtrup @gavibegtrup I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Gavi Begtrup. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @gavibegtrup's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
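The usage code stripped from this flattened field, per the full card above:

```python
from transformers import pipeline

# load the fine-tuned GPT-2 through the text-generation pipeline
generator = pipeline('text-generation', model='huggingtweets/gavibegtrup')
generator("My dream is", num_return_sequences=5)
```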
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1347975597134385152/zABvUQAs_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">sam 🤖 AI Bot </div> <div style="font-size: 15px">@gayandonline bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@gayandonline's tweets](https://twitter.com/gayandonline). | Data | Quantity | | --- | --- | | Tweets downloaded | 3002 | | Retweets | 290 | | Short tweets | 293 | | Tweets kept | 2419 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3963etnb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gayandonline's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/146uc4xj) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/146uc4xj/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/gayandonline') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/gayandonline/1617808083660/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/gayandonline
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
sam AI Bot @gayandonline bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @gayandonline's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @gayandonline's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
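For reference, the generation snippet elided here appears verbatim in the full card above:

```python
from transformers import pipeline

# load the fine-tuned GPT-2 through the text-generation pipeline
generator = pipeline('text-generation', model='huggingtweets/gayandonline')
generator("My dream is", num_return_sequences=5)
```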
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1361159012990013445/rVk0X1DL_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">alphonse⛓️🦇🌹 🤖 AI Bot </div> <div style="font-size: 15px">@gaybats1999 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@gaybats1999's tweets](https://twitter.com/gaybats1999). | Data | Quantity | | --- | --- | | Tweets downloaded | 2783 | | Retweets | 999 | | Short tweets | 225 | | Tweets kept | 1559 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/39y8clnw/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gaybats1999's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2mzsqlq3) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2mzsqlq3/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/gaybats1999') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/gaybats1999/1614135497450/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/gaybats1999
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
alphonse AI Bot @gaybats1999 bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @gaybats1999's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @gaybats1999's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
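The snippet this flattened field refers to, taken from the full card for this record:

```python
from transformers import pipeline

# load the fine-tuned GPT-2 through the text-generation pipeline
generator = pipeline('text-generation', model='huggingtweets/gaybats1999')
generator("My dream is", num_return_sequences=5)
```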
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1354923690631323652/MZgzGX3P_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">lisa 🏳️‍⚧️ 🤖 AI Bot </div> <div style="font-size: 15px">@gaydeerinc bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@gaydeerinc's tweets](https://twitter.com/gaydeerinc). | Data | Quantity | | --- | --- | | Tweets downloaded | 3214 | | Retweets | 1108 | | Short tweets | 310 | | Tweets kept | 1796 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2nsi7oic/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gaydeerinc's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3gqx2ecq) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3gqx2ecq/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/gaydeerinc') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/gaydeerinc/1614165768951/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/gaydeerinc
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
lisa AI Bot @gaydeerinc bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @gaydeerinc's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @gaydeerinc's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
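For reference, the generation code elided here, as shown in the full card above:

```python
from transformers import pipeline

# load the fine-tuned GPT-2 through the text-generation pipeline
generator = pipeline('text-generation', model='huggingtweets/gaydeerinc')
generator("My dream is", num_return_sequences=5)
```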
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1329364120311828487/0VjzWPsR_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">rokos basilisk construction advocate 🤖 AI Bot </div> <div style="font-size: 15px">@gayguynewsnet bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@gayguynewsnet's tweets](https://twitter.com/gayguynewsnet). | Data | Quantity | | --- | --- | | Tweets downloaded | 592 | | Retweets | 146 | | Short tweets | 64 | | Tweets kept | 382 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2yoxivok/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gayguynewsnet's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3dalf0je) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3dalf0je/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/gayguynewsnet') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/gayguynewsnet/1618199553249/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/gayguynewsnet
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
rokos basilisk construction advocate AI Bot @gayguynewsnet bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @gayguynewsnet's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @gayguynewsnet's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
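The snippet stripped from this flattened field, per the full card for this record:

```python
from transformers import pipeline

# load the fine-tuned GPT-2 through the text-generation pipeline
generator = pipeline('text-generation', model='huggingtweets/gayguynewsnet')
generator("My dream is", num_return_sequences=5)
```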
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1350934729525178370/VHPqhIcr_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">value bastard 🤖 AI Bot </div> <div style="font-size: 15px">@gaypizzaboy bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@gaypizzaboy's tweets](https://twitter.com/gaypizzaboy). | Data | Quantity | | --- | --- | | Tweets downloaded | 3138 | | Retweets | 1548 | | Short tweets | 147 | | Tweets kept | 1443 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1vxwbfva/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gaypizzaboy's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2t6winba) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2t6winba/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/gaypizzaboy') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/gaypizzaboy/1614169105934/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/gaypizzaboy
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
value bastard AI Bot @gaypizzaboy bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @gaypizzaboy's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @gaypizzaboy's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
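The usage code this flattened field refers to, taken verbatim from the full card above:

```python
from transformers import pipeline

# load the fine-tuned GPT-2 through the text-generation pipeline
generator = pipeline('text-generation', model='huggingtweets/gaypizzaboy')
generator("My dream is", num_return_sequences=5)
```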
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1428482513417105413/TGlo7HWH_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">العلجوم</div> <div style="text-align: center; font-size: 14px;">@gaytoad2</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from العلجوم. | Data | العلجوم | | --- | --- | | Tweets downloaded | 3232 | | Retweets | 379 | | Short tweets | 1023 | | Tweets kept | 1830 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2w8lap6f/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gaytoad2's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/34u34diu) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/34u34diu/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/gaytoad2') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/gaytoad2/1629434767014/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/gaytoad2
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT العلجوم @gaytoad2 I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from العلجوم. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @gaytoad2's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
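For reference, the generation snippet elided here appears verbatim in the full card above:

```python
from transformers import pipeline

# load the fine-tuned GPT-2 through the text-generation pipeline
generator = pipeline('text-generation', model='huggingtweets/gaytoad2')
generator("My dream is", num_return_sequences=5)
```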
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1349844043291889671/yfQAojJv_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Gender Critical Argument Bot 🤖 AI Bot </div> <div style="font-size: 15px">@gcargumentbot bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@gcargumentbot's tweets](https://twitter.com/gcargumentbot). | Data | Quantity | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 1 | | Short tweets | 223 | | Tweets kept | 3026 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/18f76f7w/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gcargumentbot's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2lzgykty) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2lzgykty/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/gcargumentbot') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/gcargumentbot/1616766934700/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/gcargumentbot
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Gender Critical Argument Bot AI Bot @gcargumentbot bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @gcargumentbot's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @gcargumentbot's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
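The snippet referenced by this flattened field, as given in the full card for this record:

```python
from transformers import pipeline

# load the fine-tuned GPT-2 through the text-generation pipeline
generator = pipeline('text-generation', model='huggingtweets/gcargumentbot')
generator("My dream is", num_return_sequences=5)
```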
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1377855294655623169/hlahDP3v_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">craz 🗿 🤖 AI Bot </div> <div style="font-size: 15px">@geckogirl0 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@geckogirl0's tweets](https://twitter.com/geckogirl0). | Data | Quantity | | --- | --- | | Tweets downloaded | 3132 | | Retweets | 1313 | | Short tweets | 225 | | Tweets kept | 1594 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2qub6zq7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @geckogirl0's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1wc0a99s) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1wc0a99s/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/geckogirl0') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/geckogirl0/1617784269558/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/geckogirl0
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
craz AI Bot @geckogirl0 bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @geckogirl0's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @geckogirl0's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
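The usage code stripped from this flattened field, per the full card above:

```python
from transformers import pipeline

# load the fine-tuned GPT-2 through the text-generation pipeline
generator = pipeline('text-generation', model='huggingtweets/geckogirl0')
generator("My dream is", num_return_sequences=5)
```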
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1309828363385622529/7xxDa_4j_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">100 gecs hater 🤖 AI Bot </div> <div style="font-size: 15px">@gecshater bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@gecshater's tweets](https://twitter.com/gecshater). | Data | Quantity | | --- | --- | | Tweets downloaded | 3238 | | Retweets | 67 | | Short tweets | 550 | | Tweets kept | 2621 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1zp0k65t/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gecshater's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/13yufu4u) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/13yufu4u/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/gecshater') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/gecshater/1617797159320/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/gecshater
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
100 gecs hater AI Bot @gecshater bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @gecshater's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @gecshater's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
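For reference, the generation snippet elided here appears verbatim in the full card above:

```python
from transformers import pipeline

# load the fine-tuned GPT-2 through the text-generation pipeline
generator = pipeline('text-generation', model='huggingtweets/gecshater')
generator("My dream is", num_return_sequences=5)
```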
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/517689241812627456/pyBGyEo__400x400.jpeg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Lutz Büch 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@geilehirnbude bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@geilehirnbude's tweets](https://twitter.com/geilehirnbude). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3116</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>2906</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>53</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>157</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/wt8lffrr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @geilehirnbude's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/1u8augcw) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/1u8augcw/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/geilehirnbude'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/geilehirnbude
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Lutz Büch AI Bot </div> <div style="font-size: 15px; color: #657786">@geilehirnbude bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @geilehirnbude's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3116</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>2906</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>53</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>157</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @geilehirnbude's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/geilehirnbude'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @geilehirnbude's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3116</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>2906</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>53</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>157</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @geilehirnbude's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/geilehirnbude'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @geilehirnbude's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3116</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>2906</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>53</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>157</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @geilehirnbude's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/geilehirnbude'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1403413998436032514/QdAbQHYm_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">GEEGA ギガ 🔝</div> <div style="text-align: center; font-size: 14px;">@generalgeega</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from GEEGA ギガ 🔝. | Data | GEEGA ギガ 🔝 | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 127 | | Short tweets | 1477 | | Tweets kept | 1646 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2owkgdxf/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @generalgeega's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/21lavo70) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/21lavo70/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/generalgeega') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/generalgeega/1624741487901/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/generalgeega
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT GEEGA ギガ @generalgeega I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from GEEGA ギガ . Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @generalgeega's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1366107880483557377/V58xvEUv_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Genji 🤖 AI Bot </div>
<div style="font-size: 15px">@genjitoday bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@genjitoday's tweets](https://twitter.com/genjitoday).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 515 |
| Retweets | 30 |
| Short tweets | 72 |
| Tweets kept | 413 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2t88j5a6/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @genjitoday's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1uhl7b30) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1uhl7b30/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/genjitoday')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
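One note on the usage example above: generation is stochastic, so repeated calls give different outputs. If you need reproducible samples, seeding the random number generators is one option; a minimal sketch using `transformers.set_seed` (seeding is not covered by the card itself):

```python
from transformers import pipeline, set_seed

set_seed(42)  # fixes the Python, NumPy and PyTorch RNGs for repeatable sampling
generator = pipeline('text-generation', model='huggingtweets/genjitoday')
generator("My dream is", num_return_sequences=5)
```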
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/genjitoday/1617772086820/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/genjitoday
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Genji AI Bot @genjitoday bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @genjitoday's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @genjitoday's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1357875471199969286/yqSYSz1G_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Fishorse • Big Baldsuya 🤖 AI Bot </div>
<div style="font-size: 15px">@gentlefishorse bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@gentlefishorse's tweets](https://twitter.com/gentlefishorse).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3142 |
| Retweets | 1903 |
| Short tweets | 159 |
| Tweets kept | 1080 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2h9g07c3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gentlefishorse's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/ao0ru7g8) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/ao0ru7g8/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/gentlefishorse')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
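As an addendum to the usage example above: the pipeline also accepts a list of prompts, returning one result list per prompt. A minimal sketch; the second prompt is an illustrative assumption, not taken from the card:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/gentlefishorse')

# With a list input, the pipeline returns a list of result lists,
# one per prompt, each holding num_return_sequences generations.
prompts = ["My dream is", "Today I learned"]
for results in generator(prompts, num_return_sequences=2, max_length=40):
    for generated in results:
        print(generated['generated_text'])
```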
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/gentlefishorse/1614214431723/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/gentlefishorse
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Fishorse • Big Baldsuya AI Bot @gentlefishorse bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @gentlefishorse's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @gentlefishorse's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/867463880582340608/b2CozYM-_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Suresh Venkatasubramanian 🤖 AI Bot </div>
<div style="font-size: 15px; color: #657786">@geomblog bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@geomblog's tweets](https://twitter.com/geomblog).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3199 |
| Retweets | 1301 |
| Short tweets | 178 |
| Tweets kept | 1720 |

[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/5jk973vf/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @geomblog's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3edsvd65) for full transparency and reproducibility.

At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3edsvd65/artifacts) is logged and versioned.

## Intended uses & limitations

### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/geomblog')
generator("My dream is", num_return_sequences=5)
```

### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.
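GPT-2 defines no padding token, so the pipeline may warn about `pad_token_id` while generating. A common workaround is to reuse the end-of-sequence token as padding; a minimal sketch of that idiom (an assumption about the usual GPT-2 setup, not something this card specifies):

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/geomblog')

# Reuse the EOS token as padding to silence the pad_token_id warning.
generator("My dream is",
          num_return_sequences=5,
          pad_token_id=generator.tokenizer.eos_token_id)
```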
## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/geomblog/1600332316026/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/geomblog
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Suresh Venkatasubramanian AI Bot </div> <div style="font-size: 15px; color: #657786">@geomblog bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @geomblog's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3199</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1301</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>178</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1720</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @geomblog's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/geomblog'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @geomblog's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3199</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1301</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>178</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1720</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @geomblog's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/geomblog'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @geomblog's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3199</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1301</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>178</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1720</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @geomblog's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/geomblog'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]