| column | type | min | max |
| --- | --- | --- | --- |
| modelId | string (length) | 5 | 139 |
| author | string (length) | 2 | 42 |
| last_modified | timestamp[us, tz=UTC] | 2020-02-15 11:33:14 | 2025-09-12 18:33:19 |
| downloads | int64 | 0 | 223M |
| likes | int64 | 0 | 11.7k |
| library_name | string (555 classes) | | |
| tags | list (length) | 1 | 4.05k |
| pipeline_tag | string (55 classes) | | |
| createdAt | timestamp[us, tz=UTC] | 2022-03-02 23:29:04 | 2025-09-12 18:33:14 |
| card | string (length) | 11 | 1.01M |
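A minimal sketch of browsing rows with the schema above, assuming the dump has been exported to a Parquet file; the filename below is hypothetical, not a path given in this document.

```python
# Minimal sketch: browse rows matching the schema table above, assuming
# the dump was exported to Parquet. "model_cards.parquet" is hypothetical.
import pandas as pd

df = pd.read_parquet("model_cards.parquet")
recent = df.sort_values("last_modified", ascending=False)
print(recent[["modelId", "downloads", "likes", "pipeline_tag"]].head())
```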
modelId: huggingtweets/fchollet
author: huggingtweets
last_modified: 2021-05-22T03:56:57Z
downloads: 5
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
pipeline_tag: text-generation
createdAt: 2022-03-02T23:29:05Z
card:
---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---

![avatar](http://pbs.twimg.com/profile_images/1234692331263016960/7uR-nYW0_400x400.jpg)

**François Chollet 🤖 AI Bot** (@fchollet bot)

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@fchollet's tweets](https://twitter.com/fchollet).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3231 |
| Retweets | 682 |
| Short tweets | 82 |
| Tweets kept | 2467 |

[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/2rv5any2/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fchollet's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3ajhtw99) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3ajhtw99/artifacts) is logged and versioned.
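The "Training data" counts above come from a simple filter over the downloaded tweets. A rough sketch of that filtering; the text field layout and the short-tweet threshold are assumptions, not the project's actual code:

```python
# Rough sketch of the filtering the "Training data" table describes:
# drop retweets and short tweets, keep the rest. The "RT @" test and the
# length threshold are assumptions, not huggingtweets' actual code.
def filter_tweets(tweets, min_chars=20):
    kept = []
    for text in tweets:
        if text.startswith("RT @"):   # retweet
            continue
        if len(text) < min_chars:     # short tweet
            continue
        kept.append(text)
    return kept
```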
## Intended uses & limitations

### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/fchollet')
generator("My dream is", num_return_sequences=5)
```

### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
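The cards describe the training procedure only in prose. A minimal fine-tuning sketch under stated assumptions: the kept tweets are written to a plain-text file, and the hyperparameters are placeholders, since the real values live in the linked W&B run.

```python
# Minimal fine-tuning sketch. Assumes tweets were written to "tweets.txt";
# epochs and other hyperparameters are placeholders, not the W&B values.
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, TextDataset,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
dataset = TextDataset(tokenizer=tokenizer, file_path="tweets.txt", block_size=128)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=4),  # placeholders
    data_collator=collator,
    train_dataset=dataset,
)
trainer.train()
```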
modelId: huggingtweets/fatuisv
author: huggingtweets
last_modified: 2021-05-22T03:55:50Z
downloads: 5
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
pipeline_tag: text-generation
createdAt: 2022-03-02T23:29:05Z
card:
---
language: en
thumbnail: https://www.huggingtweets.com/fatuisv/1617499521191/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

![avatar](https://pbs.twimg.com/profile_images/1375565484142247936/O4bEMEUL_400x400.jpg)

**Coriander 🤖 AI Bot** (@fatuisv bot)

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@fatuisv's tweets](https://twitter.com/fatuisv).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3246 |
| Retweets | 145 |
| Short tweets | 1007 |
| Tweets kept | 2094 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/gxoztns2/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fatuisv's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3imhaxow) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3imhaxow/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/fatuisv')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
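The pipeline snippet in these cards leaves sampling behavior at library defaults. A minimal sketch of the equivalent direct call with explicit sampling parameters; the parameter values are assumptions, not the model's configured defaults.

```python
# Minimal sketch: same generation as the pipeline snippet, but with
# explicit (assumed) sampling parameters via the direct model API.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("huggingtweets/fatuisv")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/fatuisv")

inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(**inputs, do_sample=True, top_p=0.95, max_length=50,
                         num_return_sequences=5,
                         pad_token_id=tokenizer.eos_token_id)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```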
modelId: huggingtweets/fartydoodooman
author: huggingtweets
last_modified: 2021-05-22T03:54:38Z
downloads: 4
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
pipeline_tag: text-generation
createdAt: 2022-03-02T23:29:05Z
card:
---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---

![avatar](https://pbs.twimg.com/profile_images/1330826999548366848/LjVI40IO_400x400.jpg)

**Cock & Ball Nurture 🤖 AI Bot** (@fartydoodooman bot)

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@fartydoodooman's tweets](https://twitter.com/fartydoodooman).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3237 |
| Retweets | 41 |
| Short tweets | 710 |
| Tweets kept | 2486 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2qujd2zx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fartydoodooman's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/17h7xprc) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/17h7xprc/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/fartydoodooman')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
modelId: huggingtweets/fakegirl501
author: huggingtweets
last_modified: 2021-05-22T03:50:03Z
downloads: 5
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
pipeline_tag: text-generation
createdAt: 2022-03-02T23:29:05Z
card:
---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---

![avatar](https://pbs.twimg.com/profile_images/1359781238626787329/gplqQPUC_400x400.jpg)

**゚+*:;;:nᏆFYgvgc๓Ꭿjjutdxryjvnlohcxseᕍᖴᖇ๓ioᎾKᏴ๏ᎯᏔtr 🤖 AI Bot** (@fakegirl501 bot)

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@fakegirl501's tweets](https://twitter.com/fakegirl501).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3239 |
| Retweets | 0 |
| Short tweets | 9 |
| Tweets kept | 3230 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/26mq2690/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @fakegirl501's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/25jro0gs) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/25jro0gs/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/fakegirl501')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
modelId: huggingtweets/failboat103
author: huggingtweets
last_modified: 2021-05-22T03:48:35Z
downloads: 6
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
pipeline_tag: text-generation
createdAt: 2022-03-02T23:29:05Z
card:
---
language: en
thumbnail: https://www.huggingtweets.com/failboat103/1615175501983/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

![avatar](https://pbs.twimg.com/profile_images/1326257325250932737/dc_DqeXe_400x400.jpg)

**Daniel on Earth 🤖 AI Bot** (@failboat103 bot)

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@failboat103's tweets](https://twitter.com/failboat103).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3245 |
| Retweets | 273 |
| Short tweets | 589 |
| Tweets kept | 2383 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1lwsze29/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @failboat103's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1y7f99jd) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1y7f99jd/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/failboat103')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
modelId: huggingtweets/facebook
author: huggingtweets
last_modified: 2021-05-22T03:46:37Z
downloads: 5
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
pipeline_tag: text-generation
createdAt: 2022-03-02T23:29:05Z
card:
---
language: en
thumbnail: https://www.huggingtweets.com/facebook/1609715229852/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

![avatar](https://pbs.twimg.com/profile_images/1305899202703048705/bMHePt35_400x400.png)

**Facebook 🤖 AI Bot** (@facebook bot)

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@facebook's tweets](https://twitter.com/facebook).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3227 |
| Retweets | 165 |
| Short tweets | 22 |
| Tweets kept | 3040 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/11ylirf6/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @facebook's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/13i0hel1) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/13i0hel1/artifacts) is logged and versioned.

## Intended uses & limitations

### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/facebook')
generator("My dream is", num_return_sequences=5)
```

### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
modelId: huggingtweets/ezraklein
author: huggingtweets
last_modified: 2021-05-22T03:45:31Z
downloads: 6
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
pipeline_tag: text-generation
createdAt: 2022-03-02T23:29:05Z
card:
---
language: en
thumbnail: https://www.huggingtweets.com/ezraklein/1602197132592/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

![avatar](https://pbs.twimg.com/profile_images/1205000761521819648/JAI78T4j_400x400.jpg)

**Ezra Klein 🤖 AI Bot** (@ezraklein bot)

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@ezraklein's tweets](https://twitter.com/ezraklein).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3198 |
| Retweets | 727 |
| Short tweets | 87 |
| Tweets kept | 2384 |

[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/34edyx8b/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ezraklein's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/iaon36wp) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/iaon36wp/artifacts) is logged and versioned.

## Intended uses & limitations

### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/ezraklein')
generator("My dream is", num_return_sequences=5)
```

### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
modelId: huggingtweets/existentialcoms
author: huggingtweets
last_modified: 2021-05-22T03:42:47Z
downloads: 4
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
pipeline_tag: text-generation
createdAt: 2022-03-02T23:29:05Z
card:
---
language: en
thumbnail: https://www.huggingtweets.com/existentialcoms/1616688856296/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

![avatar](https://pbs.twimg.com/profile_images/792916557403795456/d-iEnfPD_400x400.jpg)

**Existential Comics 🤖 AI Bot** (@existentialcoms bot)

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@existentialcoms's tweets](https://twitter.com/existentialcoms).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3250 |
| Retweets | 14 |
| Short tweets | 17 |
| Tweets kept | 3219 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2zwnq29b/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @existentialcoms's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2pb5l6b7) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2pb5l6b7/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/existentialcoms')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
modelId: huggingtweets/evilbmcats
author: huggingtweets
last_modified: 2021-05-22T03:38:54Z
downloads: 6
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
pipeline_tag: text-generation
createdAt: 2022-03-02T23:29:05Z
card:
---
language: en
thumbnail: https://www.huggingtweets.com/evilbmcats/1602273897909/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

![avatar](https://pbs.twimg.com/profile_images/691394350006177793/dV2zgoWt_400x400.jpg)

**Black Metal Cats 🤖 AI Bot** (@evilbmcats bot)

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@evilbmcats's tweets](https://twitter.com/evilbmcats).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 2201 |
| Retweets | 2 |
| Short tweets | 240 |
| Tweets kept | 1959 |

[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/2lubo9i1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @evilbmcats's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/t9cbvdh0) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/t9cbvdh0/artifacts) is logged and versioned.

## Intended uses & limitations

### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/evilbmcats')
generator("My dream is", num_return_sequences=5)
```

### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
modelId: huggingtweets/evetheism
author: huggingtweets
last_modified: 2021-05-22T03:37:41Z
downloads: 5
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
pipeline_tag: text-generation
createdAt: 2022-03-02T23:29:05Z
card:
---
language: en
thumbnail: https://www.huggingtweets.com/evetheism/1618025885286/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

![avatar](https://pbs.twimg.com/profile_images/1378879915769032705/PJ7_-J0w_400x400.jpg)

**Dżrevelow🪤 (BANNED AGAIN) 🤖 AI Bot** (@evetheism bot)

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@evetheism's tweets](https://twitter.com/evetheism).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 1393 |
| Retweets | 136 |
| Short tweets | 386 |
| Tweets kept | 871 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/zgufrxx1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @evetheism's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/10yml7r1) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/10yml7r1/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/evetheism')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
modelId: huggingtweets/evanjfields
author: huggingtweets
last_modified: 2021-05-22T03:36:27Z
downloads: 4
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
pipeline_tag: text-generation
createdAt: 2022-03-02T23:29:05Z
card:
---
language: en
thumbnail: https://www.huggingtweets.com/evanjfields/1617973644351/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

![avatar](https://pbs.twimg.com/profile_images/504088650364104705/OOCZwfkK_400x400.jpeg)

**Evan Fields 🤖 AI Bot** (@evanjfields bot)

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@evanjfields's tweets](https://twitter.com/evanjfields).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 1323 |
| Retweets | 36 |
| Short tweets | 43 |
| Tweets kept | 1244 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/10tsj3yq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @evanjfields's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3vdeuk4w) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3vdeuk4w/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/evanjfields')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
modelId: huggingtweets/evandknox
author: huggingtweets
last_modified: 2021-05-22T03:35:25Z
downloads: 4
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
pipeline_tag: text-generation
createdAt: 2022-03-02T23:29:05Z
card:
---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---

![avatar](https://pbs.twimg.com/profile_images/1343999703776518144/ZWju8o9V_400x400.jpg)

**Evan Knox 🤖 AI Bot** (@evandknox bot)

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@evandknox's tweets](https://twitter.com/evandknox).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 848 |
| Retweets | 98 |
| Short tweets | 53 |
| Tweets kept | 697 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1oe57zgg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @evandknox's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1zossq8w) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1zossq8w/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/evandknox')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
modelId: huggingtweets/evancmalone
author: huggingtweets
last_modified: 2021-05-22T03:34:18Z
downloads: 4
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
pipeline_tag: text-generation
createdAt: 2022-03-02T23:29:05Z
card:
---
language: en
thumbnail: https://www.huggingtweets.com/evancmalone/1616776839848/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

![avatar](https://pbs.twimg.com/profile_images/1370268003426795522/-x4YzCdo_400x400.jpg)

**Evan Malone 🤖 AI Bot** (@evancmalone bot)

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@evancmalone's tweets](https://twitter.com/evancmalone).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 774 |
| Retweets | 468 |
| Short tweets | 15 |
| Tweets kept | 291 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2txxz8kc/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @evancmalone's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3qakxbgu) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3qakxbgu/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/evancmalone')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
modelId: huggingtweets/etcanada
author: huggingtweets
last_modified: 2021-05-22T03:32:55Z
downloads: 6
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
pipeline_tag: text-generation
createdAt: 2022-03-02T23:29:05Z
card:
---
language: en
thumbnail: https://www.huggingtweets.com/etcanada/1613324841076/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

![avatar](https://pbs.twimg.com/profile_images/1159160036125564930/33nAmouA_400x400.jpg)

**ET Canada 🤖 AI Bot** (@etcanada bot)

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@etcanada's tweets](https://twitter.com/etcanada).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3241 |
| Retweets | 27 |
| Short tweets | 13 |
| Tweets kept | 3201 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1mkfurkr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @etcanada's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/37a3w2d0) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/37a3w2d0/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/etcanada')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
modelId: huggingtweets/esjhanez
author: huggingtweets
last_modified: 2021-05-22T03:27:54Z
downloads: 4
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
pipeline_tag: text-generation
createdAt: 2022-03-02T23:29:05Z
card:
---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---

![avatar](https://pbs.twimg.com/profile_images/1240625242072023044/Jruquom7_400x400.jpg)

**jhanez jhobbes, jhilosopher 🤖 AI Bot** (@esjhanez bot)

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@esjhanez's tweets](https://twitter.com/esjhanez).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 2313 |
| Retweets | 266 |
| Short tweets | 197 |
| Tweets kept | 1850 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3cebik9o/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @esjhanez's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3ot7nfwp) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3ot7nfwp/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/esjhanez')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
modelId: huggingtweets/erikmcoronado
author: huggingtweets
last_modified: 2021-05-22T03:25:05Z
downloads: 11
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
pipeline_tag: text-generation
createdAt: 2022-03-02T23:29:05Z
card:
---
language: en
thumbnail: https://www.huggingtweets.com/erikmcoronado/1616696972159/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

![avatar](https://pbs.twimg.com/profile_images/1372627176911212551/zTl50P7T_400x400.jpg)

**Erik 🤖 AI Bot** (@erikmcoronado bot)

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@erikmcoronado's tweets](https://twitter.com/erikmcoronado).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3153 |
| Retweets | 880 |
| Short tweets | 865 |
| Tweets kept | 1408 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1fdtxazs/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @erikmcoronado's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3m6lni5s) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3m6lni5s/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/erikmcoronado')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
modelId: huggingtweets/cocojamgg
author: huggingtweets
last_modified: 2021-05-22T03:24:47Z
downloads: 4
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
pipeline_tag: text-generation
createdAt: 2022-03-02T23:29:05Z
card:
---
language: en
thumbnail: https://www.huggingtweets.com/cocojamgg/1621653883164/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

![avatar](https://pbs.twimg.com/profile_images/1081720632042287106/W_D_Ir3Z_400x400.jpg)

🤖 AI BOT 🤖

**cocojam™** (@cocojamgg)

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from cocojam™.

| Data | cocojam™ |
| --- | --- |
| Tweets downloaded | 3228 |
| Retweets | 446 |
| Short tweets | 523 |
| Tweets kept | 2259 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3jjl7x6f/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @cocojamgg's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/sgoyl3sa) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/sgoyl3sa/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/cocojamgg')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/erictopol
huggingtweets
2021-05-22T03:23:47Z
13
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/erictopol/1603504337198/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css">

<style>
@media (prefers-color-scheme: dark) {
.prose { color: #E2E8F0 !important; }
.prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; }
}
</style>

<section class='prose'>

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1115688713424519169/kHgDk5TV_400x400.png')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Eric Topol 🤖 AI Bot </div>
<div style="font-size: 15px; color: #657786">@erictopol bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@erictopol's tweets](https://twitter.com/erictopol).

<table style='border-width:0'>
<thead style='border-width:0'>
<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>
<th style='border-width:0'>Data</th>
<th style='border-width:0'>Quantity</th>
</tr>
</thead>
<tbody style='border-width:0'>
<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
<td style='border-width:0'>Tweets downloaded</td>
<td style='border-width:0'>3215</td>
</tr>
<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
<td style='border-width:0'>Retweets</td>
<td style='border-width:0'>217</td>
</tr>
<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
<td style='border-width:0'>Short tweets</td>
<td style='border-width:0'>110</td>
</tr>
<tr style='border-width:0'>
<td style='border-width:0'>Tweets kept</td>
<td style='border-width:0'>2888</td>
</tr>
</tbody>
</table>

[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/3uqm4l1s/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @erictopol's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/2r98oz1j) for full transparency and reproducibility.

At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/2r98oz1j/artifacts) is logged and versioned.

## Intended uses & limitations

### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/erictopol')
generator("My dream is", num_return_sequences=5)
```

### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

</section>

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

<section class='prose'>
For more details, visit the project repository.
</section>

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/epresleyquotes
huggingtweets
2021-05-22T03:16:52Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/epresleyquotes/1613326026566/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/479324836037001216/d5p4IvF3_400x400.jpeg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Elvis Presley Quotes 🤖 AI Bot </div>
<div style="font-size: 15px">@epresleyquotes bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@epresleyquotes's tweets](https://twitter.com/epresleyquotes).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3234 |
| Retweets | 0 |
| Short tweets | 4 |
| Tweets kept | 3230 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1rhg677s/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @epresleyquotes's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2plens69) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2plens69/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/epresleyquotes')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/epic_izzy_tacos
huggingtweets
2021-05-22T03:15:23Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/epic_izzy_tacos/1614108247722/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1270568717676449792/93Uf8QYu_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Nena Bell 🤖 AI Bot </div>
<div style="font-size: 15px">@epic_izzy_tacos bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@epic_izzy_tacos's tweets](https://twitter.com/epic_izzy_tacos).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3040 |
| Retweets | 1609 |
| Short tweets | 97 |
| Tweets kept | 1334 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3k801nu2/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @epic_izzy_tacos's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/32ahicqm) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/32ahicqm/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/epic_izzy_tacos')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/endlessoffal
huggingtweets
2021-05-22T03:08:50Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/endlessoffal/1617845283715/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1372304475486965764/JrTJe3i9_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Great Value Abu Ghraib 🤖 AI Bot </div>
<div style="font-size: 15px">@endlessoffal bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@endlessoffal's tweets](https://twitter.com/endlessoffal).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 677 |
| Retweets | 354 |
| Short tweets | 104 |
| Tweets kept | 219 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3a9boxro/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @endlessoffal's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3mfqyzwf) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3mfqyzwf/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/endlessoffal')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/emmashwemma
huggingtweets
2021-05-22T03:05:10Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/emmashwemma/1614163895007/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1358876684943052800/Ax_Y58cS_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">emma 🤖 AI Bot </div>
<div style="font-size: 15px">@emmashwemma bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@emmashwemma's tweets](https://twitter.com/emmashwemma).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 1214 |
| Retweets | 247 |
| Short tweets | 185 |
| Tweets kept | 782 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/10jgi9n1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @emmashwemma's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/4w8o9jwb) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/4w8o9jwb/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/emmashwemma')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/emailoctopus
huggingtweets
2021-05-22T03:00:30Z
8
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css">

<style>
@media (prefers-color-scheme: dark) {
.prose { color: #E2E8F0 !important; }
.prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; }
}
</style>

<section class='prose'>

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1323402841596305408/KLR3mtk8_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">EmailOctopus 🤖 AI Bot </div>
<div style="font-size: 15px; color: #657786">@emailoctopus bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@emailoctopus's tweets](https://twitter.com/emailoctopus).

<table style='border-width:0'>
<thead style='border-width:0'>
<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>
<th style='border-width:0'>Data</th>
<th style='border-width:0'>Quantity</th>
</tr>
</thead>
<tbody style='border-width:0'>
<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
<td style='border-width:0'>Tweets downloaded</td>
<td style='border-width:0'>2238</td>
</tr>
<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
<td style='border-width:0'>Retweets</td>
<td style='border-width:0'>415</td>
</tr>
<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
<td style='border-width:0'>Short tweets</td>
<td style='border-width:0'>100</td>
</tr>
<tr style='border-width:0'>
<td style='border-width:0'>Tweets kept</td>
<td style='border-width:0'>1723</td>
</tr>
</tbody>
</table>

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3cty91ha/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @emailoctopus's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3f0s4i3n) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3f0s4i3n/artifacts) is logged and versioned.

## Intended uses & limitations

### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/emailoctopus')
generator("My dream is", num_return_sequences=5)
```

### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

</section>

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

<section class='prose'>
For more details, visit the project repository.
</section>

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/ellxrichardson
huggingtweets
2021-05-22T02:52:17Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/ellxrichardson/1608310113174/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css">

<style>
@media (prefers-color-scheme: dark) {
.prose { color: #E2E8F0 !important; }
.prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; }
}
</style>

<section class='prose'>

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1339908143439106049/yN3IAyxs_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">𝕰𝖑𝖑𝖆 🤖 AI Bot </div>
<div style="font-size: 15px; color: #657786">@ellxrichardson bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@ellxrichardson's tweets](https://twitter.com/ellxrichardson).

<table style='border-width:0'>
<thead style='border-width:0'>
<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>
<th style='border-width:0'>Data</th>
<th style='border-width:0'>Quantity</th>
</tr>
</thead>
<tbody style='border-width:0'>
<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
<td style='border-width:0'>Tweets downloaded</td>
<td style='border-width:0'>2223</td>
</tr>
<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
<td style='border-width:0'>Retweets</td>
<td style='border-width:0'>809</td>
</tr>
<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
<td style='border-width:0'>Short tweets</td>
<td style='border-width:0'>104</td>
</tr>
<tr style='border-width:0'>
<td style='border-width:0'>Tweets kept</td>
<td style='border-width:0'>1310</td>
</tr>
</tbody>
</table>

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/vmh3a2kg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ellxrichardson's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/22k5wpyo) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/22k5wpyo/artifacts) is logged and versioned.

## Intended uses & limitations

### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/ellxrichardson')
generator("My dream is", num_return_sequences=5)
```

### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

</section>

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

<section class='prose'>
For more details, visit the project repository.
</section>

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/electronicbolo
huggingtweets
2021-05-22T02:50:08Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/electronicbolo/1617756795311/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1379185810633789443/vespU1n8_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">ibu 🔴⨀ 🤖 AI Bot </div>
<div style="font-size: 15px">@electronicbolo bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@electronicbolo's tweets](https://twitter.com/electronicbolo).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3231 |
| Retweets | 138 |
| Short tweets | 852 |
| Tweets kept | 2241 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1ae9gxsi/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @electronicbolo's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2lia1rl2) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2lia1rl2/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/electronicbolo')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/eigenrobot
huggingtweets
2021-05-22T02:46:27Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/eigenrobot/1617781549970/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1375984692730589186/Qc4-J8qM_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">eigenrobot 🤖 AI Bot </div>
<div style="font-size: 15px">@eigenrobot bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@eigenrobot's tweets](https://twitter.com/eigenrobot).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3245 |
| Retweets | 184 |
| Short tweets | 709 |
| Tweets kept | 2352 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1mw2r5kr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @eigenrobot's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1p1mtyuv) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1p1mtyuv/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/eigenrobot')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/eggprophet
huggingtweets
2021-05-22T02:43:43Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/eggprophet/1616613691016/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1373704622196527108/MBVtWtQp_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">egg of carton 🤖 AI Bot </div>
<div style="font-size: 15px">@eggprophet bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@eggprophet's tweets](https://twitter.com/eggprophet).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3239 |
| Retweets | 567 |
| Short tweets | 744 |
| Tweets kept | 1928 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3mzoiyit/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @eggprophet's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/11pgnwdu) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/11pgnwdu/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/eggprophet')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/econalytics
huggingtweets
2021-05-22T02:38:03Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/econalytics/1618503540541/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1097281528277549056/cdGaAwrz_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Mirona Constantinescu 🤖 AI Bot </div>
<div style="font-size: 15px">@econalytics bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@econalytics's tweets](https://twitter.com/econalytics).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 874 |
| Retweets | 257 |
| Short tweets | 80 |
| Tweets kept | 537 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3movytf0/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @econalytics's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/14j91sqe) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/14j91sqe/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/econalytics')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/earlofkaleb
huggingtweets
2021-05-22T02:30:42Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/earlofkaleb/1616731018755/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1364741581018857478/341A0WZO_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Kaleb Earl 🤖 AI Bot </div>
<div style="font-size: 15px">@earlofkaleb bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@earlofkaleb's tweets](https://twitter.com/earlofkaleb).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3178 |
| Retweets | 2115 |
| Short tweets | 179 |
| Tweets kept | 884 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/sbhg8e8n/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @earlofkaleb's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2uv3mv45) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2uv3mv45/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/earlofkaleb')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/dysexliaa
huggingtweets
2021-05-22T02:29:34Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/dysexliaa/1616691912003/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1333118040414826500/eMcGXEMT_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">SoFine 🤖 AI Bot </div>
<div style="font-size: 15px">@dysexliaa bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@dysexliaa's tweets](https://twitter.com/dysexliaa).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 1387 |
| Retweets | 205 |
| Short tweets | 224 |
| Tweets kept | 958 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/22uxy10l/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dysexliaa's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/28fr49gk) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/28fr49gk/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/dysexliaa')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/dynamic_proxy
huggingtweets
2021-05-22T02:28:07Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/dynamic_proxy/1616667039166/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1364933895234453506/ljzT7r4B_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">a gnarled woodland spirit 🤖 AI Bot </div>
<div style="font-size: 15px">@dynamic_proxy bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@dynamic_proxy's tweets](https://twitter.com/dynamic_proxy).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3243 |
| Retweets | 204 |
| Short tweets | 147 |
| Tweets kept | 2892 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/19d2wxay/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dynamic_proxy's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1ce0iq2v) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1ce0iq2v/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/dynamic_proxy')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/dudeswatcheva
huggingtweets
2021-05-22T02:24:04Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/dudeswatcheva/1614184345694/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1363154857042071557/Vhr-cXl__400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">jumbudrif 🤖 AI Bot </div>
<div style="font-size: 15px">@dudeswatcheva bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@dudeswatcheva's tweets](https://twitter.com/dudeswatcheva).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3169 |
| Retweets | 691 |
| Short tweets | 670 |
| Tweets kept | 1808 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3pxmaagu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dudeswatcheva's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2oh4eeyp) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2oh4eeyp/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/dudeswatcheva')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/drumbunkerdrag1
huggingtweets
2021-05-22T02:21:16Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1338105955188355074/gOecZ4jM_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Night the aeromorph 🤖 AI Bot </div>
<div style="font-size: 15px">@drumbunkerdrag1 bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@drumbunkerdrag1's tweets](https://twitter.com/drumbunkerdrag1).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 148 |
| Retweets | 8 |
| Short tweets | 27 |
| Tweets kept | 113 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1sfm2bdn/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @drumbunkerdrag1's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2trykl4g) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2trykl4g/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/drumbunkerdrag1')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/dril-horse_ebooks
huggingtweets
2021-05-22T02:13:54Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/dril-horse_ebooks/1621254059055/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/847818629840228354/VXyQHfn0_400x400.jpg')">
</div>
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1096005346/1_400x400.jpg')">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">wint & Horse ebooks</div>
<div style="text-align: center; font-size: 14px;">@dril-horse_ebooks</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from wint & Horse ebooks.

| Data | wint | Horse ebooks |
| --- | --- | --- |
| Tweets downloaded | 3225 | 3200 |
| Retweets | 458 | 0 |
| Short tweets | 302 | 421 |
| Tweets kept | 2465 | 2779 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/im94xdk8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dril-horse_ebooks's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/8vo03p09) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/8vo03p09/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/dril-horse_ebooks')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the users' tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/dpakman
huggingtweets
2021-05-22T02:09:58Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1238524243996000257/JtmbZZL-_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">David Pakman 🤖 AI Bot </div> <div style="font-size: 15px">@dpakman bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dpakman's tweets](https://twitter.com/dpakman). | Data | Quantity | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 49 | | Short tweets | 418 | | Tweets kept | 2783 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/el9fwqxw/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dpakman's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2esg5gfa) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2esg5gfa/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dpakman') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
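If you want more control than the pipeline offers, you can load the tokenizer and model directly and call `generate` yourself. This is a sketch, not the card's canonical usage; the sampling parameters (`top_p`, `max_length`) are illustrative assumptions:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('huggingtweets/dpakman')
model = AutoModelForCausalLM.from_pretrained('huggingtweets/dpakman')

inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    do_sample=True,
    top_p=0.95,
    max_length=60,
    num_return_sequences=5,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 defines no pad token
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```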
huggingtweets/dorkyfolf
huggingtweets
2021-05-22T02:06:44Z
7
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/dorkyfolf/1617804114723/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1326642076298203136/_aPBjlCI_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Thistle Bnuuy 🤖 AI Bot </div> <div style="font-size: 15px">@dorkyfolf bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dorkyfolf's tweets](https://twitter.com/dorkyfolf). | Data | Quantity | | --- | --- | | Tweets downloaded | 2881 | | Retweets | 1665 | | Short tweets | 255 | | Tweets kept | 961 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2m0yq9vg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dorkyfolf's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2wv3osjp) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2wv3osjp/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dorkyfolf') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/donalddhoffman
huggingtweets
2021-05-22T02:03:12Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/610933861866835969/wRgRnVOt_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Donald Hoffman 🤖 AI Bot </div> <div style="font-size: 15px">@donalddhoffman bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@donalddhoffman's tweets](https://twitter.com/donalddhoffman). | Data | Quantity | | --- | --- | | Tweets downloaded | 236 | | Retweets | 11 | | Short tweets | 45 | | Tweets kept | 180 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1wzfrcs4/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @donalddhoffman's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2zo2lld7) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2zo2lld7/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/donalddhoffman') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/domandcats
huggingtweets
2021-05-22T01:59:15Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/domandcats/1616883428985/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1375881363056947208/CpdPn02h_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Dom and Cats 😼 🤖 AI Bot </div> <div style="font-size: 15px">@domandcats bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@domandcats's tweets](https://twitter.com/domandcats). | Data | Quantity | | --- | --- | | Tweets downloaded | 3249 | | Retweets | 69 | | Short tweets | 452 | | Tweets kept | 2728 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/24l3uch3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @domandcats's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/nsc2js1f) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/nsc2js1f/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/domandcats') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/dobbelaerew
huggingtweets
2021-05-22T01:50:31Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1193922190955167744/kFfjfcL4_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Matthias Dobbelaere-Welvaert 🤖 AI Bot </div> <div style="font-size: 15px">@dobbelaerew bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dobbelaerew's tweets](https://twitter.com/dobbelaerew). | Data | Quantity | | --- | --- | | Tweets downloaded | 3208 | | Retweets | 344 | | Short tweets | 350 | | Tweets kept | 2514 | [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/30tqyeyq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dobbelaerew's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/2rthb7d0) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/2rthb7d0/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dobbelaerew') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
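Because each run is tracked in W&B, the recorded hyperparameters and metrics can also be read programmatically. A sketch with the public `wandb` API, assuming `wandb` is installed and you are logged in (`wandb login`); the run path is the entity/project/run_id from the links above:

```python
import wandb

api = wandb.Api()
run = api.run("wandb/huggingtweets/2rthb7d0")

print(run.config)   # training hyperparameters
print(run.summary)  # final logged metrics
```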
huggingtweets/dkulchar
huggingtweets
2021-05-22T01:49:00Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1373813806179225602/dfnXLAJp_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">memphis milano enthusiast 🤖 AI Bot </div> <div style="font-size: 15px">@dkulchar bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dkulchar's tweets](https://twitter.com/dkulchar). | Data | Quantity | | --- | --- | | Tweets downloaded | 3236 | | Retweets | 551 | | Short tweets | 569 | | Tweets kept | 2116 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3ar0h0xc/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dkulchar's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/v3hyz25i) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/v3hyz25i/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dkulchar') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/disconcision
huggingtweets
2021-05-22T01:46:36Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/disconcision/1616643733458/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/983773516880232448/XsKqt1c8_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">andrew blinn 🤖 AI Bot </div> <div style="font-size: 15px">@disconcision bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@disconcision's tweets](https://twitter.com/disconcision). | Data | Quantity | | --- | --- | | Tweets downloaded | 2262 | | Retweets | 453 | | Short tweets | 159 | | Tweets kept | 1650 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/n4jrdsqh/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @disconcision's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2f36wyoh) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2f36wyoh/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/disconcision') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/discarddiscord
huggingtweets
2021-05-22T01:45:29Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/discarddiscord/1614246710317/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1029964613029437440/3_fRmZuH_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">luna 🤖 AI Bot </div> <div style="font-size: 15px">@discarddiscord bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@discarddiscord's tweets](https://twitter.com/discarddiscord). | Data | Quantity | | --- | --- | | Tweets downloaded | 1495 | | Retweets | 289 | | Short tweets | 213 | | Tweets kept | 993 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1tvxkurq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @discarddiscord's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2g2xt22m) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2g2xt22m/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/discarddiscord') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/digitalsoyboy
huggingtweets
2021-05-22T01:43:24Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/digitalsoyboy/1617805776990/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1305154265267212292/BSD6EVuq_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">george w kush 🤖 AI Bot </div> <div style="font-size: 15px">@digitalsoyboy bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@digitalsoyboy's tweets](https://twitter.com/digitalsoyboy). | Data | Quantity | | --- | --- | | Tweets downloaded | 3170 | | Retweets | 462 | | Short tweets | 369 | | Tweets kept | 2339 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/26qiav6i/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @digitalsoyboy's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3b4m8rf4) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3b4m8rf4/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/digitalsoyboy') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/digitalsolver1
huggingtweets
2021-05-22T01:42:12Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/digitalsolver1/1616653735166/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1233161774444113920/7La7pvBs_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Digitalsolver 🤖 AI Bot </div> <div style="font-size: 15px">@digitalsolver1 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@digitalsolver1's tweets](https://twitter.com/digitalsolver1). | Data | Quantity | | --- | --- | | Tweets downloaded | 291 | | Retweets | 112 | | Short tweets | 70 | | Tweets kept | 109 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/23z4oayh/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @digitalsolver1's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/237vwzkl) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/237vwzkl/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/digitalsolver1') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/diaz_de_leon
huggingtweets
2021-05-22T01:39:44Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/diaz_de_leon/1603509315873/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1245735185787822080/riKefvZr_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Carlos 🤖 AI Bot </div> <div style="font-size: 15px">@diaz_de_leon bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@diaz_de_leon's tweets](https://twitter.com/diaz_de_leon). | Data | Quantity | | --- | --- | | Tweets downloaded | 718 | | Retweets | 167 | | Short tweets | 66 | | Tweets kept | 485 | [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/w8v47wri/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @diaz_de_leon's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/17bl278f) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/17bl278f/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/diaz_de_leon') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/dgcyt_
huggingtweets
2021-05-22T01:35:05Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/dgcyt_/1619447035696/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1379969696028581890/nIzf87ii_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">dGc 🤖 AI Bot </div> <div style="font-size: 15px">@dgcyt_ bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dgcyt_'s tweets](https://twitter.com/dgcyt_). | Data | Quantity | | --- | --- | | Tweets downloaded | 907 | | Retweets | 31 | | Short tweets | 353 | | Tweets kept | 523 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2172uj60/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dgcyt_'s tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3goormmx) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3goormmx/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dgcyt_') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/devkoob
huggingtweets
2021-05-22T01:30:05Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/devkoob/1609552229453/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1288646558364372994/jgsTkFCl_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">koob85 🤖 AI Bot </div> <div style="font-size: 15px">@devkoob bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@devkoob's tweets](https://twitter.com/devkoob). | Data | Quantity | | --- | --- | | Tweets downloaded | 712 | | Retweets | 27 | | Short tweets | 191 | | Tweets kept | 494 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/qdwtu190/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @devkoob's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/nweh9viw) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/nweh9viw/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/devkoob') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/derweise91
huggingtweets
2021-05-22T01:26:05Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/derweise91/1616691639404/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1156676050576801792/_i8SOLw3_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">nathan, stuck on magic mountain 🤖 AI Bot </div> <div style="font-size: 15px">@derweise91 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@derweise91's tweets](https://twitter.com/derweise91). | Data | Quantity | | --- | --- | | Tweets downloaded | 3233 | | Retweets | 468 | | Short tweets | 408 | | Tweets kept | 2357 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2cgk3c79/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @derweise91's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/yxo7yhz5) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/yxo7yhz5/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/derweise91') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/deptofsophistry
huggingtweets
2021-05-22T01:23:26Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/deptofsophistry/1621365721868/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1392219624956219402/HuLQmDB6_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">yo sophist</div> <div style="text-align: center; font-size: 14px;">@deptofsophistry</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from yo sophist. | Data | yo sophist | | --- | --- | | Tweets downloaded | 3215 | | Retweets | 327 | | Short tweets | 762 | | Tweets kept | 2126 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3p698zbi/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deptofsophistry's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3nt0sevr) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3nt0sevr/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deptofsophistry') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/demirenjun
huggingtweets
2021-05-22T01:19:12Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/demirenjun/1617917661023/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1354964611586547715/WIIHy349_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">rj bday (season) 🦜🍓💝 🤖 AI Bot </div> <div style="font-size: 15px">@demirenjun bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@demirenjun's tweets](https://twitter.com/demirenjun). | Data | Quantity | | --- | --- | | Tweets downloaded | 3199 | | Retweets | 800 | | Short tweets | 384 | | Tweets kept | 2015 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1bdlmgyb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @demirenjun's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1ck8cxvw) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1ck8cxvw/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/demirenjun') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/delicious_tacos
huggingtweets
2021-05-22T01:13:28Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/2300677070/886b0055ea8f8e5ba55a58f8ea82dac8_400x400.png')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Delicious Tacos 🤖 AI Bot </div>
<div style="font-size: 15px">@delicious_tacos bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@delicious_tacos's tweets](https://twitter.com/delicious_tacos).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3214 |
| Retweets | 817 |
| Short tweets | 578 |
| Tweets kept | 1819 |

[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/1573t9o6/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @delicious_tacos's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/39svipdd) for full transparency and reproducibility.

At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/39svipdd/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/delicious_tacos')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
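Generation is sampled, so repeated calls return different outputs. For reproducible samples you can fix the random seed with `set_seed`; a minimal sketch:

```python
from transformers import pipeline, set_seed

set_seed(42)  # fix the relevant RNGs so repeated runs give the same samples
generator = pipeline('text-generation', model='huggingtweets/delicious_tacos')
generator("My dream is", num_return_sequences=5)
```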
huggingtweets/deityofyoutube
huggingtweets
2021-05-22T01:11:15Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/deityofyoutube/1620252403590/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1389779771396132864/YfpCtmQo_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Just the Tip 🤖 AI Bot </div>
<div style="font-size: 15px">@deityofyoutube bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@deityofyoutube's tweets](https://twitter.com/deityofyoutube).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 1518 |
| Retweets | 58 |
| Short tweets | 47 |
| Tweets kept | 1413 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3o3puxa8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deityofyoutube's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3ou2v7h4) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3ou2v7h4/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/deityofyoutube')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/decoratedboar
huggingtweets
2021-05-22T01:04:05Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/decoratedboar/1617764745447/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1379273929961861122/3GjrPmt5_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">rei ayanami appreciator 🤖 AI Bot </div>
<div style="font-size: 15px">@decoratedboar bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@decoratedboar's tweets](https://twitter.com/decoratedboar).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3226 |
| Retweets | 1039 |
| Short tweets | 408 |
| Tweets kept | 1779 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3jw8lo8p/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @decoratedboar's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2opldfex) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2opldfex/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/decoratedboar')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/dealingporn
huggingtweets
2021-05-22T01:00:24Z
4
3
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1172237959032201219/8tZPfA9n_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">💟 Dealing Porn 💟 🤖 AI Bot </div>
<div style="font-size: 15px">@dealingporn bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@dealingporn's tweets](https://twitter.com/dealingporn).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3213 |
| Retweets | 0 |
| Short tweets | 2141 |
| Tweets kept | 1072 |

[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/10cc3bw9/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dealingporn's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/2d2vjn6v) for full transparency and reproducibility.

At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/2d2vjn6v/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/dealingporn')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/deahq
huggingtweets
2021-05-22T00:59:16Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1145694071270326273/GPZwtxlf_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">DEA HQ 🤖 AI Bot </div>
<div style="font-size: 15px">@deahq bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@deahq's tweets](https://twitter.com/deahq).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3226 |
| Retweets | 1568 |
| Short tweets | 25 |
| Tweets kept | 1633 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1th11ksq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deahq's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1vxabqh4) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1vxabqh4/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/deahq')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/david_rccv
huggingtweets
2021-05-22T00:51:10Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/david_rccv/1602336344954/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/855423041161105408/f8QTAXnm_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Daviz 🤖 AI Bot </div>
<div style="font-size: 15px">@david_rccv bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@david_rccv's tweets](https://twitter.com/david_rccv).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3117 |
| Retweets | 1335 |
| Short tweets | 187 |
| Tweets kept | 1595 |

[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/2u4p05g9/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @david_rccv's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/2njpujl0) for full transparency and reproducibility.

At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/2njpujl0/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/david_rccv')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/david_desj
huggingtweets
2021-05-22T00:50:08Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/david_desj/1616642206560/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1333105810851975169/duOCN2P4_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">David desJardins 🤖 AI Bot </div>
<div style="font-size: 15px">@david_desj bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@david_desj's tweets](https://twitter.com/david_desj).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3250 |
| Retweets | 41 |
| Short tweets | 23 |
| Tweets kept | 3186 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/391ogow1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @david_desj's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2qeggjsp) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2qeggjsp/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/david_desj')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/davemcnamee3000
huggingtweets
2021-05-22T00:49:11Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/davemcnamee3000/1616698880059/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1344380479852535808/6UHzjMj1_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">the responsible uncle 🤖 AI Bot </div>
<div style="font-size: 15px">@davemcnamee3000 bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@davemcnamee3000's tweets](https://twitter.com/davemcnamee3000).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3241 |
| Retweets | 416 |
| Short tweets | 515 |
| Tweets kept | 2310 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/13u5lzdf/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @davemcnamee3000's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2rgzuxgk) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2rgzuxgk/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/davemcnamee3000')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/darknessisdark
huggingtweets
2021-05-22T00:45:26Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/darknessisdark/1614100288472/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1361059617971851274/wo891uB-_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">xX_g3ndr_havr_Xx 🤖 AI Bot </div>
<div style="font-size: 15px">@darknessisdark bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@darknessisdark's tweets](https://twitter.com/darknessisdark).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 2801 |
| Retweets | 583 |
| Short tweets | 306 |
| Tweets kept | 1912 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1dwe171y/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @darknessisdark's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/d0xrvuif) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/d0xrvuif/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/darknessisdark')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
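As the table above shows, retweets and short tweets are dropped before fine-tuning. The sketch below only mirrors that idea; the `RT @` check and the length threshold are assumptions, not the project's actual preprocessing code:

```python
def keep_tweet(text: str) -> bool:
    # Hypothetical filter that mirrors the counts above; huggingtweets'
    # real preprocessing lives in its repository.
    if text.startswith("RT @"):   # assumed retweet marker
        return False
    if len(text.split()) < 3:     # assumed 'short tweet' threshold
        return False
    return True

tweets = ["RT @friend: hello", "ok", "this tweet is long enough to keep"]
print([t for t in tweets if keep_tweet(t)])
# ['this tweet is long enough to keep']
```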
huggingtweets/dansalvato
huggingtweets
2021-05-22T00:40:26Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/dansalvato/1612858230042/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1133218052303048704/1JXz7DT8_400x400.png')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Dan Salvato 🤖 AI Bot </div>
<div style="font-size: 15px">@dansalvato bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@dansalvato's tweets](https://twitter.com/dansalvato).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3233 |
| Retweets | 197 |
| Short tweets | 233 |
| Tweets kept | 2803 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2jbu3vnq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dansalvato's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1c66e4az) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1c66e4az/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/dansalvato')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/dani_remade
huggingtweets
2021-05-22T00:33:45Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/dani_remade/1614192638303/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1034831543070425088/G-uRCdkZ_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Dani 🤖 AI Bot </div>
<div style="font-size: 15px">@dani_remade bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@dani_remade's tweets](https://twitter.com/dani_remade).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 2270 |
| Retweets | 1509 |
| Short tweets | 108 |
| Tweets kept | 653 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/c0gm6a77/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dani_remade's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/18pilez8) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/18pilez8/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/dani_remade')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/danaludwig
huggingtweets
2021-05-22T00:25:42Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/danaludwig/1603674265747/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1046498722815782912/JkZIybb-_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Dana Ludwig 🤖 AI Bot </div>
<div style="font-size: 15px">@danaludwig bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@danaludwig's tweets](https://twitter.com/danaludwig).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 637 |
| Retweets | 59 |
| Short tweets | 6 |
| Tweets kept | 572 |

[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/2pqe574m/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @danaludwig's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/j3gaz7tt) for full transparency and reproducibility.

At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/j3gaz7tt/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/danaludwig')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/damydothedishes
huggingtweets
2021-05-22T00:23:40Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/damydothedishes/1614113189615/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1349936650680492033/89ovSSF7_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Himbo Slice 🤖 AI Bot </div>
<div style="font-size: 15px">@damydothedishes bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@damydothedishes's tweets](https://twitter.com/damydothedishes).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3150 |
| Retweets | 510 |
| Short tweets | 341 |
| Tweets kept | 2299 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3jntufpd/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @damydothedishes's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2awoazq2) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2awoazq2/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/damydothedishes')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/damelonbcws
huggingtweets
2021-05-22T00:20:37Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1363858986903175169/2T0U6RjX_400x400.png')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Bas Wisselink 🤖 AI Bot </div>
<div style="font-size: 15px">@damelonbcws bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@damelonbcws's tweets](https://twitter.com/damelonbcws).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3242 |
| Retweets | 890 |
| Short tweets | 186 |
| Tweets kept | 2166 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/17sw2i75/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @damelonbcws's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/8x4kzglp) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/8x4kzglp/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/damelonbcws')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/dalailama
huggingtweets
2021-05-22T00:16:59Z
4
2
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/dalailama/1615997106867/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/529214699041067008/fqPBAr5s_400x400.jpeg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Dalai Lama 🤖 AI Bot </div>
<div style="font-size: 15px">@dalailama bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@dalailama's tweets](https://twitter.com/dalailama).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 1664 |
| Retweets | 0 |
| Short tweets | 1 |
| Tweets kept | 1663 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/p9scy18q/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dalailama's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1acshcvu) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1acshcvu/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/dalailama')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/dabit3
huggingtweets
2021-05-22T00:08:03Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/dabit3/1607128642974/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1268352530423205889/V6Nz7mIt_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Nader Dabit 🤖 AI Bot </div>
<div style="font-size: 15px">@dabit3 bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@dabit3's tweets](https://twitter.com/dabit3).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3205 |
| Retweets | 822 |
| Short tweets | 449 |
| Tweets kept | 1934 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/19qlkkql/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dabit3's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/ai4t9ptt) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/ai4t9ptt/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/dabit3')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
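The pipeline returns one dict per generated sequence, with the text under the `generated_text` key; a short usage sketch:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/dabit3')
for out in generator("My dream is", num_return_sequences=5):
    print(out['generated_text'])  # each item is a dict holding one sample
```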
huggingtweets/dababydababy
huggingtweets
2021-05-22T00:07:00Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1325253702815510530/aRBRhmOs_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">DaBaby 🤖 AI Bot </div>
<div style="font-size: 15px">@dababydababy bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@dababydababy's tweets](https://twitter.com/dababydababy).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 2970 |
| Retweets | 2078 |
| Short tweets | 300 |
| Tweets kept | 592 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2pxo4nhg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dababydababy's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3byz0p03) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3byz0p03/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/dababydababy')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/cyberglyphic
huggingtweets
2021-05-22T00:02:21Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/cyberglyphic/1616616677471/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1371588783070707713/X0k6xQs__400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">cyberglyphic 🤖 AI Bot </div>
<div style="font-size: 15px">@cyberglyphic bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@cyberglyphic's tweets](https://twitter.com/cyberglyphic).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3174 |
| Retweets | 498 |
| Short tweets | 340 |
| Tweets kept | 2336 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/243v14nf/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @cyberglyphic's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2qa0qgs8) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2qa0qgs8/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/cyberglyphic')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
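Inference can also run on GPU by passing a device index to the pipeline (`-1` keeps it on CPU); a minimal sketch:

```python
import torch
from transformers import pipeline

device = 0 if torch.cuda.is_available() else -1  # first CUDA device, else CPU
generator = pipeline('text-generation', model='huggingtweets/cyberglyphic', device=device)
generator("My dream is", num_return_sequences=5)
```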
huggingtweets/cyberbully66
huggingtweets
2021-05-21T23:59:40Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/cyberbully66/1616851006786/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1375463332732403714/TP6hwUxm_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">evil succubus 🤖 AI Bot </div> <div style="font-size: 15px">@cyberbully66 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@cyberbully66's tweets](https://twitter.com/cyberbully66). | Data | Quantity | | --- | --- | | Tweets downloaded | 3195 | | Retweets | 397 | | Short tweets | 570 | | Tweets kept | 2228 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2c5t9ev6/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @cyberbully66's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/e4ld23gl) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/e4ld23gl/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/cyberbully66') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
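The training-data tables in these cards appear to satisfy kept = downloaded - retweets - short tweets, consistent with the preprocessing filtering out retweets and very short tweets. A quick sanity check against the numbers above, under that assumption:

```python
downloaded, retweets, short = 3195, 397, 570
assert downloaded - retweets - short == 2228  # matches the "Tweets kept" row
```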
huggingtweets/cute_sayako
huggingtweets
2021-05-21T23:55:06Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/cute_sayako/1617765258616/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1342492565560430593/Ntm4IL-T_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Sayako Hoshimiya 🤖 AI Bot </div> <div style="font-size: 15px">@cute_sayako bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@cute_sayako's tweets](https://twitter.com/cute_sayako). | Data | Quantity | | --- | --- | | Tweets downloaded | 3228 | | Retweets | 324 | | Short tweets | 1887 | | Tweets kept | 1017 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3cfs9mn2/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @cute_sayako's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1esq77ko) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1esq77ko/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/cute_sayako') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
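The text-generation pipeline also accepts a batch of prompts and returns one list of candidates per prompt. A short sketch; the prompts themselves are made up:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/cute_sayako')
prompts = ["My dream is", "Today I will"]
for results in generator(prompts, num_return_sequences=2):
    for candidate in results:
        print(candidate['generated_text'])
```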
huggingtweets/curtkrone
huggingtweets
2021-05-21T23:52:36Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/curtkrone/1614127926380/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/46209982/curtsmall_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Curt Krone 🤖 AI Bot </div> <div style="font-size: 15px">@curtkrone bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@curtkrone's tweets](https://twitter.com/curtkrone). | Data | Quantity | | --- | --- | | Tweets downloaded | 3203 | | Retweets | 1135 | | Short tweets | 250 | | Tweets kept | 1818 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3o9jhw98/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @curtkrone's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1gge3iwo) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1gge3iwo/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/curtkrone') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
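If you want the fine-tuned weights on disk (for offline loading, for example), `huggingface_hub` can mirror the whole model repository. A sketch; the returned location is whatever cache path the library chooses:

```python
from huggingface_hub import snapshot_download

# Fetch every file in the model repo and print the local cache directory.
local_dir = snapshot_download(repo_id="huggingtweets/curtkrone")
print(local_dir)
```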
huggingtweets/crusaderkings
huggingtweets
2021-05-21T23:45:42Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1185482861639589889/VEfoJcDk_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Crusader Kings III 🤖 AI Bot </div> <div style="font-size: 15px">@crusaderkings bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@crusaderkings's tweets](https://twitter.com/crusaderkings). | Data | Quantity | | --- | --- | | Tweets downloaded | 3234 | | Retweets | 506 | | Short tweets | 174 | | Tweets kept | 2554 | [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/3qnalhzx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @crusaderkings's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/34525ldv) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/34525ldv/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/crusaderkings') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
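The card above states that a pre-trained GPT-2 is fine-tuned on the kept tweets, but the training code is not included. A minimal causal-LM fine-tuning loop along those lines might look like the sketch below; the corpus, epoch count, and batch size are placeholders, not huggingtweets' actual settings:

```python
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tweets = ["just shipped a new build", "patch notes coming soon"]  # placeholder corpus

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

dataset = Dataset.from_dict({"text": tweets}).map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="finetuned", num_train_epochs=4,
                           per_device_train_batch_size=8),
    train_dataset=dataset,
    # mlm=False makes the collator build labels for causal (left-to-right) LM.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```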
huggingtweets/crowonthewire1
huggingtweets
2021-05-21T23:42:26Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/crowonthewire1/1616665645467/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1298370252028469249/WEARc0H5_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Tolga Esat 🤖 AI Bot </div> <div style="font-size: 15px">@crowonthewire1 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@crowonthewire1's tweets](https://twitter.com/crowonthewire1). | Data | Quantity | | --- | --- | | Tweets downloaded | 187 | | Retweets | 10 | | Short tweets | 10 | | Tweets kept | 167 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2iu7215s/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @crowonthewire1's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/8zo7rrc0) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/8zo7rrc0/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/crowonthewire1') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/crisprchild
huggingtweets
2021-05-21T23:38:41Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/crisprchild/1617755013837/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1362975023741472772/bcRrzmub_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Heartbeat 🤖 AI Bot </div> <div style="font-size: 15px">@crisprchild bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@crisprchild's tweets](https://twitter.com/crisprchild). | Data | Quantity | | --- | --- | | Tweets downloaded | 3227 | | Retweets | 130 | | Short tweets | 419 | | Tweets kept | 2678 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/pxtxtm8s/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @crisprchild's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/8dczuuse) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/8dczuuse/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/crisprchild') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/corpse_husband
huggingtweets
2021-05-21T23:31:34Z
7
1
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/corpse_husband/1617737581104/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1366103941675585536/EcMyRRuK_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Corpse Husband 🤖 AI Bot </div> <div style="font-size: 15px">@corpse_husband bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@corpse_husband's tweets](https://twitter.com/corpse_husband). | Data | Quantity | | --- | --- | | Tweets downloaded | 1533 | | Retweets | 35 | | Short tweets | 534 | | Tweets kept | 964 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/183lret6/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @corpse_husband's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3ctkgzjp) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3ctkgzjp/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/corpse_husband') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/coronavid19
huggingtweets
2021-05-21T23:30:27Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/coronavid19/1608807621950/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1232060545626497024/ltc63x4__400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Coronavirus 🤖 AI Bot </div> <div style="font-size: 15px">@coronavid19 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@coronavid19's tweets](https://twitter.com/coronavid19). | Data | Quantity | | --- | --- | | Tweets downloaded | 1618 | | Retweets | 12 | | Short tweets | 96 | | Tweets kept | 1510 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1lgjd18p/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @coronavid19's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1ki9s94y) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1ki9s94y/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/coronavid19') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
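On a machine with a GPU, generation can be sped up by placing the pipeline on a CUDA device. A sketch; the device index and `max_length` are illustrative:

```python
from transformers import pipeline

# device=0 targets the first GPU; device=-1 (the default) runs on CPU.
generator = pipeline('text-generation', model='huggingtweets/coronavid19', device=0)
generator("My dream is", num_return_sequences=5, max_length=60)
```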
huggingtweets/cooperquinn_wy
huggingtweets
2021-05-21T23:28:57Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/cooperquinn_wy/1617467984667/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/425749544886755329/_1EJmE-8_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Cooper Quinn 🤖 AI Bot </div> <div style="font-size: 15px">@cooperquinn_wy bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@cooperquinn_wy's tweets](https://twitter.com/cooperquinn_wy). | Data | Quantity | | --- | --- | | Tweets downloaded | 3242 | | Retweets | 452 | | Short tweets | 564 | | Tweets kept | 2226 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/4kx01uhm/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @cooperquinn_wy's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3vg5bxn2) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3vg5bxn2/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/cooperquinn_wy') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/coolnerdfacts
huggingtweets
2021-05-21T23:26:46Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/coolnerdfacts/1614213636356/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1330648314203631619/v2qx0ncL_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">shinji icarly 🤖 AI Bot </div> <div style="font-size: 15px">@coolnerdfacts bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@coolnerdfacts's tweets](https://twitter.com/coolnerdfacts). | Data | Quantity | | --- | --- | | Tweets downloaded | 1711 | | Retweets | 513 | | Short tweets | 128 | | Tweets kept | 1070 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3l96gdy8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @coolnerdfacts's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/o5cywwmo) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/o5cywwmo/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/coolnerdfacts') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
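The pipeline forwards the usual sampling knobs to `generate`, which is handy for tuning how adventurous the fake tweets get. A sketch with illustrative values:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/coolnerdfacts')
# Lower temperature gives safer, more repetitive text; top_k caps the candidate pool.
generator("My dream is", num_return_sequences=5,
          do_sample=True, temperature=0.9, top_k=50)
```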
huggingtweets/contrapoints
huggingtweets
2021-05-21T23:25:36Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/contrapoints/1616752707998/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1303044453544976394/U4N_fNXn_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Natalie Wynn 🤖 AI Bot </div> <div style="font-size: 15px">@contrapoints bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@contrapoints's tweets](https://twitter.com/contrapoints). | Data | Quantity | | --- | --- | | Tweets downloaded | 658 | | Retweets | 73 | | Short tweets | 132 | | Tweets kept | 453 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2tehrpnk/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @contrapoints's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/i7tqmbhr) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/i7tqmbhr/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/contrapoints') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/conspiracyb0t
huggingtweets
2021-05-21T23:24:23Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/conspiracyb0t/1618535079312/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1381333613585727489/KjV-Te29_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">conspiracybot 🤖 AI Bot </div> <div style="font-size: 15px">@conspiracyb0t bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@conspiracyb0t's tweets](https://twitter.com/conspiracyb0t). | Data | Quantity | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 0 | | Short tweets | 1603 | | Tweets kept | 1647 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1nfdu4jd/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @conspiracyb0t's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/25ymtdbi) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/25ymtdbi/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/conspiracyb0t') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/confusionm8trix
huggingtweets
2021-05-21T23:21:57Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/confusionm8trix/1616684874152/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1293318707469410304/OfdJ5rPz_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">henry 🤖 AI Bot </div> <div style="font-size: 15px">@confusionm8trix bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@confusionm8trix's tweets](https://twitter.com/confusionm8trix). | Data | Quantity | | --- | --- | | Tweets downloaded | 766 | | Retweets | 52 | | Short tweets | 108 | | Tweets kept | 606 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2otdqnlb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @confusionm8trix's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1tgtfwi1) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1tgtfwi1/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/confusionm8trix') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/conanobrien
huggingtweets
2021-05-21T23:19:32Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/conanobrien/1606267014440/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/730612231021322240/Rl0_QYhL_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Conan O'Brien 🤖 AI Bot </div> <div style="font-size: 15px">@conanobrien bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@conanobrien's tweets](https://twitter.com/conanobrien). | Data | Quantity | | --- | --- | | Tweets downloaded | 3241 | | Retweets | 31 | | Short tweets | 18 | | Tweets kept | 3192 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2fdxdxdd/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @conanobrien's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/ffkm78bf) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/ffkm78bf/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/conanobrien') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
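The hyperparameters and metrics referenced in the card above can also be read programmatically. A hedged sketch using the public W&B API, assuming the linked run is publicly readable; the run path comes from the training-run link in the card:

```python
import wandb

api = wandb.Api()
run = api.run("wandb/huggingtweets/ffkm78bf")
print(run.config)   # hyperparameters logged for the run
print(run.summary)  # final metrics
```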
huggingtweets/comradegoomba
huggingtweets
2021-05-21T23:16:39Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/comradegoomba/1617758746828/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1377946709163118593/fP4OTS0t_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Felix 🌶🥢🧢🌔🧨🏴‍☠️ 🤖 AI Bot </div> <div style="font-size: 15px">@comradegoomba bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@comradegoomba's tweets](https://twitter.com/comradegoomba). | Data | Quantity | | --- | --- | | Tweets downloaded | 3121 | | Retweets | 974 | | Short tweets | 383 | | Tweets kept | 1764 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1kojhc3k/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @comradegoomba's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/55koa964) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/55koa964/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/comradegoomba') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/commanderwuff
huggingtweets
2021-05-21T23:15:31Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/commanderwuff/1614170164099/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1363930888585703425/kbXPjWRV_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">CommanderWuffels 🤖 AI Bot </div> <div style="font-size: 15px">@commanderwuff bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@commanderwuff's tweets](https://twitter.com/commanderwuff). | Data | Quantity | | --- | --- | | Tweets downloaded | 2214 | | Retweets | 1573 | | Short tweets | 144 | | Tweets kept | 497 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2a74c2hq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @commanderwuff's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2f3nzjf3) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2f3nzjf3/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/commanderwuff') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/combatfemme
huggingtweets
2021-05-21T23:14:17Z
8
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/combatfemme/1617903424496/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1278680295114338304/9iD9B8s7_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Наташа 🔪🖤 🤖 AI Bot </div> <div style="font-size: 15px">@combatfemme bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@combatfemme's tweets](https://twitter.com/combatfemme). | Data | Quantity | | --- | --- | | Tweets downloaded | 3219 | | Retweets | 2329 | | Short tweets | 344 | | Tweets kept | 546 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/z3tee2kn/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @combatfemme's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2o836lhn) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2o836lhn/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/combatfemme') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/collywobbledd
huggingtweets
2021-05-21T23:12:30Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/collywobbledd/1614151252321/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1362948147572572161/Pp0Kh-aA_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">nathan(?) 🤖 AI Bot </div> <div style="font-size: 15px">@collywobbledd bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@collywobbledd's tweets](https://twitter.com/collywobbledd). | Data | Quantity | | --- | --- | | Tweets downloaded | 3101 | | Retweets | 1514 | | Short tweets | 400 | | Tweets kept | 1187 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1ytwvko1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @collywobbledd's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/daoqimwm) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/daoqimwm/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/collywobbledd') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/cocojonesspace
huggingtweets
2021-05-21T23:07:36Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1316993924297334784/rFkGii31_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Cody 🤖 AI Bot </div> <div style="font-size: 15px">@cocojonesspace bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@cocojonesspace's tweets](https://twitter.com/cocojonesspace). | Data | Quantity | | --- | --- | | Tweets downloaded | 609 | | Retweets | 439 | | Short tweets | 37 | | Tweets kept | 133 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1rf16z1e/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @cocojonesspace's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3ppd5jtm) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3ppd5jtm/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/cocojonesspace') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/cnstnce_
huggingtweets
2021-05-21T23:04:49Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/cnstnce_/1608361711919/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1284828941170049024/F7JsMZKX_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">aalis na ko sa acc na to bye 🤖 AI Bot </div> <div style="font-size: 15px">@cnstnce_ bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@cnstnce_'s tweets](https://twitter.com/cnstnce_). | Data | Quantity | | --- | --- | | Tweets downloaded | 3056 | | Retweets | 503 | | Short tweets | 748 | | Tweets kept | 1805 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/zzbcppb9/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @cnstnce_'s tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/32cd3gw3) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/32cd3gw3/artifacts) is logged and versioned.
## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/cnstnce_') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/clementdelangue-julien_c-thom_wolf
huggingtweets
2021-05-21T22:55:12Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/clementdelangue-julien_c-thom_wolf/1620425745320/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1100512198139498497/utHSJ4st_400x400.png&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1108502565925326850/zPsBf2BI_400x400.png&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1259944219881455617/asyRCk6l_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">clem 🤗 & Julien Chaumond & Thomas Wolf</div> <div style="text-align: center; font-size: 14px;">@clementdelangue-julien_c-thom_wolf</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from clem 🤗 & Julien Chaumond & Thomas Wolf. | Data | clem 🤗 | Julien Chaumond | Thomas Wolf | | --- | --- | --- | --- | | Tweets downloaded | 3238 | 3240 | 1688 | | Retweets | 1743 | 1014 | 484 | | Short tweets | 297 | 357 | 102 | | Tweets kept | 1198 | 1869 | 1102 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/14f834t6/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @clementdelangue-julien_c-thom_wolf's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1b9cejln) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1b9cejln/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/clementdelangue-julien_c-thom_wolf') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
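The training procedure described above boils down to causal-language-model fine-tuning of GPT-2 on the filtered tweets. Below is a minimal sketch with the `transformers` Trainer; it is not the project's actual training script (that lives in the huggingtweets repository), and the corpus, epochs, batch size, and learning rate are placeholder assumptions:

```python
import torch
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tweets = ["example tweet one", "example tweet two"]  # placeholder corpus

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token
model = AutoModelForCausalLM.from_pretrained("gpt2")

class TweetDataset(torch.utils.data.Dataset):
    """Each tweet becomes an independent language-modeling sample."""
    def __init__(self, texts):
        self.enc = [tokenizer(t, truncation=True, max_length=128) for t in texts]
    def __len__(self):
        return len(self.enc)
    def __getitem__(self, i):
        return self.enc[i]

args = TrainingArguments(
    output_dir="out",
    num_train_epochs=4,              # illustrative
    per_device_train_batch_size=8,   # illustrative
    learning_rate=5e-5,              # illustrative
)
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=TweetDataset(tweets),
    # mlm=False gives standard next-token (causal) labels.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```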
huggingtweets/clar_rah
huggingtweets
2021-05-21T22:51:36Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/clar_rah/1616669392957/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1356981273064054786/MDWBALP2_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">clara 🤖 AI Bot </div> <div style="font-size: 15px">@clar_rah bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@clar_rah's tweets](https://twitter.com/clar_rah). | Data | Quantity | | --- | --- | | Tweets downloaded | 140 | | Retweets | 8 | | Short tweets | 12 | | Tweets kept | 120 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2un2vj37/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @clar_rah's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1i2jul1h) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1i2jul1h/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/clar_rah') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/ckinzthompson
huggingtweets
2021-05-21T22:48:16Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/ckinzthompson/1616644271573/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1229936832764358656/RDxpoKaU_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Colin Kinz-Thompson 🤖 AI Bot </div> <div style="font-size: 15px">@ckinzthompson bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@ckinzthompson's tweets](https://twitter.com/ckinzthompson). | Data | Quantity | | --- | --- | | Tweets downloaded | 503 | | Retweets | 228 | | Short tweets | 24 | | Tweets kept | 251 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2tph4rst/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ckinzthompson's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/xoo3o9mc) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/xoo3o9mc/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ckinzthompson') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/ciphersbane
huggingtweets
2021-05-21T22:46:08Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/ciphersbane/1617767732812/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1339425166879846402/cz2uFrU7_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">꧁༒ ripple star 🤖 AI Bot </div> <div style="font-size: 15px">@ciphersbane bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@ciphersbane's tweets](https://twitter.com/ciphersbane). | Data | Quantity | | --- | --- | | Tweets downloaded | 3205 | | Retweets | 1853 | | Short tweets | 471 | | Tweets kept | 881 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2iovn6wx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ciphersbane's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/311rulis) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/311rulis/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ciphersbane') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/cindersthereare
huggingtweets
2021-05-21T22:43:45Z
7
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/cindersthereare/1616675711050/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1373682632286896133/ejdKO_bO_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">arne ness 🤖 AI Bot </div> <div style="font-size: 15px">@cindersthereare bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@cindersthereare's tweets](https://twitter.com/cindersthereare). | Data | Quantity | | --- | --- | | Tweets downloaded | 1548 | | Retweets | 138 | | Short tweets | 86 | | Tweets kept | 1324 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1gmybfyv/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @cindersthereare's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1aog9e99) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1aog9e99/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/cindersthereare') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/chumphreys1999
huggingtweets
2021-05-21T22:38:44Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/chumphreys1999/1617766114142/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1332400588248682496/On_IvHC7_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">cam🍀 🤖 AI Bot </div> <div style="font-size: 15px">@chumphreys1999 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@chumphreys1999's tweets](https://twitter.com/chumphreys1999). | Data | Quantity | | --- | --- | | Tweets downloaded | 1280 | | Retweets | 370 | | Short tweets | 177 | | Tweets kept | 733 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1lp1b1t5/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @chumphreys1999's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1sog79lw) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1sog79lw/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/chumphreys1999') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/christianreber
huggingtweets
2021-05-21T22:36:38Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/christianreber/1603809204163/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1086645278067830789/XqFPR8S9_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Christian Reber 🤖 AI Bot </div> <div style="font-size: 15px">@christianreber bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@christianreber's tweets](https://twitter.com/christianreber). | Data | Quantity | | --- | --- | | Tweets downloaded | 2399 | | Retweets | 1076 | | Short tweets | 347 | | Tweets kept | 976 | [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/3mu0oy0d/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @christianreber's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/31hsw1pc) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/31hsw1pc/artifacts) is logged and versioned.
## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/christianreber') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/chrissyteigen
huggingtweets
2021-05-21T22:35:31Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1243351734757425152/e8JZwf03_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">chrissy teigen 🤖 AI Bot </div> <div style="font-size: 15px">@chrissyteigen bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@chrissyteigen's tweets](https://twitter.com/chrissyteigen). | Data | Quantity | | --- | --- | | Tweets downloaded | 3211 | | Retweets | 187 | | Short tweets | 458 | | Tweets kept | 2566 | [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/3rq9c0fg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @chrissyteigen's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/bsu3mmey) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/bsu3mmey/artifacts) is logged and versioned.
## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/chrissyteigen') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/chipzel
huggingtweets
2021-05-21T22:31:21Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/chipzel/1620116607583/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1351977667977994247/WQaeeUds_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">🌈titzel✨ 🤖 AI Bot </div> <div style="font-size: 15px">@chipzel bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@chipzel's tweets](https://twitter.com/chipzel). | Data | Quantity | | --- | --- | | Tweets downloaded | 3239 | | Retweets | 600 | | Short tweets | 875 | | Tweets kept | 1764 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2fbonam4/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @chipzel's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/8i1yf5s3) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/8i1yf5s3/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/chipzel') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/chick_in_kiev
huggingtweets
2021-05-21T22:29:08Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/chick_in_kiev/1614096527753/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1349926213205323776/jKgAjVE6_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Talia Lavin 🤖 AI Bot </div> <div style="font-size: 15px">@chick_in_kiev bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@chick_in_kiev's tweets](https://twitter.com/chick_in_kiev). | Data | Quantity | | --- | --- | | Tweets downloaded | 3185 | | Retweets | 2343 | | Short tweets | 121 | | Tweets kept | 721 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3ns7ik8b/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @chick_in_kiev's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/26rtvpc9) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/26rtvpc9/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/chick_in_kiev') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/cher
huggingtweets
2021-05-21T22:26:58Z
8
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/cher/1614129199723/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1201634247775264768/MxO2TCOj_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Cher 🤖 AI Bot </div> <div style="font-size: 15px">@cher bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@cher's tweets](https://twitter.com/cher). | Data | Quantity | | --- | --- | | Tweets downloaded | 3216 | | Retweets | 142 | | Short tweets | 566 | | Tweets kept | 2508 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1udtrx5t/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @cher's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3hrjcq92) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3hrjcq92/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/cher') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/chenweihua
huggingtweets
2021-05-21T22:25:52Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/chenweihua/1619283089103/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/485435685814300672/dOZBbo0S_400x400.jpeg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Chen Weihua (陈卫华) 🤖 AI Bot </div> <div style="font-size: 15px">@chenweihua bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@chenweihua's tweets](https://twitter.com/chenweihua). | Data | Quantity | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 1972 | | Short tweets | 97 | | Tweets kept | 1177 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3nn5asnt/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @chenweihua's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/29y23q19) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/29y23q19/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/chenweihua') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
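Every card above links its W&B training run. Runs like these can also be inspected programmatically through the public `wandb` API; the sketch below uses the run id from the card above and assumes read access to the public `wandb/huggingtweets` project (the client may prompt for an API key):

```python
import wandb

api = wandb.Api()
# Run id taken from the training-run link in the card above.
run = api.run("wandb/huggingtweets/29y23q19")

print(run.name, run.state)
print(run.summary)      # final logged metrics
print(run.config)       # training hyperparameters

for f in run.files():   # files logged by the run
    print(f.name, f.size)
```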